In 2003 Warner Brothers released an anthology collection called The Animatrix.
In it, the Wachowski sisters present a narrative where society as a whole becomes rich, decadent, and lazy through the use of automation and artificial intelligence. It results in humanity's downfall and enslavement to machines.
This was, of course, a wildly optimistic vision of the future.
Humans have been automating tasks more or less since we started using tools. In the last couple hundred years, however, the pace of automation has accelerated rapidly. Even desirable jobs once considered untouchable by computers or AI, like lawyers, artists, writers, and teachers, are being encroached upon by AI's growing sophistication. This can provide a wide array of benefits for many people, but also
really sucks for wages and livelihoods dictated by a capitalistic hellscape.
Talk about automation and AI's effects on society, the economy, etc. here. For the love of whichever god, try to be civil about it. Odds are the people you're talking to are experiencing, or have already experienced, what might be a new and unexpected development to you.
Posts
And you thought Roko's Basilisk was stupid.
Edit: This is a brilliant response:
"Mr. President, we've discovered the aliens' weakness. They are vulnerable to the n-word. But since The Wokeness Initiative, none of us remember it!"
Incels: "I was made for this moment."
Most wildly successful humans are already trading more in cultural cachet and the curation of an aesthetic than in creating works of art themselves.
The dystopian robot-laborer future is about influencing and experiencing, not about making the things you need. Unless making things is part of some desirable self-concept and aesthetic.
The myth of the sole artist creating Great Works is silly; most art is the product of teams executing a shared hallucination and trying to fix it in a firm medium. Is the game producer or art director who signs off on an artist's concept doing something different from someone signing off on AI-proposed concepts?
I think we need to focus on the near-term future for now; it's just hilariously impossible to predict what the world looks like decades from now if AI continues to develop at this pace. At the moment, none of these products are good enough to cause a major shift in the workforce. It remains to be seen whether they have the potential to collectively evolve into something resembling general AI, or if we're just getting overexcited by a stochastic parrot.
I'm actually most interested in the medical implications of AI. There have been a bunch of examples of this already: https://www.nytimes.com/2023/01/09/science/artificial-intelligence-proteins.html
What if AI-supported research results in major pharmaceutical breakthroughs? My single biggest hope for future medicine is for us to actually understand the human brain, and find a cure for most mental illnesses. I think that's probably impossible for humans. But AI is excellent at finding patterns in data sets that are illegible to a human.
The concern with AI really isn't that we can automate tasks; as pointed out in the OP, we've been doing that shit ever since we figured out the concept of tools. In fact, there isn't anything wrong with automating tasks. Some tasks are pretty shitty and no one wants to do them, some don't have enough people to get everything done, and others are ones a machine just does a better job at. The issue is how we have structured our society.
If we lived in a society where people were guaranteed reasonable food, shelter, healthcare, and access to entertainment, then there wouldn't be an issue. But in our capitalistic hellscape, people are forced to toil for shitheads to get things that should be guaranteed, just so they can survive. So automation becomes a huge problem, because each task that can be automated is a job people can no longer take. Eventually, we risk running out of jobs for people to take because everything is automated. Worse, we don't have to hit the point where all the jobs have been automated for the parasite class to push automation to a level where we run into major social unrest.
One reason tempers flare over AI art really isn't about what constitutes art. Rather, it's the concern that a bunch of individuals, many of whom aren't paid well to begin with, will at worst be stuck out on the street with no means to support themselves, and at best be forced into really shitty jobs with really shitty wages and benefits and probably bullshit hours. Not that the discussion would stay completely civil anyway, because there are some truly insufferable people on both sides of it, but it would likely be much more civil if people didn't have to fear dying on the street in a land of plenty because they weren't getting a paycheck.
I'm likeliest to move on that; it's a test balloon for whether the argument is durable. It's a weird engagement to paint me as unmovable on something where I'm engaging entirely with your premise.
But the art AI is probably a thread killer discussion anyway
Or rather, it already has a thread
In terms of a cultural shock, it's about akin to alien first contact. We would suddenly not be alone, and we would have made it ourselves.
People complaining about the human aspects of it will lose their minds. Of course, first the AI will have to convincingly smash the ever-moving goalposts of "it's not real intelligence," but one will arrive that does, with no limits, and indeed fewer limits than a human. There may very well be one that can analyse the person it's talking to and choose exactly the right kind of argument or method of proof to convince that person.
Right now, wondering about "meaning" is extremely premature. These things aren't there yet. There are practical problems like people suddenly not being able to pay rent or eat that are much bigger deals.
Cross that other bridge when we come to it. Worrying about it first isn't really going to help you acclimatise to it. It will still be a massive shock regardless. Nobody is truly prepared. The myth of the fallen angel meeting the rising ape, and the mystical specialness of people, is intrinsic to all but the most insanely hardcore determinists. It's an unconscious bedrock in most of the world's current cultures.
What does matter, and is real, is that we are getting better and better at making AIs that can perform more and more tasks that were once things only humans could do, and with that comes the reality that people are losing their jobs and the ability to support themselves. In fact, last I checked, we had a fuck ton of jobs involving transportation, and to briefly cross streams, truly autonomous vehicles for transporting goods and people are probably the first major advancement in AI that will put us in a situation with a fuck ton of social unrest from jobs lost to automation. It's not just a massive fucking number of jobs being lost, but also one of the last major employment options for people who can't cut it as office workers.
The key element for it to work is that all that activity is not claimed to be useful simply because it either creates ROI or supports a hierarchy.
Which is going to be a hell of a thing to abandon. Communism and socialism still haven’t figured out how they went from Soviets to Stalin or why union bosses keep being, well, bosses.
Conservatives and neoliberals are, of course, very far from admitting there's anything wrong with human indentured labor or with discarding people who don't contribute to ROI.
Anarchy (and the anarchic left) is too much rooted in anti-industrialism to accept that an automated back-end is needed to prop up a human-centric living museum. I think they are on the most viable path, but that’s a subset of a subset.
Western political ideologies are shaped by schismogenesis and that takes time, so I don’t see the whole thing materialising.
Maybe China figures it out?
This is a non argument. We have no idea what makes people tick yet we manage to make functional children turn into adults all the time. You don't have to understand something to achieve a goal.
Humanity had no idea how fire actually worked for most of that technology's existence. You just do a set of actions and you get warmth and light.
The idea that we have to understand people to make sentience is trivially irrelevant.
But also you didn't really get my point, since I was saying "don't worry about the meaning stuff right now, there are more important practical concerns," which you dismissed and then reiterated.
They do not make AI from scratch. They create a set of rules, and the AI figures out most of it on its own. We already have no real idea how most of these AIs make any given decision. They are black boxes.
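That "set of rules plus self-adjustment" point can be shown in miniature. This is a hypothetical toy, not any real system: a perceptron that learns the logical AND function. Nobody writes "AND" into it; the training loop just nudges numbers until the answers come out right, and the finished model is three opaque numbers rather than anything resembling a human-readable rule.

```python
# Training data for logical AND: inputs and the desired output.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0, 0]  # learned weights; the model starts knowing nothing
b = 0       # learned bias


def predict(x):
    """Fire (output 1) if the weighted sum of inputs crosses zero."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0


# The only human-written "rule": when wrong, nudge the weights
# toward the correct answer. Everything else the model works out.
for _ in range(10):  # a few passes over the data suffice here
    for x, y in data:
        error = y - predict(x)
        w[0] += error * x[0]
        w[1] += error * x[1]
        b += error

print(w, b)                           # [2, 1] -2
print([predict(x) for x, _ in data])  # [0, 0, 0, 1]
```

Even on this four-example problem the result is just weights with no labeled meaning; scale that up to billions of weights and you get the black box the post describes.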
I'm sorry dude, it's a non counter point. It isn't real.
It's entirely possible the reason we haven't worked out sentience is because it's just an invented fiction. Claiming we have to achieve that without considering whether it's a concept that has any value is a statement dripping with the arrogant assumption that "humans are special".
I'm open to both paths because I'm a hardcore empiricist and I have to follow the evidence. But right now the evidence very much isn't leaning towards assuming sentience is real. The current consensus in neuroscience is that consciousness runs parallel to the processing that occurs that actually causes you to do what you do, and this parallel process has very little actual control over what happens. It's not that you are what you think. You are regardless of what you think, most of the time.
Carpenters aren't the only people affected in this scenario, though. The ability for consumers to buy tables (and other goods) that otherwise wouldn't be nearly as cheap, if not completely unavailable, is a massive benefit to society. One I'd argue is more valuable than artificially propping up a shrinking industry when improved welfare and retraining would work just as well.
It sucks for the people who want to make their living doing that specific job, but they're a tiny sliver of the population compared to those whose lives are improved through mass production.
Also you can really easily have internal voices that just split off and start running their own autonomous consciousness thread. Sure, your own internal voice is your one true self. Absolutely. (Edit: /sarcasm)
But then there’s folks that narrate everything that happens as an observation. And people that don’t have an internal voice at all.
And when people do that, the natural state is happiness.
There's a lot of good psychological and neuroscientific evidence, tbh, supporting the idea that the big B got it right, and that what we think of as a self is a pretty pervasive illusion.
So if we did have an AI that was equivalent to a human, would we even recognise it, or would we fault it for not having constructs that may not even be real? 8-ball points to yes.
https://www.vice.com/en/article/5d37za/voice-actors-sign-away-rights-to-artificial-intelligence
You sort of have to realize there are two facets to things. Let's take carpentry. It's good, honest, beautiful, if backbreaking, work. While machines have killed most of it, handmade wooden goods are still in high demand and highly valued. Case in point: we bought our "forever" table recently. Prices ranged from a few hundred for a machine-made thing to thousands for a handmade one. You could also get a few-thousand-dollar machine-made thing, but that doesn't compare to the handmade one, let alone a really good handmade one.
Now, not all of that handmade table was handmade. I'm sure machines were involved in cutting the tree and cutting the basic parts, but the craft was all the couple who made it.
The solution to all this is some sort of UBI. In our case we bought a nice table for sub five figures from an old retired couple who just made wood goods because they like to make wood goods. They gave us a discount, and then we realized they made bowls and spoons and all sorts of things. A friend of mine is a blacksmith. By which I mean he's retired military, lives in rural Georgia on his pension, and liked making things out of metal, so he bought a huge tract of land, set up a forge, and makes craft knives and other nice things. He sells them for hundreds to thousands, and they are all works of art.
The catch is these people are able to do what they do because they have a set income each month and don't have to worry about the bills. We could entirely robot-and-AI most people's jobs away and let them do what they love to do, if we agreed to give everyone a basic quality of life. If my household had everything covered and we didn't have to work, I'd brew beer, roll cigars, make pickles, and play games. The SO would sing and record songs, along with making food for everyone in her life, and we'd be happy and spend more time with our cat.
We can have robot made stuff and AI done stuff and still make, trade, sell, and gift goods we put our heart into. The only reason it's either or is raw damn greed.
Rich artists and undermining new artists, name a more iconic combo
Part of the idea of the singularity, if you subscribe to any of the many versions of what that's supposed to mean, is that once the AI becomes smarter than us, we literally cannot predict what happens next.
But I asked one to write me a haiku on a subject that was itself a philosophical contradiction, and it performed amazingly well.
We can't predict what happens next, regardless. If I could, I'd be rich from the Stock Market.
Eh... that was a silly response. Lots of people make a lot of money predicting the stock market. There is a lot of rational basis for prediction there. Whether or not you can be guaranteed to be correct, or guaranteed enough to warrant a risky position, those are different matters.
However, past the AI singularity, there is no possibility of any rational basis to make any reasonable prediction, accuracy notwithstanding. That's why it's called a singularity.
Statistically speaking, all of the studies have shown that your best bet when investing in stocks is to invest in mutual funds. Otherwise you're just gambling, and the house always wins.
Did you mean index funds? Because I am pretty sure that generally mutual funds are managed portfolios, so you are paying someone to predict the market for you.
Sorry yes. If I'm bad at investing terminology, it's because I'm good at recognizing that micromanaging my investments is a bad idea.
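The index-fund point comes down to fee drag compounding over time. Here's a rough back-of-the-envelope sketch; the specific figures (7% gross return, 1% vs. 0.05% expense ratio, 30 years) are made-up but typical, not data from the thread.

```python
# Compare a managed mutual fund (1% expense ratio) with an index
# fund (0.05%), both earning the same 7% gross annual return.
principal = 10_000
years = 30
gross_return = 0.07

# Net growth compounds at (gross return - expense ratio) per year.
managed = principal * (1 + gross_return - 0.01) ** years    # net 6.00%/yr
index = principal * (1 + gross_return - 0.0005) ** years    # net 6.95%/yr

print(f"managed: ${managed:,.0f}")  # roughly $57k
print(f"index:   ${index:,.0f}")    # roughly $75k
```

A fee difference of under one percentage point ends up costing nearly a third of the final balance, which is why paying someone to predict the market rarely beats just holding the index.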
1. Go fuck yourself, I'm not paying for the privilege of being allowed to leave your store sooner.
2. Capitalism destroys another useful aspect of automation.
https://www.youtube.com/watch?v=l7tWoPk25yU&t=1s
My only critique of the video is that, while a human can't conceptualize how a bot interprets the world, I would argue it's a misnomer to call it a black box, even when we consider that many of the assholes pushing this stuff likely have no idea how it fucking works. Let's be honest: while the inventor-capitalist dipshit bros have no idea how any of this shit works, if they have any basic business sense and control over a company, they are going to insist on setting things up so that the AI consistently returns the results they want. Yes, people will find areas to exploit, but that isn't really new for anyone who has been paying attention to computer security.
Also, my old man was talking about how the Pentagon had developed a sentry bot and then tasked US Marines with finding its weak points. Found an article about that here. Essentially, they found ways to beat it. One method involved doing somersaults while approaching it. Another used the old cartoon trope of approaching while disguised as a tree. Finally, one approach just went the Metal Gear Solid route with the trusty cardboard box. Yes, your taxpayer dollars hard at work recreating the Metal Gear Solid grunt NPC AI.
AI guided mannequins are starting to replace regular mannequins and actors for training nurses.
Unfortunately it currently comes with a $170k price tag.
Since we've been cluttering up the Crypto thread with LLMs and whatnot, and I've been doing an absurd amount of research on the topic lately, thought I'd share this very good link on the topic of AI art + writing that really goes into a solid balanced view, incorporating both the "this really sucks in a lot of ways" and "there's some great stuff that can come from this" sides, coming from a person whose life is about writing: https://www.youtube.com/watch?v=9xJCzKdPyCo&ab_channel=HelloFutureMe
As part of my personal projects, and to get some practice in, I'm going to be looking into using LLMs to help me expand on my voice game so it can offer better descriptions and more diverse locations for users, and generally expand the content. As part of this I'm going to be seeking out the most ethical variant of this technology I can, and using my own writing as data instead of just sucking on internet scrapings. I may also do something with the art as well, if I can find an ethically-sourced generator, but only after consulting with my artist and acquiring their consent. If they are not interested, I may use my own art instead. My work is specifically intended to be an example of making games available to people with visual and physical impairments, so being able to make that even easier for people to copy could do a lot of good, and I'm hoping I can ALSO use it as an example of how to use this technology ethically.