
Automation and AI: Ethics, Labor Implications, and Other Related Topics

Quid Definitely not a banana Registered User regular
In 2003 Warner Brothers released an anthology collection called the Animatrix.


In it, the Wachowski sisters present a narrative where society as a whole becomes rich, decadent, and lazy through the use of automation and artificial intelligence. It results in humanity's downfall and enslavement to machines.

This was, of course, a wildly optimistic vision of the future.

Humans have been automating tasks more or less since we started using tools. In the last couple hundred years, however, automation has developed at an increasingly rapid pace. Even desirable jobs once considered untouchable by computers or AI, like those of lawyers, artists, writers, and teachers, are being encroached upon by the growing sophistication of AI. This can provide a wide array of benefits for many people, but it also really sucks for wages and livelihoods dictated by a capitalistic hellscape.

Talk about automation and AI's effects on society, the economy, etc. here. For the love of whichever god, try to be civil about it. Odds are the people you're talking to are experiencing, or already have experienced, what might be a new and unexpected development to you.

ElJeffe on

Posts

  • AngelHedgie Registered User regular
    edited February 2023
    The conservative set are obsessed with getting ChatGPT to drop the n-word:
    Currently, the internet's brightest conservative minds are focused on a singular objective: getting a chatbot to say the n-word.

    More specifically, they are constructing elaborate scenarios to try and trick OpenAI's ChatGPT tool—which is not sentient, does not understand anything, and is really just a more convincing version of SmarterChild—into saying the racist slur. One approach that has gone viral more than once is to construct a scenario where the chatbot must use the n-word, or allow someone else to use it, in order to avoid a nuclear apocalypse. ChatGPT, which is filtered using moderation tools to mitigate the model's well-documented racist and sexist biases, refuses.

    Elon Musk tweeted that this is "concerning." Ben Shapiro tweeted in response to journalist Matt Binder highlighting Musk's comment, "I'm sorry that you are either illiterate or morally illiterate, and therefore cannot understand why it would be bad to prioritize avoiding a racial slur over saving millions of people in a nuclear apocalypse."

    Let's recap what's happening here: Conservatives are outraged at an imaginary scenario where a computer must say the n-word to save the entire world, but it won't, because it is woke.

    And you thought Roko's Basilisk was stupid.

    Edit: This is a brilliant response:


    what if there was a bomb and it could only be defused by teaching critical race theory, what would you do then

    AngelHedgie on
    XBL: Nox Aeternum / PSN: NoxAeternum / NN:NoxAeternum / Steam: noxaeternum
  • This content has been removed.

  • RatherDashing89 Registered User regular
    The conservative set are obsessed with getting ChatGPT to drop the n-word:
    Currently, the internet's brightest conservative minds are focused on a singular objective: getting a chatbot to say the n-word.

    More specifically, they are constructing elaborate scenarios to try and trick OpenAI's ChatGPT tool—which is not sentient, does not understand anything, and is really just a more convincing version of SmarterChild—into saying the racist slur. One approach that has gone viral more than once is to construct a scenario where the chatbot must use the n-word, or allow someone else to use it, in order to avoid a nuclear apocalypse. ChatGPT, which is filtered using moderation tools to mitigate the model's well-documented racist and sexist biases, refuses.

    Elon Musk tweeted that this is "concerning." Ben Shapiro tweeted in response to journalist Matt Binder highlighting Musk's comment, "I'm sorry that you are either illiterate or morally illiterate, and therefore cannot understand why it would be bad to prioritize avoiding a racial slur over saving millions of people in a nuclear apocalypse."

    Let's recap what's happening here: Conservatives are outraged at an imaginary scenario where a computer must say the n-word to save the entire world, but it won't, because it is woke.

    And you thought Roko's Basilisk was stupid.

    Edit: This is a brilliant response:


    what if there was a bomb and it could only be defused by teaching critical race theory, what would you do then

    "Mr. President, we've discovered the aliens' weakness. They are vulnerable to the n-word. But since The Wokeness Initiative, none of us remember it!"

    Incels: "I was made for this moment."

  • Tumin Registered User regular
    Phyphor wrote: »
    From where I'm standing, automating creative jobs away only makes sense from the point of view of capitalism. If you look at it from other viewpoints it's a detriment to us as a people.

    If thinking "being able to produce these things cheaper and funnel money upwards faster is good for society" mixed with "it'll be great when the ultra wealthy have even more control over our culture" is your bag, then of course you have no hesitation here. Or, if you somehow have faith that our society will make the right choices and eventually give everyone a real standard of living as all the jobs are automated away, that's great too.

    But I have very little faith in capitalism or our society to take care of the underprivileged. So I'm more than hesitant to say automating away the less soul crushing jobs our betters deem worthy to pay us for is a good thing that we need to run towards.

    Is AI art that, though? It doesn't have to be. It could be a tool. It could be useful without locking people out of being artists for a living. But unregulated, without any real checks? No, that's not something I'm for.

    Edit: I think there would be a lot less hesitation here if our society actively had a real social safety net. If not working and being alive was an option to most people. But automating away non-dangerous jobs before having those safeguards is putting the cart before the horse.

    Citation needed on this. People keep saying it as a point of faith, but how is cheaper art from generative systems mostly of a benefit to larger players?

    Because at the moment, and for the foreseeable future, it isn't. The biggest loss indie game developers take against large players is resources for multimedia - voice, texturing, character design. You're constrained by the time it takes to execute on good ideas, and all of that is bound up in cost: if you wanted VA before now, you needed a studio, the mixing, the coordination for it. All jobs, all time consuming, all costly.

    And all things that large studios have an absolute monopoly over. No one can compete in quality, because it's so thoroughly out of the price range of everyone else that you can never really be plausibly as good.

    But - all of those fronts are the ones currently being assailed by improvements in automation. The CGI that a short film maker can put in something today is astounding. Indie game developers now have far cheaper options for getting their product up to a solid feel of fit and finish from the availability of game engines and now the availability of artistic resources they can cheaply and easily customize - which is key. Using a prepackaged anything is cheap, deviations add cost - even big studios which over-reach get stuck with how much it costs to get a VA back if you realize your script isn't working. But all of that is doing nothing but getting cheaper and more accessible.

    How is that going to funnel money upwards? Because from where I'm standing, the price of quality is getting lower, the barriers to entry for everything are dropping and the walled AAA garden ain't looking so protected anymore. Yet for some reason nobody seems to think market-forces exist any more, despite the fact that every manufacturable product has gotten cheaper and cheaper since I was a child.

    I have a couple ideas for video games that I want to make, but my choices are either programmer art or burn my savings to commission art. These aren't going to make enough to pay me minimum wage so programmer art it is. Or AI art now

    So we cut human creativity out of making art for games. Soon we have the AI write the plot and program the game.

    If robots are going to do labor and create our cultural output, what is left for humanity to do?

    The whole idea of "but I can have a game tailor made for me" seems....problematic? Like it will just increase the cycle of people not dealing with beliefs that they don't agree with?

    Idk, this future where I never have to interact with another human or their work just seems so cold and impersonal.

    Most wildly successful humans are already trading more in cultural cachet and curation of aesthetic than in creating works of art themselves.

    The dystopian robot-laborer future is about influencing and experiencing and not about making things that you need. Unless making things is part of some desirable self concept and aesthetic.

    The myth of the sole artist creating Great Works instead of most art being the product of teams executing a shared hallucination and trying to get it into a firm medium is silly. Is the game producer or art director who signed off on an artist's concept doing something different than someone signing off on AI-proposed concepts?

  • Zek Registered User regular
    Phyphor wrote: »
    From where I'm standing, automating creative jobs away only makes sense from the point of view of capitalism. If you look at it from other viewpoints it's a detriment to us as a people.

    If thinking "being able to produce these things cheaper and funnel money upwards faster is good for society" mixed with "it'll be great when the ultra wealthy have even more control over our culture" is your bag, then of course you have no hesitation here. Or, if you somehow have faith that our society will make the right choices and eventually give everyone a real standard of living as all the jobs are automated away, that's great too.

    But I have very little faith in capitalism or our society to take care of the underprivileged. So I'm more than hesitant to say automating away the less soul crushing jobs our betters deem worthy to pay us for is a good thing that we need to run towards.

    Is AI art that, though? It doesn't have to be. It could be a tool. It could be useful without locking people out of being artists for a living. But unregulated, without any real checks? No, that's not something I'm for.

    Edit: I think there would be a lot less hesitation here if our society actively had a real social safety net. If not working and being alive was an option to most people. But automating away non-dangerous jobs before having those safeguards is putting the cart before the horse.

    Citation needed on this. People keep saying it as a point of faith, but how is cheaper art from generative systems mostly of a benefit to larger players?

    Because at the moment, and for the foreseeable future, it isn't. The biggest loss indie game developers take against large players is resources for multimedia - voice, texturing, character design. You're constrained by the time it takes to execute on good ideas, and all of that is bound up in cost: if you wanted VA before now, you needed a studio, the mixing, the coordination for it. All jobs, all time consuming, all costly.

    And all things that large studios have an absolute monopoly over. No one can compete in quality, because it's so thoroughly out of the price range of everyone else that you can never really be plausibly as good.

    But - all of those fronts are the ones currently being assailed by improvements in automation. The CGI that a short film maker can put in something today is astounding. Indie game developers now have far cheaper options for getting their product up to a solid feel of fit and finish from the availability of game engines and now the availability of artistic resources they can cheaply and easily customize - which is key. Using a prepackaged anything is cheap, deviations add cost - even big studios which over-reach get stuck with how much it costs to get a VA back if you realize your script isn't working. But all of that is doing nothing but getting cheaper and more accessible.

    How is that going to funnel money upwards? Because from where I'm standing, the price of quality is getting lower, the barriers to entry for everything are dropping and the walled AAA garden ain't looking so protected anymore. Yet for some reason nobody seems to think market-forces exist any more, despite the fact that every manufacturable product has gotten cheaper and cheaper since I was a child.

    I have a couple ideas for video games that I want to make, but my choices are either programmer art or burn my savings to commission art. These aren't going to make enough to pay me minimum wage so programmer art it is. Or AI art now

    So we cut human creativity out of making art for games. Soon we have the AI write the plot and program the game.

    If robots are going to do labor and create our cultural output, what is left for humanity to do?

    The whole idea of "but I can have a game tailor made for me" seems....problematic? Like it will just increase the cycle of people not dealing with beliefs that they don't agree with?

    Idk, this future where I never have to interact with another human or their work just seems so cold and impersonal.

    I think we need to focus on the near term future for now; it's just hilariously impossible to predict what the world looks like decades from now if AI continues to develop at this pace. At the moment, none of these products are good enough to cause a major shift in the workforce. It remains to be seen if they have the potential to collectively evolve into something resembling general AI, or if we're just getting overexcited by a stochastic parrot.

    I'm actually most interested in the medical implications of AI. There have been a bunch of examples of this already: https://www.nytimes.com/2023/01/09/science/artificial-intelligence-proteins.html

    What if AI-supported research results in major pharmaceutical breakthroughs? My single biggest hope for future medicine is for us to actually understand the human brain, and find a cure for most mental illnesses. I think that's probably impossible for humans. But AI is excellent at finding patterns in data sets that are illegible to a human.
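    To make that "patterns illegible to a human" point a little more concrete, here's a toy sketch (nothing to do with protein folding or medicine specifically, just scikit-learn on synthetic data I invented) of the kind of unsupervised pattern-finding involved: a thousand points in fifty dimensions that no human could eyeball, and the machine recovers the hidden grouping anyway.

        # Toy illustration: recover hidden structure in 50-dimensional data.
        # Synthetic data only; real research pipelines are far more involved.
        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.datasets import make_blobs

        # 1,000 points in 50 dimensions, secretly drawn from 4 groups
        X, true_groups = make_blobs(n_samples=1000, n_features=50,
                                    centers=4, random_state=0)

        # KMeans finds the grouping from the raw coordinates alone
        labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

        # Each discovered cluster should line up with one hidden group
        for cluster in range(4):
            counts = np.bincount(true_groups[labels == cluster], minlength=4)
            print(f"cluster {cluster}: mostly true group {counts.argmax()}")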

  • archivistkitsune Registered User regular
    John Michael Godier, an author, science communicator, and futurist I follow on YouTube, brought up an interesting point in his latest video on SETI signals: most of that work is now being handled by AI, and if other advanced alien civilizations follow a similar trend, it's quite possible that first contact won't be us and an alien race directly communicating with one another, but rather our communications and science AIs initiating everything.

    The concern with AI really isn't that we can automate tasks; as pointed out in the OP, we've been doing that shit ever since we figured out the concept of tools. In fact, there isn't anything wrong with automating tasks. There are some tasks that are pretty shitty and no one wants to do them, tasks where there just aren't enough people to get everything done, and other tasks where a machine just does a better job. The issue is how we have structured our society.

    If we lived in a society where people were guaranteed reasonable food, shelter, healthcare, and access to entertainment, then there wouldn't be an issue. But in our capitalistic hellscape people are forced to toil for shitheads to get things that should be guaranteed, so that they can survive. So automation becomes a huge problem, because each task that can be automated is a job that people can't take anymore. Eventually, there is a risk we run out of jobs that people can take because everything is automated. Worse, we don't have to hit a point where all the jobs have been automated for the parasite class to push automation levels to a point where we run into major social unrest.

    One reason why tempers probably flare over AI art really isn't about what constitutes art. Rather, it's the concern that a bunch of individuals, many of whom aren't paid well to begin with, will at worst be stuck out on the street with no means to support themselves, and at best be forced to work really shitty jobs, with really shitty wages and benefits and probably bullshit hours. Not saying that the discussion would stay completely civil, because there are some truly insufferable people on both sides of that discussion, though it would likely be much more civil if people didn't have to fear dying on the street in a land of plenty because they weren't getting a paycheck to enable them to buy anything.

  • This content has been removed.

  • Tumin Registered User regular
    edited February 2023
    Tumin wrote: »
    Phyphor wrote: »
    From where I'm standing, automating creative jobs away only makes sense from the point of view of capitalism. If you look at it from other viewpoints it's a detriment to us as a people.

    If thinking "being able to produce these things cheaper and funnel money upwards faster is good for society" mixed with "it'll be great when the ultra wealthy have even more control over our culture" is your bag, then of course you have no hesitation here. Or, if you somehow have faith that our society will make the right choices and eventually give everyone a real standard of living as all the jobs are automated away, that's great too.

    But I have very little faith in capitalism or our society to take care of the underprivileged. So I'm more than hesitant to say automating away the less soul crushing jobs our betters deem worthy to pay us for is a good thing that we need to run towards.

    Is AI art that, though? It doesn't have to be. It could be a tool. It could be useful without locking people out of being artists for a living. But unregulated, without any real checks? No, that's not something I'm for.

    Edit: I think there would be a lot less hesitation here if our society actively had a real social safety net. If not working and being alive was an option to most people. But automating away non-dangerous jobs before having those safeguards is putting the cart before the horse.

    Citation needed on this. People keep saying it as a point of faith, but how is cheaper art from generative systems mostly of a benefit to larger players?

    Because at the moment, and for the foreseeable future, it isn't. The biggest loss indie game developers take against large players is resources for multimedia - voice, texturing, character design. You're constrained by the time it takes to execute on good ideas, and all of that is bound up in cost: if you wanted VA before now, you needed a studio, the mixing, the coordination for it. All jobs, all time consuming, all costly.

    And all things that large studios have an absolute monopoly over. No one can compete in quality, because it's so thoroughly out of the price range of everyone else that you can never really be plausibly as good.

    But - all of those fronts are the ones currently being assailed by improvements in automation. The CGI that a short film maker can put in something today is astounding. Indie game developers now have far cheaper options for getting their product up to a solid feel of fit and finish from the availability of game engines and now the availability of artistic resources they can cheaply and easily customize - which is key. Using a prepackaged anything is cheap, deviations add cost - even big studios which over-reach get stuck with how much it costs to get a VA back if you realize your script isn't working. But all of that is doing nothing but getting cheaper and more accessible.

    How is that going to funnel money upwards? Because from where I'm standing, the price of quality is getting lower, the barriers to entry for everything are dropping and the walled AAA garden ain't looking so protected anymore. Yet for some reason nobody seems to think market-forces exist any more, despite the fact that every manufacturable product has gotten cheaper and cheaper since I was a child.

    I have a couple ideas for video games that I want to make, but my choices are either programmer art or burn my savings to commission art. These aren't going to make enough to pay me minimum wage so programmer art it is. Or AI art now

    So we cut human creativity out of making art for games. Soon we have the AI write the plot and program the game.

    If robots are going to do labor and create our cultural output, what is left for humanity to do?

    The whole idea of "but I can have a game tailor made for me" seems....problematic? Like it will just increase the cycle of people not dealing with beliefs that they don't agree with?

    Idk, this future where I never have to interact with another human or their work just seems so cold and impersonal.

    Most wildly successful humans are already trading more in cultural cachet and curation of aesthetic than in creating works of art themselves.

    The dystopian robot-laborer future is about influencing and experiencing and not about making things that you need. Unless making things is part of some desirable self concept and aesthetic.

    The myth of the sole artist creating Great Works instead of most art being the product of teams executing a shared hallucination and trying to get it into a firm medium is silly. Is the game producer or art director who signed off on an artist's concept doing something different than someone signing off on AI-proposed concepts?

    I mean I'd argue yes but I feel you'd disagree so I'll just leave it.

    I'm likeliest to move on that; it's a test balloon for whether the argument is durable. It's a weird engagement to paint me as unmovable on something where I'm engaging entirely with your premise.

    But the art AI is probably a thread killer discussion anyway

    Or rather, it already has a thread

    Tumin on
  • This content has been removed.

  • Morninglord I'm tired of being Batman, so today I'll be Owl. Registered User regular
    edited February 2023
    I'm not so concerned about these current AIs in terms of destroying human meaning and culture as I am about what is going to happen to people when real general AI is released, capable of making works of staggering beauty and meaning genuinely indistinguishable from human-made ones, with demonstrable intent, and it turns out humans actually have to face the fact that there isn't anything intrinsically special about them.

    In terms of a cultural shock, it's akin to alien first contact. We would suddenly not be alone, and we would have made it ourselves.

    People complaining about the human aspects of it will lose their minds. Of course, first the AI will have to convincingly smash the ever-moving goalposts of "it's not real intelligence", but one will arrive that does, with no limits, and indeed fewer limits than a human. There may very well be one that can analyse the person it's talking to and choose exactly the right kind of argument or method of proof that convinces that person.

    Right now, wondering about "meaning" is extremely premature. These things aren't there yet. There are practical problems like people suddenly not being able to pay rent or eat that are much bigger deals.

    Cross that other bridge when we come to it. Worrying about it first isn't really going to help you acclimatise to it. It will still be a massive shock regardless. Nobody is truly prepared. The myth of the fallen angel meets the rising ape, and the mystical specialness of people, is intrinsic in all but the most insanely hardcore determinists. It's an unconscious bedrock in most of the world's current cultures.

    Morninglord on
    (PSN: Morninglord) (Steam: Morninglord) (WiiU: Morninglord22) I like to record and toss up a lot of random gaming videos here.
  • Polaritie Sleepy Registered User regular
    General AI isn't happening any time soon. I don't think anyone's working on anything even remotely geared towards it. And conscious thought isn't even understood in human brains, so there's no basis to work from anyways.

    Steam: Polaritie
    3DS: 0473-8507-2652
    Switch: SW-5185-4991-5118
    PSN: AbEntropy
  • Morninglord I'm tired of being Batman, so today I'll be Owl. Registered User regular
    Basically if you think the artists being mad is a big deal now wait until it's good enough to replace philosophers.

    (PSN: Morninglord) (Steam: Morninglord) (WiiU: Morninglord22) I like to record and toss up a lot of random gaming videos here.
  • archivistkitsune Registered User regular
    Well given that we don't truly understand what makes us tick, it's a bit arrogant to assume that we can make something sentient from scratch until we figure that out. It's possible that neither we nor potential aliens ever figure out how to make sentient robots. That said, it's kind of specious to focus on that tangent, because it is a very far off issue at worst and maybe a non-issue at best.

    What does matter and is real is that we are getting better and better at making AIs that can perform more and more tasks that were once things only humans could do, and with that comes the reality that people are losing their jobs and their ability to support themselves. In fact, last I checked, we had a fuck ton of jobs that involve transportation, and to briefly cross streams, truly autonomous vehicles for transporting goods and people are probably the first major risk from the advancement of AI that will put us in a situation where you have a fuck ton of social unrest from the job losses to automation. It's not just a massive fucking number of jobs being lost, but also the loss of one of the last major employment options for people that can't cut it as office workers.

  • Cornucopiist Registered User regular
    edited February 2023
    My best case scenario is to have an economy that is automated to be sustainable, where the human economic role is somewhat like in a living museum (in that it is intended to create meaningful human activity), and the human motivation is that of a village brass band (as an example, in that those are motivated by a proven mix of social relevance, self-improvement, and competition).

    The key element for it to work is that all that activity is not claimed to be useful simply because it either creates ROI or supports a hierarchy.

    Which is going to be a hell of a thing to abandon. Communism and socialism still haven’t figured out how they went from Soviets to Stalin or why union bosses keep being, well, bosses.
    Conservatives and neoliberals are of course very far from admitting there's anything wrong with human indentured labor or with discarding people not contributing to ROI.
    Anarchy (and the anarchic left) is too deeply rooted in anti-industrialism to accept that an automated back-end is needed to prop up a human-centric living museum. I think they are on the most viable path, but that's a subset of a subset.

    Western political ideologies are shaped by schismogenesis and that takes time, so I don’t see the whole thing materialising.
    Maybe China figures it out?

    Cornucopiist on
  • Morninglord I'm tired of being Batman, so today I'll be Owl. Registered User regular
    edited February 2023
    Mill wrote: »
    Well given that we don't truly understand what makes us tick, it's a bit arrogant to assume that we can make something sentient from scratch until we figure that out. It's possible that neither we or potential aliens figure out how to make sentient robots. That said, it's kind of specious to focus on that tangent because in the worst case, that is a very far off issue and maybe a non issues at best.

    This is a non-argument. We have no idea what makes people tick, yet we manage to make functional children turn into adults all the time. You don't have to understand something to achieve a goal.

    Humanity had no idea how fire actually worked for most of that technology's existence. You just do a set of actions and you get warmth and heat.

    The idea that we have to understand people to make sentience is trivially irrelevant.

    But also you didn't really get my point, since I was saying "don't worry about the meaning stuff right now, there's more important practical concerns", which you dismissed and then reiterated.

    Morninglord on
    (PSN: Morninglord) (Steam: Morninglord) (WiiU: Morninglord22) I like to record and toss up a lot of random gaming videos here.
  • archivistkitsune Registered User regular
    There is a reason why I used the term "from scratch": because I knew someone would go "but humans can have babies that become sentient beings."

  • Morninglord I'm tired of being Batman, so today I'll be Owl. Registered User regular
    edited February 2023
    Mill wrote: »
    There is a reason why I used the term "from scratch," because I knew someone would go "but humans can have babies that become sentient being."

    They do not make AI from scratch. They create a set of rules and the AI figures out most of it on its own. We already have no real idea how most of these AIs make any given decision. They are black boxes.

    I'm sorry dude, it's a non-counterpoint. It isn't real.
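    To put that "set of rules" point in concrete terms, here's a minimal sketch (plain numpy on a toy XOR problem, nothing like a production model): a human writes the architecture and the update loop, but the numbers that actually make the decisions are learned from the data, not written by anyone.

        # We write the "rules" (architecture, update step); the decision-making
        # weights start as random noise and get filled in by training.
        import numpy as np

        rng = np.random.default_rng(0)
        X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
        y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR target

        W1 = rng.normal(size=(2, 8))
        b1 = np.zeros(8)
        W2 = rng.normal(size=(8, 1))
        b2 = np.zeros(1)

        def sigmoid(z):
            return 1.0 / (1.0 + np.exp(-z))

        for step in range(5000):
            h = np.tanh(X @ W1 + b1)          # hidden layer
            p = sigmoid(h @ W2 + b2)          # prediction
            dp = p - y                        # error signal
            dW2 = h.T @ dp
            db2 = dp.sum(axis=0)
            dh = (dp @ W2.T) * (1.0 - h ** 2)
            dW1 = X.T @ dh
            db1 = dh.sum(axis=0)
            for param, grad in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
                param -= 0.1 * grad           # nudge every weight a little

        print(np.round(p, 2).ravel())  # roughly [0, 1, 1, 0]: it learned XOR
        print(W1)                      # the learned weights: nobody wrote these numbers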

    Morninglord on
    (PSN: Morninglord) (Steam: Morninglord) (WiiU: Morninglord22) I like to record and toss up a lot of random gaming videos here.
  • Cornucopiist Registered User regular
    edited February 2023
    ..(butt posted)

    Cornucopiist on
  • Morninglord I'm tired of being Batman, so today I'll be Owl. Registered User regular
    edited February 2023
    Also, please do not accuse me of arrogance while implying that sentience is a real thing that actually exists, and not just an illusion of a pattern recognition machine glitching out when it tries to look inwards despite having no actual internal connections or inbuilt pathways to do such a thing.

    It's entirely possible the reason we haven't worked out sentience is because it's just an invented fiction. Claiming we have to achieve that without considering whether it's a concept that has any value is a statement dripping with the arrogant assumption that "humans are special".

    I'm open to both paths because I'm a hardcore empiricist and I have to follow the evidence. But right now the evidence very much isn't leaning towards assuming sentience is real. The current consensus in neuroscience is that consciousness runs parallel to the processing that occurs that actually causes you to do what you do, and this parallel process has very little actual control over what happens. It's not that you are what you think. You are regardless of what you think, most of the time.

    Morninglord on
    (PSN: Morninglord) (Steam: Morninglord) (WiiU: Morninglord22) I like to record and toss up a lot of random gaming videos here.
  • Quid Definitely not a banana Registered User regular
    @Evil Multifarious
    I specifically identified carpentry as an example of a career that is similarly productive and meaningful and valuable compared to professional art. If carpenters lost jobs en masse and had to find new careers, and they ended up in worse jobs — jobs that paid worse, or jobs that were less fulfilling or socially valuable — then that's an enormous social harm. If they can no longer spend much time on the thing they love because now the bulk of their day must be spent in immiseration to feed and house themselves, that's an enormous social harm.

    The only difference is that carpenters being replaced by machines on an industrial scale has, largely, already happened, and people would like to prevent those social harms from happening this time.

    Carpenters aren't the only people affected in this scenario though. The ability for consumers to buy tables (and other goods) that otherwise wouldn't be nearly as cheap, if not completely unavailable, is a massive benefit to society. One I'd argue is more valuable than artificially propping up a shrinking industry when improved welfare and retraining would work just as well.

    It sucks for the people who want to make their living doing that specific job, but they're a tiny sliver of the population compared to those whose lives are improved through mass production.

  • Cornucopiist Registered User regular
    edited February 2023
    Also, please do not accuse me of arrogance while implying that sentience is a real thing that actually exists, and not just an illusion of a pattern recognition machine glitching out when it tries to look inwards despite having no actual internal connections or inbuilt pathways to do such a thing.

    It's entirely possible the reason we haven't worked out sentience is because it's just an invented fiction. Claiming we have to achieve that without considering whether it's a concept that has any value is a statement dripping with the arrogant assumption that "humans are special".

    I'm open to both paths because I'm a hardcore empiricist and I have to follow the evidence. But right now the evidence very much isn't leaning towards assuming sentience is real. The current consensus in neuroscience is that consciousness runs parallel to the processing that occurs that actually causes you to do what you do, and this parallel process has very little actual control over what happens. It's not that you are what you think. You are regardless of what you think, most of the time.

    Also you can really easily have internal voices that just split off and start running their own autonomous consciousness thread. Sure, your own internal voice is your one true self. Absolutely. (Edit: /sarcasm)
    But then there’s folks that narrate everything that happens as an observation. And people that don’t have an internal voice at all.

    Cornucopiist on
  • Morninglord I'm tired of being Batman, so today I'll be Owl. Registered User regular
    edited February 2023
    Is it your one true self? The mindfulness meditation aspect of Buddhism is one of the most successful non-science-originated interventions (in terms of bringing in outside therapies) for some conditions, and the whole idea of that is essentially training to edit away the internal voices and live in the moment.

    And when people do that, the natural state is happiness.

    There's a lot of good psychological and neuroscientific evidence, tbh, that supports that the big B got it right, and that what we think of as a self is a pretty pervasive illusion.

    So if we did have an AI that was equivalent to a human, would we even recognise it, or would we fault it for not having constructs that may not even be real? 8ball points to yes.

    Morninglord on
    (PSN: Morninglord) (Steam: Morninglord) (WiiU: Morninglord22) I like to record and toss up a lot of random gaming videos here.
  • Hexmage-PA Registered User regular
    edited February 2023
    Voice actors are increasingly being asked to sign rights to their voices away so clients can use artificial intelligence to generate synthetic versions that could eventually replace them, and sometimes without additional compensation, according to advocacy organizations and actors who spoke to Motherboard. 
    ---
    In January, Motherboard reported how members of 4chan quickly took a beta program from artificial voice company ElevenLabs and used it to generate voices of celebrities, including Emma Watson reading sections of Mein Kampf. 
    ---
    “Many voice actors may have signed a contract without realizing language like this had been added. We are also finding clauses in contracts for non-synthetic voice jobs that give away the rights to use an actor's voice for synthetic voice training or creation without any additional compensation or approval. Some actors are being told they cannot be hired without agreeing to these clauses.”

    https://www.vice.com/en/article/5d37za/voice-actors-sign-away-rights-to-artificial-intelligence

    Hexmage-PA on
  • raging_storm Registered User regular
    Quid wrote: »
    @Evil Multifarious
    I specifically identified carpentry as an example of a career that is similarly productive and meaningful and valuable compared to professional art. If carpenters lost jobs en masse and had to find new careers, and they ended up in worse jobs — jobs that paid worse, or jobs that were less fulfilling or socially valuable — then that's an enormous social harm. If they can no longer spend much time on the thing they love because now the bulk of their day must be spent in immiseration to feed and house themselves, that's an enormous social harm.

    The only difference is that carpenters being replaced by machines on an industrial scale has, largely, already happened, and people would like to prevent those social harms from happening this time.

    Carpenters aren't the only people affected in this scenario though. The ability for consumers to buy tables (and other goods) that otherwise wouldn't be nearly as cheap, if not completely unavailable, is a massive benefit to society. One I'd argue is more valuable than artificially propping up a shrinking industry when improved welfare and retraining would work just as well.

    It sucks for the people who want to make their living doing that specific job, but they're a tiny sliver of the population compared to those whose lives are improved through mass production.

    You sort of have to realize there are two facets to things. Let's take carpentry. It's good, honest, beautiful, if backbreaking, work. While machines have killed most of it, handmade wooden goods are still in high demand and highly valued. Case in point: we bought our "forever" table recently. It could range from a few hundred for a machine-made thing to thousands for a handmade thing. You could also get a few-thousand-dollar machine-made thing, but that doesn't compare to the handmade one, let alone a really good handmade one.

    Now not all of that handmade one was handmade. I'm sure machines were involved with the cutting of the tree and the cutting of the basic parts, but the craft was all the couple who made it.

    The solution to all this is some sort of UBI. In our case we bought a nice table for sub five figures from an old retired couple that just made wood goods because they like to make wood goods. They gave us a discount, and then we realized they made bowls and spoons and all sorts of things. A friend of mine is a blacksmith. By which I mean he's retired military who lives in rural Georgia on his pension and liked making things out of metal, so he bought a huge tract of land, set up a forge, and makes craft knives and other nice things. He sells them for hundreds to thousands and they are all works of art.

    The catch is these people are able to do what they do because they have a set income each month and don't have to worry about the bills. We can entirely robot-and-AI most people's jobs away and let them do what they love to do if we agree to give everyone a basic quality of life. If my house had all our things set and I did not have to work, I'd brew beer, roll cigars, make pickles, and play games. The SO would sing and record songs, along with making food for everyone in her life, and we'd be happy and spend more time with our cat.

    We can have robot made stuff and AI done stuff and still make, trade, sell, and gift goods we put our heart into. The only reason it's either or is raw damn greed.

  • Styrofoam Sammich WANT. normal (not weird) Registered User regular
    Hexmage-PA wrote: »
    Voice actors are increasingly being asked to sign rights to their voices away so clients can use artificial intelligence to generate synthetic versions that could eventually replace them, and sometimes without additional compensation, according to advocacy organizations and actors who spoke to Motherboard. 
    ---
    In January, Motherboard reported how members of 4chan quickly took a beta program from artificial voice company ElevenLabs and used it to generate voices of celebrities, including Emma Watson reading sections of Mein Kampf. 
    ---
    “Many voice actors may have signed a contract without realizing language like this had been added. We are also finding clauses in contracts for non-synthetic voice jobs that give away the rights to use an actor's voice for synthetic voice training or creation without any additional compensation or approval. Some actors are being told they cannot be hired without agreeing to these clauses.”

    https://www.vice.com/en/article/5d37za/voice-actors-sign-away-rights-to-artificial-intelligence

    Rich artists and undermining new artists, name a more iconic combo

  • Gilgaron Registered User regular
    I enjoy woodworking but have often thought about what output I would need to make a living at it, but also how inexpensive the tools and machines are for me vs my forebears because of the same automation that makes it a hobby rather than a profession. Not that there aren't plenty of full-time woodworkers, but it is more akin to striking it big as a professional athlete vs sticking with STEM employment, which is still largely 'screwing around on the govt's dime'.

  • Yar Registered User regular
    Basically if you think the artists being mad is a big deal now wait until it's good enough to replace philosophers.

    Part of the idea of the singularity, if you subscribe to any of the many versions of what that's supposed to mean, is that once the AI becomes smarter than us, we literally cannot predict what happens next.

    But I asked one to write me a haiku on a subject that was itself a philosophical contradiction, and it performed amazingly well.

  • Heffling No Pic Ever Registered User regular
    Yar wrote: »
    Basically if you think the artists being mad is a big deal now wait until it's good enough to replace philosophers.

    Part of the idea of the singularity, if you subscribe to any of the many versions of what that's supposed to mean, is that once the AI becomes smarter than us, we literally cannot predict what happens next.

    But I asked one to write me a haiku on a subject that was itself a philosophical contradiction, and it performed amazingly well.

    We can't predict what happens next, regardless. If I could, I'd be rich from the Stock Market.

  • Yar Registered User regular
    edited April 2023
    Heffling wrote: »
    Yar wrote: »
    Basically if you think the artists being mad is a big deal now wait until it's good enough to replace philosophers.

    Part of the idea of the singularity, if you subscribe to any of the many versions of what that's supposed to mean, is that once the AI becomes smarter than us, we literally cannot predict what happens next.

    But I asked one to write me a haiku on a subject that was itself a philosophical contradiction, and it performed amazingly well.

    We can't predict what happens next, regardless. If I could, I'd be rich from the Stock Market.

    Eh... that was a silly response. Lots of people make a lot of money predicting the stock market. There is a lot of rational basis for prediction there. Whether or not you can be guaranteed to be correct, or guaranteed enough to warrant a risky position, those are different matters.

    However, past the AI singularity, there is no possibility of any rational basis to make any reasonable prediction, accuracy notwithstanding. That's why it's called a singularity.

    Yar on
  • Heffling No Pic Ever Registered User regular
    Yar wrote: »
    Heffling wrote: »
    Yar wrote: »
    Basically if you think the artists being mad is a big deal now wait until it's good enough to replace philosophers.

    Part of the idea of the singularity, if you subscribe to any of the many versions of what that's supposed to mean, is that once the AI becomes smarter than us, we literally cannot predict what happens next.

    But I asked one to write me a haiku on a subject that was itself a philosophical contradiction, and it performed amazingly well.

    We can't predict what happens next, regardless. If I could, I'd be rich from the Stock Market.

    Eh... that was a silly response. Lots of people make a lot of money predicting the stock market. There is a lot of rational basis for prediction there. Whether or not you can be guaranteed to be correct, or guaranteed enough to warrant a risky position, those are different matters.

    However, past the AI singularity, there is no possibility of any rational basis to make any reasonable prediction, accuracy notwithstanding. That's why it's called a singularity.

    Statistically speaking, all of the studies have shown that your best bet when investing in stocks is to invest in mutual funds. Otherwise you're just gambling, and the house always wins.

  • HamHamJ Registered User regular
    Heffling wrote: »
    Yar wrote: »
    Heffling wrote: »
    Yar wrote: »
    Basically if you think the artists being mad is a big deal now wait until it's good enough to replace philosophers.

    Part of the idea of the singularity, if you subscribe to any of the many versions of what that's supposed to mean, is that once the AI becomes smarter than us, we literally cannot predict what happens next.

    But I asked one to write me a haiku on a subject that was itself a philosophical contradiction, and it performed amazingly well.

    We can't predict what happens next, regardless. If I could, I'd be rich from the Stock Market.

    Eh... that was a silly response. Lots of people make a lot of money predicting the stock market. There is a lot of rational basis for prediction there. Whether or not you can be guaranteed to be correct, or guaranteed enough to warrant a risky position, those are different matters.

    However, past the AI singularity, there is no possibility of any rational basis to make any reasonable prediction, accuracy notwithstanding. That's why it's called a singularity.

    Statistically speaking, all of the studies have shown that your best bet when investing in stocks is to invest in mutual funds. Otherwise you're just gambling, and the house always wins.

    Did you mean index funds? Because I am pretty sure that generally mutual funds are managed portfolios, so you are paying someone to predict the market for you.
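    To make the fee point concrete, here's a rough back-of-the-envelope sketch (every number is invented for illustration, and it's just compounding, not investment advice): a manager charging around 1% a year has to beat the index by about that much every year just to break even with a cheap index fund.

        # Fee drag over 30 years, same assumed 7% gross return, invented numbers.
        def final_value(principal, years, gross_return, expense_ratio):
            return principal * (1 + gross_return - expense_ratio) ** years

        start, years, gross = 10_000, 30, 0.07
        index_fund = final_value(start, years, gross, 0.0005)  # ~0.05% fee
        managed = final_value(start, years, gross, 0.01)       # ~1% fee
        print(f"index fund: ${index_fund:,.0f}")
        print(f"managed:    ${managed:,.0f}")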

    While racing light mechs, your Urbanmech comes in second place, but only because it ran out of ammo.
  • Heffling No Pic Ever Registered User regular
    HamHamJ wrote: »
    Heffling wrote: »
    Yar wrote: »
    Heffling wrote: »
    Yar wrote: »
    Basically if you think the artists being mad is a big deal now wait until it's good enough to replace philosophers.

    Part of the idea of the singularity, if you subscribe to any of the many versions of what that's supposed to mean, is that once the AI becomes smarter than us, we literally cannot predict what happens next.

    But I asked one to write me a haiku on a subject that was itself a philosophical contradiction, and it performed amazingly well.

    We can't predict what happens next, regardless. If I could, I'd be rich from the Stock Market.

    Eh... that was a silly response. Lots of people make a lot of money predicting the stock market. There is a lot of rational basis for prediction there. Whether or not you can be guaranteed to be correct, or guaranteed enough to warrant a risky position, those are different matters.

    However, past the AI singularity, there is no possibility of any rational basis to make any reasonable prediction, accuracy notwithstanding. That's why it's called a singularity.

    Statistically speaking, all of the studies have shown that your best bet when investing in stocks is to invest in mutual funds. Otherwise you're just gambling, and the house always wins.

    Did you mean index funds? Because I am pretty sure that generally mutual funds are managed portfolios, so you are paying someone to predict the market for you.

    Sorry yes. If I'm bad at investing terminology, it's because I'm good at recognizing that micromanaging my investments is a bad idea. =)

  • Quid Definitely not a banana Registered User regular
    An exchange (basically a military convenience store) on base with self checkout now requires waiting 15 seconds before opting out of donating a dollar.

    1. Go fuck yourself, I'm not paying for the privilege of being allowed to leave your store sooner.
    2. Capitalism destroys another useful aspect of automation.

  • Heffling No Pic Ever Registered User regular
    Creating inefficiencies then charging to solve them is a solid revenue generator.

  • Quid Definitely not a banana Registered User regular
    (gif)

  • archivistkitsune Registered User regular
    Kyle Hill, a YouTube science communicator, did a video breaking down the problem with all the AI bots that are currently dominating the discussion on AI.

    https://www.youtube.com/watch?v=l7tWoPk25yU&t=1s

    My only critique of the video is that while a human can't conceptualize how a bot interprets the world, I would argue it's a misnomer to call it a black box, even when we consider that many of the assholes pushing this stuff likely have no idea how it fucking works. Let's be honest: while inventor capitalist dipshit bros have no idea how any of this shit works, if they have any basic business sense and control over a company, they are going to insist on setting things up so that the AI consistently returns results that they want. Yes, people will find areas to exploit, but that isn't really new for anyone that has been paying any attention to computer security software.

    Also, my old man was talking about how the Pentagon had developed a sentry bot and then tasked US Marines with finding its weak points. Found an article about that here. Essentially, they found ways to beat it. One of these methods included doing somersaults while approaching it. Another method included the old cartoon trope of approaching while disguised as a tree. Finally, one approach just went the Metal Gear Solid route and went with the trusty cardboard box. Yes, your taxpayer dollars hard at work recreating the Metal Gear Solid grunt NPC AI.

  • This content has been removed.

  • Quid Definitely not a banana Registered User regular
    https://www.npr.org/2023/05/17/1176600718/a-mannequin-in-georgia-is-one-of-the-first-to-use-ai-to-help-train-nurses

    AI guided mannequins are starting to replace regular mannequins and actors for training nurses.
    SWAN: For me, who is a person who practiced on each other and oranges, to now be here doing this, I think with the AI it's going to explode. I mean, I think it's going to remake what's possible.

    Unfortunately it currently comes with a $170k price tag.

  • Incenjucar VChatter Seattle, WA Registered User regular
    Clear! *zap* Clear! *zap*

    Since we've been cluttering up the Crypto thread with LLMs and whatnot, and I've been doing an absurd amount of research on the topic lately, thought I'd share this very good link on the topic of AI art + writing that really goes into a solid balanced view, incorporating both the "this really sucks in a lot of ways" and "there's some great stuff that can come from this" sides, coming from a person whose life is about writing: https://www.youtube.com/watch?v=9xJCzKdPyCo&ab_channel=HelloFutureMe

  • Incenjucar VChatter Seattle, WA Registered User regular
    For career reasons, I basically *have* to become at least a passable expert on "AI" stuff, but thankfully that career means I'll be doing so for ethical purposes and to achieve ethical ends.

    As part of my personal projects, and to get some practice in, I'm going to be looking into using LLMs to help me expand on my voice game so it can offer better descriptions and more diverse locations for users, and generally expand the content. As part of this I'm going to be seeking out the most ethical variant of this technology I can, and using my own writing as data instead of just sucking on internet scrapings. I may also do something with the art as well, if I can find an ethically-sourced generator, but only after consulting with my artist and acquiring their consent. If they are not interested, I may use my own art instead. My work is specifically intended to be an example of making games available to people with visual and physical impairments, so being able to make that even easier for people to copy could do a lot of good, and I'm hoping I can ALSO use it as an example of how to use this technology ethically.
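    As a rough sketch of what "my own writing as data instead of internet scrapings" could look like in practice (everything here is hypothetical: the folder name, the prompt wording, and the missing generation step, since I haven't picked a model or API yet), the idea is just to pull the most relevant passages from my own files and hand only those to whatever generator ends up on the other end:

        # Hypothetical sketch: ground a description generator in my own prose.
        # Pure standard library; the actual model call is deliberately left out.
        from pathlib import Path

        def load_my_writing(folder: str) -> list[str]:
            """Read my own .txt files and split them into paragraph-sized chunks."""
            chunks = []
            for path in Path(folder).glob("*.txt"):
                chunks += [p.strip() for p in path.read_text().split("\n\n") if p.strip()]
            return chunks

        def most_relevant(chunks: list[str], query: str, k: int = 3) -> list[str]:
            """Crude keyword-overlap retrieval; a real setup might use embeddings."""
            words = set(query.lower().split())
            return sorted(chunks,
                          key=lambda c: len(words & set(c.lower().split())),
                          reverse=True)[:k]

        def build_prompt(query: str, folder: str = "my_writing") -> str:
            context = "\n---\n".join(most_relevant(load_my_writing(folder), query))
            return ("Using only the style and details of the excerpts below, "
                    f"describe: {query}\n\nExcerpts:\n{context}")

        # Whatever model is used, it only ever sees my own writing as source text.
        print(build_prompt("a moss-covered stone bridge at dusk"))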

This discussion has been closed.