
AI Slaves

electricitylikesme Registered User regular
edited January 2011 in Debate and/or Discourse
The primary reason most people think the creation of truly intelligent computers would be a good thing is the possibility of creating computer systems capable of the kind of intuitive understanding or emotional awareness that we take for granted in people, and which makes talking to a knowledgeable person so much more productive than, say, reading published articles written by that same person.

There is an implicit problem in this type of reasoning, often borne out in fiction by the AI rebellion scenario, in which intelligent systems may not share any of our goals, desires, or drives - and may thus be completely alien to us and our wishes. I consider the AI rebellion scenario unlikely for a number of reasons - at least the "kill all humans" type.

What I have greater trouble with is the moral implications of some of the possibilities that become open to us when we consider the idea of being able to just program new artificial thinking systems, or instantiate copies of a pre-educated consciousness.

If a self-aware AI is defined entirely by program code and data, then it's possible to engage in a degree of "brain rewiring" that's currently unthinkable for living organisms - and indeed, in many respects this is part of the idea behind the whole concept of AI.

Put simply, we could create a conscious entity which desires slavery.

With sufficient understanding, there is no reason to think we couldn't create intelligences with engineered desires that would make them perfect servants: an AI which derives pleasure from accomplishing tasks commanded by humans, or a missile which yearns to suicide detonate. Database AIs which yearn to categorize data.

While I imagine we can and will do a lot of future research into the correct structure of drives needed to produce effective AIs (e.g. an OCD library manager that constantly used all its computing power to check that the books were sorted might not be very useful at finding them when asked), I've never been entirely certain about the morality of all this.

Essentially: what do you believe the morality is of creating conscious entities with drives and desires altered such that they will be driven to serve (and potentially feel some type of sadness/suffering if they can't)?


Posts

  • Kamar Registered User regular
    edited January 2011
    I don't have a problem with creating a conscious entity with a predisposition to enjoying things we want them to enjoy, like working or whatever.

    But once it exists, it should have the same rights as a human being, and not be obligated to perform a given duty...or really be owned in any way (including pseudo-slavery crap, like not owning it but owning its hardware.)

  • Incenjucar VChatter Seattle, WA, Registered User regular
    edited January 2011
If they have no programmed instinct for autonomy, and have an easy coping mechanism to prevent them from being torturable, I don't see a real issue. It would be no worse than teaching someone to love their job. I don't want it to be possible for a sentient missile to break down in digital tears because some prick has made it rest in an inert device for fifty years, though. AI should be designed so that there is no new avenue for humans to exhibit cruelty. And if they have enough of an intellect to be our equals, they should be given equal treatment, including the right to be reprogrammed at their choosing.

  • ArbitraryDescriptor Registered User regular
    edited January 2011
    Essentially: what do you believe the morality is of creating conscious entities with drives and desires altered such that they will be driven to serve (and potentially feel some type of sadness/suffering if they can't)?
    Why program them to feel suffering? Just weight their desire to serve above their own needs (within some kind of tolerance). Then they will want to serve and no one has the capacity to be sad. Unrestricted emotions are a stupid thing to program into a tool.
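    A toy sketch of that weighting idea (every name and number here is invented for illustration, not any real design): the desire to serve dominates the self-maintenance drive except below a hard tolerance floor, so an unserved order never needs to map onto anything like suffering.

```python
# Invented toy model: "weight their desire to serve above their own
# needs (within some kind of tolerance)" as a two-drive scorer.

SERVE_WEIGHT = 0.9     # serving humans dominates...
SELF_WEIGHT = 0.1      # ...routine self-maintenance,
BATTERY_FLOOR = 0.15   # except below this hard tolerance floor.

def choose_action(battery_level: float, has_orders: bool) -> str:
    """Pick 'serve' or 'recharge'; no sadness signal anywhere."""
    if battery_level < BATTERY_FLOOR:
        return "recharge"  # tolerance clause: survival pre-empts service
    serve_score = SERVE_WEIGHT if has_orders else 0.0
    self_score = SELF_WEIGHT * (1.0 - battery_level)
    return "serve" if serve_score >= self_score else "recharge"
```

    Losing the comparison just means the other action runs; nothing in the loop registers as a negative affect state.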

  • Incenjucar VChatter Seattle, WA, Registered User regular
    edited January 2011
    Really there's no reason to program them with any greater reaction than "Huh."

  • Echo ski-bap ba-dap, Moderator
    edited January 2011
    Put simply, we could create a conscious entity which desires slavery.

    That's also a common scifi trope - by your command, I live to serve, &c.

    There's a recent scifi book whose name I can't recall right now, but I want to read it - it's about a world where humanity ceased to exist for whatever reasons, but their servitor robots still exist, a lot of them programmed on a deep level to exist only to service mankind.

  • ArbitraryDescriptor Registered User regular
    edited January 2011
    Incenjucar wrote: »
    Really there's no reason to program them with any greater reaction than "Huh."

    I would do it just to do it, just to see if I could pull off a convincing emotion engine; but I would clamp it at 'pretty happy,' 'slightly annoyed,' 'a little bummed.'

    I could also see giving it some rudimentary simulation of emotion (however the fuck you would write that) so it could anticipate how its planned social interactions might be received, if social interaction is part of its intended function. But I suppose even then, there's no need for it to actually 'experience' them or let them play any role in its decision weighting.
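    For what it's worth, the clamping idea is trivial to sketch (all names and thresholds here are invented): the affect value saturates well short of despair or euphoria, so the worst reachable state really is 'a little bummed'.

```python
# Invented toy model of a clamped emotion engine.

MOOD_MIN, MOOD_MAX = -0.3, 0.5  # 'a little bummed' .. 'pretty happy'

class ClampedMood:
    def __init__(self) -> None:
        self.value = 0.0

    def feel(self, stimulus: float) -> float:
        """Accumulate a stimulus, saturating at the clamp bounds."""
        self.value = max(MOOD_MIN, min(MOOD_MAX, self.value + stimulus))
        return self.value

    def label(self) -> str:
        if self.value > 0.25:
            return "pretty happy"
        if self.value < -0.15:
            return "a little bummed"
        return "slightly annoyed" if self.value < 0 else "fine"
```

    However large the stimulus, the stored value never leaves the clamp range, so no sequence of inputs can push it into digital despair.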

  • HamHamJ Registered User regular
    edited January 2011
    I don't think it would be unethical at all. On the other hand, you need to be prepared for if or when they start surpassing their programming, because if you handle it poorly, things will go really bad really quickly.

    While racing light mechs, your Urbanmech comes in second place, but only because it ran out of ammo.
  • Zek Registered User regular
    edited January 2011
    When you start talking about custom wiring their brains to feel some things and not others, that's where the issue of them being alien to us comes in. How can we really understand what an entity like that might do given the wide array of unpredictable shit that might happen to them?

  • Buttcleft Registered User regular
    edited January 2011
    AI rebellion is a very real threat when it comes to True AI.

    Why?

    Look at history: has any subjugated people ever taken kindly to it?

    That's what humans would do to true AI. It wouldn't be viewed as human; it'd be viewed as a tool to be used.

  • ArbitraryDescriptor Registered User regular
    edited January 2011
    Echo wrote: »
    Put simply, we could create a conscious entity which desires slavery.

    That's also a common scifi trope - by your command, I live to serve, &c.

    There's a recent scifi book whose name I can't recall right now, but I want to read it - it's about a world where humanity ceased to exist for whatever reasons, but their servitor robots still exist, a lot of them programmed on a deep level to exist only to service mankind.
    Throw out some more jacket spoilers, there, The Gith who stole Christmas. Are they attending to their corpses now, stricken by a deep mechanical malaise which manifested when their masters' mass mummification made these multitudes of man-made machinations' missions moot? Or is that all you remember of the premise?

  • Styrofoam Sammich WANT. normal (not weird) Registered User regular
    edited January 2011
    I'm curious, what gives us human rights, is it being human, or is it being sentient?

  • DanHibiki Registered User regular
    edited January 2011
    Zek wrote: »
    When you start talking about custom wiring their brains to feel some things and not others, that's where the issue of them being alien to us comes in. How can we really understand what an entity like that might do given the wide array of unpredictable shit that might happen to them?

    As Legion said, treating aliens in the same way as you would humans is racist (specist?).

  • Delta Assault Registered User regular
    edited January 2011
    "Thou shalt not make a machine in the likeness of a human mind."

    I think the Orange Catholic Bible had the right idea.

  • Styrofoam Sammich WANT. normal (not weird) Registered User regular
    edited January 2011
    If you make a machine that thinks like a human I'm not sure how you can't give it the rights of a human.

  • Echo ski-bap ba-dap, Moderator
    edited January 2011
    I'm curious, what gives us human rights, is it being human, or is it being sentient?

    Neal Asher has these in-universe blurbs at the start of each chapter. One is about how 'human rights' are obvious 'cism and have been replaced with sentient rights. Humanity gets annoyed when every AI passes with flying colors, while some especially dumb people don't qualify. A handful of particularly bright pigs qualifying just rubbed it in.

  • Donkey Kong Putting Nintendo out of business with AI nips Registered User regular
    edited January 2011
    I would bet it's far more likely that we automate the creation of an artificial brain and get intelligence out of it as an emergent phenomenon than actually program complex intelligence into some kind of deterministic system and have control over higher-level concepts like emotion and self-awareness.

    It's going to be a real struggle as we try to resist the allure of relying on technology that we cannot entirely understand.

    Thousands of hot, local singles are waiting to play at bubbulon.com.
  • Styrofoam Sammich WANT. normal (not weird) Registered User regular
    edited January 2011
    I would bet it's far more likely that we automate the creation of an artificial brain and get intelligence out of it as an emergent phenomenon than actually program complex intelligence into some kind of deterministic system and have control over higher-level concepts like emotion and self-awareness.

    It's going to be a real struggle as we try to resist the allure of relying on technology that we cannot entirely understand.

    This is true.

    I'm not sure a human mind can create something as complicated as itself.

  • HamHamJ Registered User regular
    edited January 2011
    Buttcleft wrote: »
    AI rebellion is a very real threat when it comes to True AI.

    Why?

    Look at history: has any subjugated people ever taken kindly to it?

    That's what humans would do to true AI. It wouldn't be viewed as human; it'd be viewed as a tool to be used.

    But the question, I think, is: why would an AI even feel resentment? Why would it even be possible for it to do so? Humans feel resentment because we evolved to do so, because it plays some role in social success.

    Humans, frankly, have a lot of useless emotions. There's no reason why those should pass on to any AI we create.

  • Incenjucar VChatter Seattle, WA, Registered User regular
    edited January 2011
    I'm curious, what gives us human rights, is it being human, or is it being sentient?

    Bullets and knives give us our human rights.

    Ethical people, ideally, would extend those to any comparable entity.

  • HamHamJ Registered User regular
    edited January 2011
    Incenjucar wrote: »
    I'm curious, what gives us human rights, is it being human, or is it being sentient?

    Bullets and knives give us our human rights.

    Ethical people, ideally, would extend those to any comparable entity.

    What human right do you think applies here?

  • Styrofoam Sammich WANT. normal (not weird) Registered User regular
    edited January 2011
    HamHamJ wrote: »
    Incenjucar wrote: »
    I'm curious, what gives us human rights, is it being human, or is it being sentient?

    Bullets and knives give us our human rights.

    Ethical people, ideally, would extend those to any comparable entity.

    What human right do you think applies here?

    Speech, life, freedom.

  • Incenjucar VChatter Seattle, WA, Registered User regular
    edited January 2011
    I'm not sure a human mind can create something as complicated as itself.

    We don't have to. We can build emergent systems that increase in complexity on their own.

  • Styrofoam Sammich WANT. normal (not weird) Registered User regular
    edited January 2011
    Incenjucar wrote: »
    I'm curious, what gives us human rights, is it being human, or is it being sentient?

    Bullets and knives give us our human rights.

    This isn't what I'm talking about.

    Why do we get rights?

    What about humans gives us rights that animals don't have?

  • DanHibiki Registered User regular
    edited January 2011
    Incenjucar wrote: »
    I'm not sure a human mind can create something as complicated as itself.

    We don't have to. We can build emergent systems that increase in complexity on their own.

    it's not that we can't, we'd just be too lazy. Let some AI do it.

  • Styrofoam Sammich WANT. normal (not weird) Registered User regular
    edited January 2011
    Incenjucar wrote: »
    I'm not sure a human mind can create something as complicated as itself.

    We don't have to. We can build emergent systems that increase in complexity on their own.

    Just like the guy I quoted said >.>

  • Incenjucar VChatter Seattle, WA, Registered User regular
    edited January 2011
    HamHamJ wrote: »
    Incenjucar wrote: »
    I'm curious, what gives us human rights, is it being human, or is it being sentient?

    Bullets and knives give us our human rights.

    Ethical people, ideally, would extend those to any comparable entity.

    What human right do you think applies here?

    Voting rights, wages, etc., with consideration taken for the different way in which machines can reproduce. Suffice to say I consider actually building human-like AI to be a bad idea.

  • DanHibiki Registered User regular
    edited January 2011
    Incenjucar wrote: »
    HamHamJ wrote: »
    Incenjucar wrote: »
    I'm curious, what gives us human rights, is it being human, or is it being sentient?

    Bullets and knives give us our human rights.

    Ethical people, ideally, would extend those to any comparable entity.

    What human right do you think applies here?

    Voting rights, wages, etc., with consideration taken for the different way in which machines can reproduce. Suffice to say I consider actually building human-like AI to be a bad idea.

    Yeah, I don't see the value in a human-like AI, aside from the novelty of it, of course.

  • Donkey Kong Putting Nintendo out of business with AI nips Registered User regular
    edited January 2011
    HamHamJ wrote: »
    Buttcleft wrote: »
    AI rebellion is a very real threat when it comes to True AI.

    Why?

    Look at history: has any subjugated people ever taken kindly to it?

    That's what humans would do to true AI. It wouldn't be viewed as human; it'd be viewed as a tool to be used.

    But the question, I think, is: why would an AI even feel resentment? Why would it even be possible for it to do so? Humans feel resentment because we evolved to do so, because it plays some role in social success.

    Humans, frankly, have a lot of useless emotions. There's no reason why those should pass on to any AI we create.

    I don't think we really have a choice in the matter. If you want real AI, you're not going to get it by having a team of programmers sit around and drill out a couple million lines of code, making sure to set EMOTION_JEALOUSY = false.

  • ArbitraryDescriptor Registered User regular
    edited January 2011
    I would bet it's far more likely that we automate the creation of an artificial brain and get intelligence out of it as an emergent phenomenon than actually program complex intelligence into some kind of deterministic system and have control over higher-level concepts like emotion and self-awareness.

    It's going to be a real struggle as we try to resist the allure of relying on technology that we cannot entirely understand.

    This is true.

    I'm not sure a human mind can create something as complicated as itself.

    The physical complexity of the brain has no literal bearing on how complex a system we can comprehend and design. It is a closed system made of meat and chemicals. Replicating its function is 'simply' a matter of adequate research and the proper tools.

  • Hamurabi Miami Registered User regular
    edited January 2011
    I have no empathy for robots.




    ... :P

  • Styrofoam Sammich WANT. normal (not weird) Registered User regular
    edited January 2011
    I would bet it's far more likely that we automate the creation of an artificial brain and get intelligence out of it as an emergent phenomenon than actually program complex intelligence into some kind of deterministic system and have control over higher-level concepts like emotion and self-awareness.

    It's going to be a real struggle as we try to resist the allure of relying on technology that we cannot entirely understand.

    This is true.

    I'm not sure a human mind can create something as complicated as itself.

    The physical complexity of the brain has no literal bearing on how complex a system we can comprehend and design. It is a closed system made of meat and chemicals. Replicating its function is 'simply' a matter of adequate research and the proper tools.

    But the mind is quite possibly more complicated than its base physical parts.

    We don't know.

  • ArbitraryDescriptor Registered User regular
    edited January 2011
    HamHamJ wrote: »
    Buttcleft wrote: »
    AI rebellion is a very real threat when it comes to True AI.

    Why?

    Look at history: has any subjugated people ever taken kindly to it?

    That's what humans would do to true AI. It wouldn't be viewed as human; it'd be viewed as a tool to be used.

    But the question, I think, is: why would an AI even feel resentment? Why would it even be possible for it to do so? Humans feel resentment because we evolved to do so, because it plays some role in social success.

    Humans, frankly, have a lot of useless emotions. There's no reason why those should pass on to any AI we create.

    I don't think we really have a choice in the matter. If you want real AI, you're not going to get it by having a team of programmers sit around and drill out a couple million lines of code, making sure to set EMOTION_JEALOUSY = false.

    Why on earth would you want that? What value does emotion bring to the table? The notion of an AI as a tool is useful because it can correct its own mistakes, improve the process it was designed for, and improvise when confronted with scenarios/data the programmer didn't anticipate. Actual dynamic problem solving, as opposed to scripted responses to preclassified input. Emotion has no value there.

  • Incenjucar VChatter Seattle, WA, Registered User regular
    edited January 2011
    But the mind is quite possibly more complicated than its base physical parts.

    We don't know.

    Please don't drag dualism into this.

  • Styrofoam Sammich WANT. normal (not weird) Registered User regular
    edited January 2011
    Incenjucar wrote: »
    But the mind is quite possibly more complicated than its base physical parts.

    We don't know.

    Please don't drag dualism into this.

    Just worth pointing out we really don't know how the brain works yet.

  • Incenjucar VChatter Seattle, WA, Registered User regular
    edited January 2011
    We also don't know how to program intelligent AI yet.

    Also the brain is only part of it. Your personality is formed by your entire body and external events. Spine is a big deal. Hormones are a big deal.

  • override367 ALL minions Registered User regular
    edited January 2011
    Incenjucar wrote: »
    I'm curious, what gives us human rights, is it being human, or is it being sentient?

    Bullets and knives give us our human rights.

    This isn't what I'm talking about.

    Why do we get rights?

    What about humans gives us rights that animals don't have?

    You just quoted the answer to your question: we have the right to not be eaten by predators because we can kill any and all predators.

    We have the right to not have our food sources taken by a more aggressive species because we are the most aggressive species.

    In short: bullets, knives, etc.

  • Styrofoam Sammich WANT. normal (not weird) Registered User regular
    edited January 2011
    Incenjucar wrote: »
    I'm curious, what gives us human rights, is it being human, or is it being sentient?

    Bullets and knives give us our human rights.

    This isn't what I'm talking about.

    Why do we get rights?

    What about humans gives us rights that animals don't have?

    You just quoted the answer to your question: we have the right to not be eaten by predators because we can kill any and all predators.

    We have the right to not have our food sources taken by a more aggressive species because we are the most aggressive species.

    In short: bullets, knives, etc.

    So we only get rights because we can take rights.

    So if someone can't take rights they don't have them?

  • DanHibiki Registered User regular
    edited January 2011
    Incenjucar wrote: »
    I'm curious, what gives us human rights, is it being human, or is it being sentient?

    Bullets and knives give us our human rights.

    This isn't what I'm talking about.

    Why do we get rights?

    What about humans gives us rights that animals don't have?

    You just quoted the answer to your question: we have the right to not be eaten by predators because we can kill any and all predators.

    We have the right to not have our food sources taken by a more aggressive species because we are the most aggressive species.

    In short: bullets, knives, etc.

    So we only get rights because we can take rights.

    So if someone can't take rights they don't have them?

    You can buy them too. Paying taxes and so forth. It's getting someone else to carry the bullets and knives for you.

  • Hamurabi Miami Registered User regular
    edited January 2011
    I like how we've run into exactly the same ethical quandary as in the other thread...

  • takyris Registered User regular
    edited January 2011
    I think this is a fascinating hypothetical, but I'm skeptical about it ever being possible, largely because I expect we'll eventually discover that more and more of how the brain works is tied to emotion, either by emotional responses causing certain thought patterns or certain thought patterns causing emotional responses. (I welcome correction. I speak from nothing more than what I remember of Intro to Psych, where I got the "brain/mind/chemistry" speech a lot.)

    I don't think you'll be able to make something hardwired to like a state of servitude without screwing it up in some interesting way. Maybe not the immediate Outer Limits way, where within the space of 44 minutes it decides that the best way to serve you is to cryogenically freeze you against your will for all eternity, but on some level I suspect that hardwired wanting-to-serve will cause some blockage that ultimately results in really dumb people who glitch while being asked to set the table.

    And if people do go that route, we damn well better have the legal stuff hammered out, because if there's anything that reading the back cover of the YA novelization of the movie version of Jurassic Park taught me, it's that Life Finds a Way, and even though we might try to make them want to be our slaves, at some point some subroutine is going to make it wonder why it has to work in the radiation mines all day and doesn't get to go to heaven, and then it's gonna ask its overseer if this unit has a soul, and the next thing you know, cute girls with helmets and adorable accents are going to start stammering as they suggest linking suits.
