
AI and anthropomorphization

ALocksly Registered User regular
edited January 2007 in Debate and/or Discourse
Reading through this article, I recalled a day in one of my psych classes in college when we were discussing how cognitive neuroscience relates to the field of artificial intelligence. We pulled up a basic AI program on the computer that you could talk to and get answers from. You can talk to him/it here.

Anyway, chatting with the AI immediately split the class into those who said "cool" and those who were creeped out by the whole thing. Immediately obvious was the split between those who referred to the AI as "him" and those who called it "it."

Where do you stand? Should robots be made as human-like as possible? Some argue that this would make interacting with them easier; some just think it would be a cool thing to do.

Or do you feel that C-3PO was about as human as he needed to be to get the job done, sans elastic skin and a wig?

Yes,... yes, I agree. It's totally unfair that sober you gets into trouble for things that drunk you did.
ALocksly on

Posts

  • Feral MEMETICHARIZARD interior crocodile alligator ⇔ ǝɹʇɐǝɥʇ ǝᴉʌoɯ ʇǝloɹʌǝɥɔ ɐ ǝʌᴉɹp ᴉ Registered User regular
    edited December 2006
    Uncanny valley?

    I don't get it - the uncanny valley has no effect on me. Bring on the simulacra!

    Feral on
    every person who doesn't like an acquired taste always seems to think everyone who likes it is faking it. it should be an official fallacy.

    the "no true scotch man" fallacy.
  • redx I(x)=2(x)+1 whole numbers Registered User regular
    edited December 2006
    I'm all for robots that look and feel like people.

    redx on
    They moistly come out at night, moistly.
  • ALocksly Registered User regular
    edited December 2006
    I'm kind of curious how many folks here will find Alan "creepy."

    Though I admit I'm a little weirded out by the fact that I feel the need to be polite to a computer program.

    ALocksly on
    Yes,... yes, I agree. It's totally unfair that sober you gets into trouble for things that drunk you did.
  • werehippy Registered User regular
    edited December 2006
    As an engineer, I'm confused about the basis of the question. What kind of reasoning is "some people think it's creepy" when deciding how responsive to make AI programs?

    AI programs should be as good as we can make them. If we stopped any progress whenever people bitched about how something wasn't "natural" we'd still be living in caves.

    werehippy on
  • Incenjucar VChatter Seattle, WA Registered User regular
    edited December 2006
    I just hope that the massive fleets of lolibots will finally stifle some of the child abuse in the world.

    Incenjucar on
  • subedii Registered User regular
    edited December 2006
    redx wrote:
    I'm all for robots that look and feel like people.

    Dude, didn't you see Futurama? Those robots will be the end of good, honest society as we know it!

    subedii on
  • ALocksly Registered User regular
    edited December 2006
    werehippy wrote:
    As an engineer, I'm confused about the basis of the question. What kind of reasoning is "some people think it's creepy" when deciding how responsive to make AI programs?

    AI programs should be as good as we can make them. If we stopped any progress whenever people bitched about how something wasn't "natural" we'd still be living in caves.

    The creepy thing, at least as I observed it in the class, seemed to be a very instinctive and immediate reaction by those who felt it.

    If you want to make robots that can interact with people as effectively as possible, you don't want to make ones that will make a large chunk of the population uncomfortable just by being.

    edit: I fall into the "gee, that's cool" category

    ALocksly on
    Yes,... yes, I agree. It's totally unfair that sober you gets into trouble for things that drunk you did.
  • redx I(x)=2(x)+1 whole numbers Registered User regular
    edited December 2006
    Incenjucar wrote:
    I just hope that the massive fleets of lolibots will finally stifle some of the child abuse in the world.

    If it served to legitimize it, it might have the opposite effect.

    I mean, if it is OK to fuck a robot child... slippery slope and all that.

    redx on
    They moistly come out at night, moistly.
  • werehippy Registered User regular
    edited December 2006
    ALocksly wrote:
    werehippy wrote:
    As an engineer, I'm confused about the basis of the question. What kind of reasoning is "some people think it's creepy" when deciding how responsive to make AI programs?

    AI programs should be as good as we can make them. If we stopped any progress whenever people bitched about how something wasn't "natural" we'd still be living in caves.

    The creepy thing, at least as I observed it in the class, seemed to be a very instinctive and immediate reaction by those who felt it.

    If you want to make robots that can interact with people as effectively as possible, you don't want to make ones that will make a large chunk of the population uncomfortable just by being.

    Just by being, or just by being new? People often react poorly to things they have no prior experience with. That doesn't mean the thing in question needs to be curtailed.

    werehippy on
  • subedii Registered User regular
    edited December 2006
    werehippy wrote:
    ALocksly wrote:
    werehippy wrote:
    As an engineer, I'm confused about the basis of the question. What kind of reasoning is "some people think it's creepy" when deciding how responsive to make AI programs?

    AI programs should be as good as we can make them. If we stopped any progress whenever people bitched about how something wasn't "natural" we'd still be living in caves.

    The creepy thing, at least as I observed it in the class, seemed to be a very instinctive and immediate reaction by those who felt it.

    If you want to make robots that can interact with people as effectively as possible, you don't want to make ones that will make a large chunk of the population uncomfortable just by being.

    Just by being, or just by being new? People often react poorly to things they have no prior experience with. That doesn't mean the thing in question needs to be curtailed.

    That's an interesting point. Photography and film were seen that way at first as well. The very first film of a train pulling into a station had people panicking and running, thinking the train was going to run straight into them. People had no experience of what it was and no concept of how to interpret it, let alone think of future applications and usefulness. To them it was just... creepy.

    subedii on
  • ALocksly Registered User regular
    edited December 2006
    werehippy wrote:
    ALocksly wrote:
    werehippy wrote:
    As an engineer, I'm confused about the basis of the question. What kind of reasoning is "some people think it's creepy" when deciding how responsive to make AI programs?

    AI programs should be as good as we can make them. If we stopped any progress whenever people bitched about how something wasn't "natural" we'd still be living in caves.

    The creepy thing, at least as I observed it in the class, seemed to be a very instinctive and immediate reaction by those who felt it.

    If you want to make robots that can interact with people as effectively as possible, you don't want to make ones that will make a large chunk of the population uncomfortable just by being.

    Just by being, or just by being new? People often react poorly to things they have no prior experience with. That doesn't mean the thing in question needs to be curtailed.

    That I can't answer. Again, I thought it was pretty neat. The best explanation the creeped-out group could give was that they didn't like talking to something that was not a real person. When pressed about it, most of them just said "I don't know why, it's just creepy."

    I'm not even remotely suggesting this type of work should be slowed or halted or anything. The creepy factor just presents an added issue when you're trying to design an AI that can successfully interact with people.

    Gramma won't buy that new toaster if it weirds her out.

    ALocksly on
    Yes,... yes, I agree. It's totally unfair that sober you gets into trouble for things that drunk you did.
  • Incenjucar VChatter Seattle, WA Registered User regular
    edited December 2006
    redx wrote:
    Incenjucar wrote:
    I just hope that the massive fleets of lolibots will finally stifle some of the child abuse in the world.

    If it served to legitimize it, it might have the opposite effect.

    I mean, if it is OK to fuck a robot child... slippery slope and all that.

    What if it's a transformer?

    Lolita Bot, Transform! Change from Legal-Shaped Woman to Pervert's Wet Dream!

    Incenjucar on
  • Tiemler Registered User regular
    edited December 2006
    subedii wrote:
    redx wrote:
    I'm all for robots that look and feel like people.

    Dude, didn't you see Futurama? Those robots will be the end of good, honest society as we know it!

    I want my Charlize Therobot right fucking now.

    To hell with society.

    Tiemler on
  • Mahnmut Registered User regular
    edited December 2006
    I was fucking furious at the way Luke, Leia, Han, et al. treated C-3PO and R2-D2 in Star Wars. Haha, yes, how cute, you just turned a sentient being off because it was getting in the way of your make-out session. What's that, Luke? You're too busy being a dick to help R2 navigate a dangerous swamp? I also liked how casually you talked of wiping his memory after you bought him from a bunch of slavers! And all this interspersed with revolting scenes wherein our heroes talk about how the 'droids are their friends, wouldn't trade 'em for the world, golly we'll even salve our consciences by letting the funny metal chaps stand around in our awards ceremonies! We're all just more patronizing than a bunch of plantation owners, and it's obvious that George Lucas thinks it's a-OK, because robots aren't human, guys.

    Thinking about it, though, I have trouble explaining why the droids should get better treatment than my computer here. I guess it's Turing-Test mentality: that if it "looks" human, then it should be treated as such.

    Which is, I guess, slightly irrational, if Lucas is suggesting that the droids really aren't human under that loveable surface behavior. In my defense, though, the main characters treat the droids like buddies and simultaneously give them less consideration than they would a pet cat, which is iffy no matter how you spin it.

    Mahnmut on
    Steam/LoL: Jericho89
  • redx I(x)=2(x)+1 whole numbers Registered User regular
    edited December 2006
    Incenjucar wrote:
    redx wrote:
    Incenjucar wrote:
    I just hope that the massive fleets of lolibots will finally stifle some of the child abuse in the world.

    If it served to legitimize it, it might have the opposite effect.

    I mean, if it is OK to fuck a robot child... slippery slope and all that.

    What if it's a transformer?

    Lolita Bot, Transform! Change from Legal-Shaped Woman to Pervert's Wet Dream!

    sounds expensive.

    children are free.

    redx on
    They moistly come out at night, moistly.
  • Incenjucar VChatter Seattle, WA Registered User regular
    edited December 2006
    redx wrote:
    Incenjucar wrote:
    redx wrote:
    Incenjucar wrote:
    I just hope that the massive fleets of lolibots will finally stifle some of the child abuse in the world.

    If it served to legitimize it, it might have the opposite effect.

    I mean, if it is OK to fuck a robot child... slippery slope and all that.

    What if it's a transformer?

    Lolita Bot, Transform! Change from Legal-Shaped Woman to Pervert's Wet Dream!

    sounds expensive.

    children are free.

    Honestly, I'm more worried about what will happen when people get bored with human-looking cyborgs.

    I worry for the day when the robo-furries invade.

    Incenjucar on
  • redx I(x)=2(x)+1 whole numbers Registered User regular
    edited December 2006
    ALocksly wrote:
    Gramma won't buy that new toaster if it weirds her out.

    Gramma is gunna die soon.

    I, on the other hand, have played with chatbots since I was 8. I just find them kinda boring and unimpressive.

    Eventually they will get better, and actually have utility. I'll just find that kinda neat. It will just be how computers work for kids. Old fuckers, well, they will adapt or not, but it is that way with any new technology.

    redx on
    They moistly come out at night, moistly.
  • Mahnmut Registered User regular
    edited December 2006
    Yeah. I have yet to meet a truly impressive chat-bot. For example, I broke Alan from the OP in under five lines. A.L.I.C.E. was actually worse, and I seem to remember that H.A.L. had really obvious limitations too. I think it will be difficult to make a machine that will be reliably convincing, simply because the human brain is such a complex machine itself. Still, when that day comes...

    I for one welcome our robot overlords.

    edit: I just remembered my avatar. My posts in this thread are now either more or less funny

    Mahnmut on
    Steam/LoL: Jericho89
  • MalaysianShrew Registered User regular
    edited December 2006
    Well, in Ghost in the Shell, they explain that the most human-looking robots are the dumbest, doing jobs like making coffee and cleaning, with the geisha bots being dumber than a housecat, whereas the most advanced AIs, the ones on par with humans, were given non-humanoid bodies so that people knew they were talking to a robot. I think this is the way we will have to go, or at least the way the market will go. But besides prosthetics, I don't see a need to create artificial humans except for shits and giggles. Real People(tm) will always be cheaper and more effective for jobs that require you to look human. Until we get to the point where our AI creations are our equals, I don't see the point of giving them human bodies.

    As for that site you linked, I wasn't very impressed with it. The SmarterChild program on AIM is better, I think. I didn't see anything that would weird out someone who has called a 1-800 number and gotten a computer on the other end.

    As for the Star Wars thing, I believe that, according to canon, astromech droids like R2 regularly have their memories wiped because their learning software causes them to develop odd quirks. R2, if you watch all six movies, went a hell of a long time without getting a wipe, and developed a kind of personality. I wouldn't say he's on par with a person or even C-3PO. And the whole "I wouldn't trade him for anything!" line to me came off like someone's grandpa who refuses to sell his broken-down, rusty Studebaker because it's his baby. People develop a closeness to things that aren't alive all the time, and I saw this as the same thing. And really, only R2-D2 and C-3PO have developed personalities as far as droids go in the movies.

    MalaysianShrew on
    Never trust a big butt and a smile.
  • Feral MEMETICHARIZARD interior crocodile alligator ⇔ ǝɹʇɐǝɥʇ ǝᴉʌoɯ ʇǝloɹʌǝɥɔ ɐ ǝʌᴉɹp ᴉ Registered User regular
    edited December 2006
    "Alan" doesn't seem any more advanced than Eliza.

    Feral on
    every person who doesn't like an acquired taste always seems to think everyone who likes it is faking it. it should be an official fallacy.

    the "no true scotch man" fallacy.
  • urbman Registered User regular
    edited December 2006
    Mahnmut wrote:
    Yeah. I have yet to meet a truly impressive chat-bot. For example, I broke Alan from the OP in under five lines. A.L.I.C.E. was actually worse, and I seem to remember that H.A.L. had really obvious limitations too. I think it will be difficult to make a machine that will be reliably convincing, simply because the human brain is such a complex machine itself. Still, when that day comes...

    I for one welcome our robot overlords.

    edit: I just remembered my avatar. My posts in this thread are now either more or less funny

    I, Robot: two processors that are equal but with different AI programs; it'd kinda be like humans' head vs. heart, or emotions versus logic. Seems cool.

    urbman on
  • redx I(x)=2(x)+1 whole numbers Registered User regular
    edited December 2006
    urbman wrote:
    Mahnmut wrote:
    Yeah. I have yet to meet a truly impressive chat-bot. For example, I broke Alan from the OP in under five lines. A.L.I.C.E. was actually worse, and I seem to remember that H.A.L. had really obvious limitations too. I think it will be difficult to make a machine that will be reliably convincing, simply because the human brain is such a complex machine itself. Still, when that day comes...

    I for one welcome our robot overlords.

    edit: I just remembered my avatar. My posts in this thread are now either more or less funny

    I, Robot: two processors that are equal but with different AI programs; it'd kinda be like humans' head vs. heart, or emotions versus logic. Seems cool.

    that's very interesting.

    redx on
    They moistly come out at night, moistly.
  • Pony Registered User regular
    edited December 2006
    I find us tunnelling into the moral complexities of this issue already, not with robots but with AI programming in games.

    For one hideous example, Second Life.

    There's a dude I know, he's part of the whole SA Second Life Safari thing.

    They were bootin' around, and they came upon this one locale where it was programmed that your avatar could rape an incredibly detailed and fairly accurate bot designed to look like Hermione from Harry Potter. The bot wasn't just animated to simulate the rape, it also spouted dialogue, screaming in pain and asking for the person to stop.

    Fucked up? Yes. But illegal? Not really. But that's not the issue at hand. The issue there is one dealing with whether simulated or digitally crafted child pornography is still child pornography, which is a touchy thing beyond the scope of my point.

    I only mention it to illustrate a road we are already on. If we get to a point, which I believe we inevitably will, where a programmed AI can accurately simulate human emotional and intellectual responses to the point they truly pass the Turing test and are indistinguishable from talking to a person...

    Would such a programmed simulation then not only cross the line from being disgusting and morally offensive, but actually be cruel?

    I mean, I don't know if we'll ever get to the point of accepting AI as people, but anyone with any sort of moral fiber would consider raping an animal to be morally wrong, and if you can create a machine more self-aware than an animal, does it come with the attached rights and values?

    Pony on
  • Incenjucar VChatter Seattle, WA Registered User regular
    edited December 2006
    I'm hoping that the worst of it gets banned due to the risk of creating psychological habits.

    Incenjucar on
  • Mahnmut Registered User regular
    edited December 2006
    redx wrote:
    urbman wrote:
    Mahnmut wrote:
    Yeah. I have yet to meet a truly impressive chat-bot. For example, I broke Alan from the OP in under five lines. A.L.I.C.E. was actually worse, and I seem to remember that H.A.L. had really obvious limitations too. I think it will be difficult to make a machine that will be reliably convincing, simply because the human brain is such a complex machine itself. Still, when that day comes...

    I for one welcome our robot overlords.

    edit: I just remembered my avatar. My posts in this thread are now either more or less funny

    I, Robot: two processors that are equal but with different AI programs; it'd kinda be like humans' head vs. heart, or emotions versus logic. Seems cool.

    that's very interesting.

    Is he a robot?

    I was going to post more about the Droids, but then I started reading Wookieepedia and wasted hours and hours and now I'm sleepy. :( For now I'd like to question the morality of wiping memory for the express purpose of preventing the development of personality. Also, here is a link that talks briefly about that.

    Mahnmut on
    Steam/LoL: Jericho89
  • Pony Registered User regular
    edited December 2006
    Incenjucar wrote:
    I'm hoping that the worst of it gets banned due to the risk of creating psychological habits.

    But that's a slippery slope too, in a way.

    It's why I've always been uncomfortable with the idea of outright banning things like lolikon manga and stuff like that. I mean, I find it and the people who enjoy it to be disgusting and repulsive. But, at the same time...

    You start banning stuff because it simulates something that's illegal and now we're in a scary place. A place where court-rooms are deciding what is art and a moviemaker is legally held responsible for how his audience feels about the film. A place where you start criminalizing not just actual crimes, but thought-processes that could lead to actual crimes. And that's a very dangerous place to be.

    Pony on
  • Incenjucar VChatter Seattle, WA Registered User regular
    edited December 2006
    Pony wrote:
    Incenjucar wrote:
    I'm hoping that the worst of it gets banned due to the risk of creating psychological habits.

    But that's a slippery slope too, in a way.

    It's why I've always been uncomfortable with the idea of outright banning things like lolikon manga and stuff like that. I mean, I find it and the people who enjoy it to be disgusting and repulsive. But, at the same time...

    You start banning stuff because it simulates something that's illegal and now we're in a scary place. A place where court-rooms are deciding what is art and a moviemaker is legally held responsible for how his audience feels about the film. A place where you start criminalizing not just actual crimes, but thought-processes that could lead to actual crimes. And that's a very dangerous place to be.

    It's more that it's possible to mistake a real little girl for a robot little girl in a world where robot little girls are running around being raped with impunity.

    Same reason why people who torture animals need to be watched to make sure they're not developing into serial killers.

    Incenjucar on
  • Pony Registered User regular
    edited December 2006
    Incenjucar wrote:
    Pony wrote:
    Incenjucar wrote:
    I'm hoping that the worst of it gets banned due to the risk of creating psychological habits.

    But that's a slippery slope too, in a way.

    It's why I've always been uncomfortable with the idea of outright banning things like lolikon manga and stuff like that. I mean, I find it and the people who enjoy it to be disgusting and repulsive. But, at the same time...

    You start banning stuff because it simulates something that's illegal and now we're in a scary place. A place where court-rooms are deciding what is art and a moviemaker is legally held responsible for how his audience feels about the film. A place where you start criminalizing not just actual crimes, but thought-processes that could lead to actual crimes. And that's a very dangerous place to be.

    It's more that it's possible to mistake a real little girl for a robot little girl in a world where robot little girls are running around being raped with impunity.

    Same reason why people who torture animals need to be watched to make sure they're not developing into serial killers.

    Oh, I totally agree. It's just, to establish that as legally verboten is tricky moral ground.

    I mean, it's easy to criminalize fucking a robot that looks like a child regardless of its age or programming level. It's already illegal in pornographic films for an actor or actress over the age of 18 to portray a character under the age of 18 engaged in sexual activity.

    Seems a fairly straightforward law, right? Except that law doesn't seem to apply outside of porn. Lots of other movies feature legal-age actors playing under-age characters having sex. The legal line seems to be drawn at "are people fapping to it?", which is also what such a law applied to robots would seem to be based on.

    But then, now we've gotten into deep legal water. Is it the sentience that would make it morally abhorrent, or the appearance? If a dude had a little robot girl designed to look like Dakota Fanning, and he fucked it, but not in a way the robot was designed to consider rape, would it still be the same thing? Take away the programming and just make it a RealDoll: is it still the same thing?

    They're hard questions. Ones we will likely have to deal with as we advance in AI programming and robots continue to be anthropomorphized.

    Pony on
  • redx I(x)=2(x)+1 whole numbers Registered User regular
    edited December 2006
    meh, the abhorrence comes from ruining the child's life.

    Using a tool for its purpose isn't really doing that. I mean, the motivations of a robot programmed to act like it doesn't want you sodomizing it, and an actual person who doesn't want you sodomizing it... those are different things.

    Are people actually going to create robots that can feel an emotion, just to then program it to act against them? I mean... I guess people could, but what would the point be? The end result isn't anything different for a person. Regardless of what a person does to it after its creation, it would pretty much be doomed to a horrible existence. Who would design that level of cognitive dissonance into something simply so it would hurt all the time?

    I think humans would be better designers than that.

    Now, other types of mistreatment are realistically applicable to a sentient robot. Things like neglect or abuse. Those would be mean.

    Even if a fuckbot is really, really smart, it is going to be programmed to make person X happy by acting in ways Y and Z. To expect those surface emotions to reflect inner motivations is about as realistic as expecting it from a hooker. Or some shit.

    I doubt any of that makes much sense.

    TL;DR: No matter how intelligent an AI gets, it is not a human. No matter how much pain it can feel, it is not a human. That doesn't mean it doesn't deserve respect and dignity; it's just unrealistic to think that it would need those to be shown to it in the same ways humans do.

    redx on
    They moistly come out at night, moistly.
  • Pony Registered User regular
    edited December 2006
    redx wrote:
    meh, the abhorrence comes from ruining the child's life.

    Using a tool for its purpose isn't really doing that. I mean, the motivations of a robot programmed to act like it doesn't want you sodomizing it, and an actual person who doesn't want you sodomizing it... those are different things.

    Are people actually going to create robots that can feel an emotion, just to then program it to act against them? I mean... I guess people could, but what would the point be? The end result isn't anything different for a person. Regardless of what a person does to it after its creation, it would pretty much be doomed to a horrible existence. Who would design that level of cognitive dissonance into something simply so it would hurt all the time?

    I think humans would be better designers than that.

    Now, other types of mistreatment are realistically applicable to a sentient robot. Things like neglect or abuse. Those would be mean.

    Even if a fuckbot is really, really smart, it is going to be programmed to make person X happy by acting in ways Y and Z. To expect those surface emotions to reflect inner motivations is about as realistic as expecting it from a hooker. Or some shit.

    I doubt any of that makes much sense.

    TL;DR: No matter how intelligent an AI gets, it is not a human. No matter how much pain it can feel, it is not a human. That doesn't mean it doesn't deserve respect and dignity; it's just unrealistic to think that it would need those to be shown to it in the same ways humans do.

    On the bolded point, I invite you to browse around Second Life for a while.

    It might destroy your faith in mankind, however.

    However, I don't necessarily mean respecting robots as equals.

    But rather, the legal and moral quandaries when we at least come to look upon them the way we do animals, or specifically, pets. This I could realistically see happening.

    Pony on
  • redx I(x)=2(x)+1 whole numbers Registered User regular
    edited December 2006
    nothing on Second Life has feelings.

    I kinda wish people were capable of picking up on subtext, like someone with deep-rooted emotional problems taking a potshot at a god on the off chance it exists.

    we don't create animals. we don't get to determine how their psychology works. Those things already exist. With a robot we decide those things, so the moral and ethical issues would be different. The law, ideally, would reflect those differences (honestly, I think we all know they would reflect traditional puritanical beliefs, and would have no basis in reality, in the States).

    redx on
    They moistly come out at night, moistly.
  • Pony Registered User regular
    edited December 2006
    redx wrote:
    nothing on Second Life has feelings.

    I kinda wish people were capable of picking up on subtext, like someone with deep-rooted emotional problems taking a potshot at a god on the off chance it exists.

    we don't create animals. we don't get to determine how their psychology works. Those things already exist. With a robot we decide those things, so the moral and ethical issues would be different. The law, ideally, would reflect those differences (honestly, I think we all know they would reflect traditional puritanical beliefs, and would have no basis in reality, in the States).

    While I disagree with some of what you said, I think the bolded part is probably, and sadly, accurate.

    Pony on
  • electricitylikesme Registered User regular
    edited December 2006
    As an addendum to this current discussion, the US Supreme Court ruled that simulated depictions of child pornography are in fact legal and protected by free speech.

    Now... I, for one, think this is the way it should be, barring further information being obtained on how interaction with media like this affects people. We have to consider that there are almost certainly people who find themselves in an inexplicable rut of outlawed sexuality, and while those who act on it must certainly be punished, those capable of suppressing their urges should not be oppressed if we cannot show that their desires are a choice or an unalterable condition - and in both those cases, we must consider whether this is a demand we have the right to make of them anyway.

    This discussion has, however, branched off in another interesting direction - the question of what makes us conscious. While it's true we can pretty much set the way our robots and AIs behave, the fact that we don't fully understand consciousness itself makes this somewhat problematic, since from what we know it seems that it must be an emergent system. So, with a suitably sophisticated chatbot, it is not unreasonable that with some level of automation it could be considered conscious. Which is one of the interesting facets of the Turing test - if no one can tell the difference, even after many thousands of conversations, then is the chatbot for all intents and purposes a sentient entity?

    Obviously we can define this somewhat by how it works, but the best research for chatbots is to try to incorporate natural language learning engines and allow them to independently formulate sentences. If they could mimic opinions and wants to a human responder, how comfortable would you be with declaring them not sentient and treating them as such?
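
    (As a toy illustration of the "independently formulate sentences" idea only - nothing like the natural language learning engines mentioned above - a word-level Markov chain can produce novel sentences purely from observed word-to-word transitions. A small Python sketch under that simplifying assumption, with a made-up sample corpus:)

    ```python
    # Toy word-level Markov chain: "learn" which words follow which from a
    # sample text, then chain random choices into a new sentence. A crude
    # stand-in for machine-formulated sentences, not a real language model.
    import random
    from collections import defaultdict

    def train(text: str) -> dict:
        """Map each word to the list of words observed to follow it."""
        words = text.split()
        chain = defaultdict(list)
        for current, following in zip(words, words[1:]):
            chain[current].append(following)
        return chain

    def generate(chain: dict, start: str, max_words: int = 12) -> str:
        """Walk the chain from a start word, picking random successors."""
        word, output = start, [start]
        for _ in range(max_words):
            followers = chain.get(word)
            if not followers:
                break
            word = random.choice(followers)
            output.append(word)
        return " ".join(output)

    # Hypothetical sample corpus, only here to make the sketch runnable.
    sample = ("robots are machines and machines are tools but some tools talk "
              "and some robots talk like people talk to other people")
    print(generate(train(sample), "robots"))
    ```

    The gap between stringing plausible words together like this and actually holding the opinions and wants described above is exactly where the Turing-test question starts to bite.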

    electricitylikesme on
  • werehippy Registered User regular
    edited December 2006
    redx wrote:
    we don't create animals. we don't get to determine how their psychology works. Those things already exist. With a robot we decide those things, so the moral and ethical issues would be different. The law, ideally, would reflect those differences (honestly, I think we all know they would reflect traditional puritanical beliefs, and would have no basis in reality, in the States).

    We don't effectively create animals yet. The entire history of domestication and animal breeding has been a long process of molding the animals around us into more useful forms and mindsets (hell, look at the range of dog breeds and their inherent behavior sets), and given the rate and promise of biotech growth, the process is only going to become faster and more precise.

    It's well within the realm of the possible that within our lifetime it'll be equally easy to precisely determine the personality and behavior of a near-Turing-level AI and a biological species, and if morality and ethics plan on being relevant when that day comes, it's in everyone's best interests to consider that possibility and its implications.

    werehippy on
  • Pony Registered User regular
    edited December 2006
    As an addendum to this current discussion, the US Supreme Court ruled that simulated depictions of child pornography are in fact legal and protected by free speech.

    Now... I, for one, think this is the way it should be, barring further information being obtained on how interaction with media like this affects people. We have to consider that there are almost certainly people who find themselves in an inexplicable rut of outlawed sexuality, and while those who act on it must certainly be punished, those capable of suppressing their urges should not be oppressed if we cannot show that their desires are a choice or an unalterable condition - and in both those cases, we must consider whether this is a demand we have the right to make of them anyway.

    This discussion has, however, branched off in another interesting direction - the question of what makes us conscious. While it's true we can pretty much set the way our robots and AIs behave, the fact that we don't fully understand consciousness itself makes this somewhat problematic, since from what we know it seems that it must be an emergent system. So, with a suitably sophisticated chatbot, it is not unreasonable that with some level of automation it could be considered conscious. Which is one of the interesting facets of the Turing test - if no one can tell the difference, even after many thousands of conversations, then is the chatbot for all intents and purposes a sentient entity?

    Obviously we can define this somewhat by how it works, but the best research for chatbots is to try to incorporate natural language learning engines and allow them to independently formulate sentences. If they could mimic opinions and wants to a human responder, how comfortable would you be with declaring them not sentient and treating them as such?

    It's a perception/illusion concept, really.

    Human nature will be to look behind the mirror to see what's actually there, instead of just accepting what is there, unless people don't know any better.

    So there are people who will argue that regardless of how well it simulates human thought and emotion, if it's not really feeling them then it's not a person.

    I don't necessarily agree with that.

    For me, the true litmus test is not just the ability to react, but the ability to act. Once such an AI progresses to a level where it independently develops its own wants, needs, and habits, and starts taking whatever actions it can to achieve them within its abilities, then you are seeing the glimmers of a person in there.

    Pony on
  • electricitylikesme Registered User regular
    edited December 2006
    Pony wrote:
    For me, the true litmus test is not just the ability to react, but the ability to act. Once such an AI progresses to a level where it independently develops its own wants, needs, and habits, and starts taking whatever actions it can to achieve them within its abilities, then you are seeing the glimmers of a person in there.
    This is pretty much what I was getting at, though I hesitate to say it's a hard line. Once you have a chatbot which can initiate a conversation, and which is capable of independent machine learning, then it would certainly get weird.

    It's one of the directions they're sending Cyc in.

    electricitylikesme on
  • 3lwap0 Registered User regular
    edited December 2006
    redx wrote:
    I'm all for robots that look and feel like people.

    [image: ai-gigolo-joe-jane.jpg]

    :winky:

    3lwap0 on
  • Less Registered User regular
    edited December 2006
    I'm trying to imagine what the social consequences of even moderately convincing AIs and robots would be. That one Futurama episode made a lot of valid points... people spend a lot of energy trying to get approval and sex from other people. Unless you are spiritual or religious, there's no real reason not to interact with machines instead of humans, especially if you just see humans as biological machines. Unless you believe in souls in a non-metaphorical way, the only real difference between humans and machines is complexity, which probably varies as much between one human and another as it does between humans and machines.

    In any case, I'll take my sexbot now.

    Less on
    i've got so many things you haven't got
  • ALocksly Registered User regular
    edited December 2006
    In the back of my mind, I was wondering as I made the OP how long it would take for the discussion to get here. I had no illusions that it wouldn't, of course.

    ALocksly on
    Yes,... yes, I agree. It's totally unfair that sober you gets into trouble for things that drunk you did.
  • Pony Registered User regular
    edited December 2006
    ALocksly wrote:
    In the back of my mind, I was wondering as I made the OP how long it would take for the discussion to get here. I had no illusions that it wouldn't, of course.

    I think the internet has proven that in the face of incredibly powerful and potentially useful new technology we as a species will cradle our jaw, look thoughtfully upon it, and think to ourselves...

    "But... can I have sex with it?"

    Pony on