
Robots, AI, and how we treat them

1235 Posts

  • Pony Registered User regular
    edited January 2010
    Lanz wrote: »
    Also I forget: Pony, does the "sexbot," for lack of a less crude term, have any level of cognizance, even comparable to animals? or is it still just a simulation?

    For the purposes of the discussion, we're talking about pure simulations here, not even something with the cognizance level of the most basic animals.

    Pony on
  • Kamar Registered User regular
    edited January 2010
    So you're not saying the people using these dolls will eventually become real rapists, just that they won't care about real rape?

    I can say with some degree of confidence that someone with a harmful sexual deviancy who refuses to indulge themselves probably has an equal or higher level of intolerance for the harm it inflicts than the average joe.

    Kamar on
  • Arch Neat-o, mosquito! Registered User regular
    edited January 2010
    Kamar wrote: »
    I honestly can't wrap my head around what you're trying to argue here, Arch. An object doesn't feel, doesn't suffer, doesn't care. It can't be harmed in any meaningful way, not even the way animals can. The robots in question would unquestionably be robots; they would weakly resist, but only because that resistance was what they were programmed for.

    Not because they don't want to be penetrated. They don't care; they can't!

    Even if we got to higher level machines, we'd probably not use them for this kind of thing, and if we did, we'd program them to enjoy it, despite the token physical resistance they put up; it'd be roleplay the same as humans do with each other now.

    Urg, yeah, let me be a bit more clear. At what point is it moral or ethical to damage ANY object simply to satisfy a personal desire? Yes, the toaster cannot suffer or even feel pain, but needlessly destroying a toaster to fulfill a "destructive streak" is just a little bit unethical. Likewise, if we increase the capacity of an object to respond to our actions (e.g., robots), then the level of how unethical this is rises.

    If you don't agree that it is inherently a little bit wrong to destroy something just to satisfy yourself then my arguments will fall on deaf ears.

    Arch on
  • DasUberEdward Registered User regular
    edited January 2010
    Pony wrote: »
    _J_ wrote: »
    Dyscord wrote: »
    We're talking about something that is, for all intents and purposes, a really complicated toaster.

    Exactly. Toasters do not will. So, toasters cannot be raped. And, unless we are behaviorists, a toaster which is programmed to scream will not elicit an empathetic response from a human. This is because, again, toasters do not experience; toasters lack qualia; there is nothing which it is to be a toaster.

    To claim that there is a commonality between a human and a toaster is to be confused.

    And to react to horrified screaming from an accurate human facsimile subjected to something that (if done to humans) would constitute torture is a normal empathetic reaction to perceived suffering.

    If you can watch a robot go through all the motions of being in agony on a level that is a clear emulation of human behavior, if it looks and acts human, and you don't feel even the slightest twinge of instinctive response to that sensory input I think you're a little psychologically busted.

    There are images, sounds, and fictional depictions of horrific acts that I not only do not enjoy watching, but actually find difficult even to tolerate. This gradient of tolerance and enjoyment is hardly objective, and is different for everyone.

    However, to have no gradient at all, I think, is probably indicative of some disturbing psychological behavior.

    The more accurate a simulation of a real thing becomes, the more visceral and real the emotional responses from people are. This is a fact. Even if a person maintains continuous knowledge that it is a simulation, the closer the simulation gets to fully engaging in sensory input identical to the real act, the more emotion overrides.

    You might find yourself thinking "so what? It's just an emotional response," but emotional responses are important. They do, in fact, make up a part of what compels people to act ethically towards each other. If you blunt a person's emotional responses and empathy for others, you blunt them as a human being.

    You may not recognize the difference between a dog and a toaster, but that's on you, quite frankly. Other people don't equate damage inflicted upon the two. If I sit there and cut into a toaster with a saw, it doesn't provoke an emotional response from psychologically healthy people. If I do the same thing to a living and conscious puppy, it does provoke an emotional response from people who don't have diagnosable psychological disorders.

    Now, a robot that exhibits the outward reactions of a human isn't a dog, but it isn't quite a toaster either. That's what I'm getting at and what you and some people seem to be missing.

    There is, in fact, a meaningful difference that should be acknowledged. A dog isn't a toaster. A Sim isn't a dog. A robot isn't a Sim. None of these things are human, and none of these things are "people", but there are varying degrees to which intentionally damaging or harming them provokes an emotional response from people. Those degrees are quite different.

    If you can stand there, and carve into a howling dog with a reciprocating saw, and be amused or even aroused by the experience, then I think that's actually diagnostic criteria for a psychological illness.

    We certainly make the conscious distinction because it's a living, biological organism (which a toaster or a robot certainly isn't) but that's not the entirety of the thought process behind why we consider torturing animals for amusement to be of questionable ethics.

    People have emotional responses to things they consciously know are simulated because that is how the brain operates. On an emotional level, it doesn't always make that distinction because the sensory input is so identical.

    The more identical the sensory input, the more pronounced the emotional response. There is absolutely nothing wrong with that, and I think that's psychologically healthy.

    However, it is possible to blunt that response and to desensitize oneself to varying levels of simulated input via exposure. A person's instinctive, natural responses to a simulated experience can lose its emotional quality as the person's logical knowledge that the experience "isn't real" asserts itself over emotional response.

    That exposure can have many, many different levels, and the desensitizing process works best when it starts small and as remotely removed from an accurate depiction as possible, as you slowly ramp up the realism and allow the person's logical mind to assert itself over an emotional reaction.

    This would be a good thing... if it only applied to simulations. But the reality is, desensitizing people to having an emotional response to the real thing is something that can be done via simulation on some level. Not to the level that histrionic demagogues like Jack Thompson would have you believe, but certainly on a level that is measurable.

    It's my opinion that robotic simulation, physically emulating not only the human form but also human movement and emotional responses, in a way that can be fully interacted with in the context of the act being performed... that's several steps removed from a toaster, or a game of Grand Theft Auto.

    That level of simulation, that level of sensory immersion into a facsimile of acts that would be abominable to do to people, is not something that I feel is psychologically healthy. I feel it's a line that, perhaps, people are better off for not crossing.

    The fact that we seem to be approaching that line and are likely to see it crossed in our lifetime troubles me.

    I think, to some extent with a large swath of people, we are already so media-saturated with fictional depictions of human suffering, or clinically detached from real suffering via a television screen, that we've lost a great deal of human empathy and emotional response that we'd have been better off as a society to maintain.

    If you can flip on your TV, and see one of those manipulative African charity ads featuring starving children with flies buzzing around them, and you feel nothing, I think you've lost something along the way that perhaps you shouldn't have lost.


    Empathy is an absolute prerequisite to functioning on a basic level in human society. But oddly enough, as we advance, the need for basic empathy seems to decline. I can't say if it's proportionate to anything, but if a person in a small tribe kills someone without reason or remorse, it will go noticed and it will be punished. In that light, empathy can almost be seen as a defense mechanism, by which we are unwilling to allow others to experience things that would be personally detrimental to our own being.

    Keeping that in mind, it seems that the more impersonal the event is, the easier it is for us to isolate it from our consciousness and more or less "turn it off". I think that is because there's little personal consequence, and unfortunately our empathy may be rooted in self-defense.

    DasUberEdward on
  • Eat it You Nasty Pig. tell homeland security 'we are the bomb' Registered User regular
    edited January 2010
    Arch wrote: »
    Kamar wrote: »
    I honestly can't wrap my head around what you're trying to argue here, Arch. An object doesn't feel, doesn't suffer, doesn't care. It can't be harmed in any meaningful way, not even the way animals can. The robots in question would unquestionably be robots; they would weakly resist, but only because that resistance was what they were programmed for.

    Not because they don't want to be penetrated. They don't care; they can't!

    Even if we got to higher level machines, we'd probably not use them for this kind of thing, and if we did, we'd program them to enjoy it, despite the token physical resistance they put up; it'd be roleplay the same as humans do with each other now.

    Urg, yeah, let me be a bit more clear. At what point is it moral or ethical to damage ANY object simply to satisfy a personal desire? Yes, the toaster cannot suffer or even feel pain, but needlessly destroying a toaster to fulfill a "destructive streak" is just a little bit unethical. Likewise, if we increase the capacity of an object to respond to our actions (e.g., robots), then the level of how unethical this is rises.

    If you don't agree that it is inherently a little bit wrong to destroy something just to satisfy yourself then my arguments will fall on deaf ears.

    What about an object whose purpose is to be destroyed?

    Eat it You Nasty Pig. on
    it was the smallest on the list but
    Pluto was a planet and I'll never forget
  • Kamar Registered User regular
    edited January 2010
    I don't see how it's unethical except for perhaps being wasteful. I mean, I stop myself from breaking things when I lose my temper because it would be wasteful and I don't want to pay to fix or replace them, not because it would be immoral.

    Of course, it'd be really fucking expensive if this act was breaking your sex doll, too. I don't see how that is implied in the situation; a doll designed and programmed for this kind of thing would probably be pretty durable.

    Kamar on
  • Arch Neat-o, mosquito! Registered User regular
    edited January 2010
    Kamar wrote: »
    I don't see how it's unethical except for perhaps being wasteful. I mean, I stop myself from breaking things when I lose my temper because it's wasteful, not because it's immoral.

    Of course, it'd be really fucking expensive if this act was breaking your sex doll, too. I don't see how that is implied in the situation; a doll designed and programmed for this kind of thing would probably be pretty durable.

    And being wasteful is not being immoral?

    I fear my points are getting slightly off topic

    EDIT: as to the "what if it is INTENDED to be destroyed" line- I will think about this

    Arch on
  • Kamar Registered User regular
    edited January 2010
    Only if there are consequences, which in the case of these sex dolls there wouldn't be; the act may be destructive but would not destroy/waste the doll.

    Kamar on
  • Gammarah Registered User regular
    edited January 2010
    Didn't read any of this thread but the OP. I agree with everything that Pony said, but I'm not sure that robots would ever be produced or marketed to be able to fully reproduce the Hostel eye-removing scenario, not for a long time anyway. Even then, buying one of these robots would be an extremely expensive endeavor, and I'm not sure that a successful industry could ever be founded upon such a thing. So yes, it could be a problem, but I doubt that something like this could permeate mainstream society for a long time. People just aren't ready to deal with this shit.

    Gammarah on
  • _J_ Pedant Registered User, __BANNED USERS regular
    edited January 2010
    Arch wrote: »
    I can assure you that a watermelon most definitely "does not wish to be raped" if you take "rape" to mean "forced sexual encounter".

    Watermelons have intention / desire? How do you know this?

    _J_ on
  • Arch Neat-o, mosquito! Registered User regular
    edited January 2010
    _J_ wrote: »
    Arch wrote: »
    I can assure you that a watermelon most definitely "does not wish to be raped" if you take "rape" to mean "forced sexual encounter".

    Watermelons have intention / desire? How do you know this?

    If you are going to play that card, I will respond with "how do you know they DON'T?"

    Arch on
  • Lanz ...Za? Registered User regular
    edited January 2010
    Pony wrote: »
    Lanz wrote: »
    Also I forget: Pony, does the "sexbot," for lack of a less crude term, have any level of cognizance, even comparable to animals? or is it still just a simulation?

    For the purposes of the discussion, we're talking about pure simulations here, not even something with the cognizance level of the most basic animals.

    okay, thanks, just curious.

    but still: where does the actor's perceptions of the object acted upon fit in?

    Lanz on
  • Arch Neat-o, mosquito! Registered User regular
    edited January 2010
    Kamar wrote: »
    Only if there are consequences, which in the case of these sex dolls there wouldn't be; the act may be destructive but would not destroy/waste the doll.

    Maybe I just don't see any sort of destructive act as coming from a base zero level of morality.

    Arch on
  • _J_ Pedant Registered User, __BANNED USERS regular
    edited January 2010
    Pony wrote: »
    And to react to horrified screaming from an accurate human facsimile subjected to something that (if done to humans) would constitute torture is a normal empathetic reaction to perceived suffering.

    The question to ask is whether one reacts to:

    A) Screaming
    B) The understood sensory experience of pain as occurring in another agent.


    My contention is that one reacts to the understanding of "That thing experiences as I do." So, for example, if a cat yelps, that to which one reacts is not "yelp" but rather the understanding of "yelp" as indicative of pain.

    If one can know that "yelp" is not "indicative of pain", then one would not react to the yelp.

    If you contend that what one reacts to is "yelp" and not "indicative of pain", then, ok. But that seems quite odd.

    _J_ on
  • Pony Registered User regular
    edited January 2010
    Kamar wrote: »
    So you're not saying the people using these dolls will eventually become real rapists, just that they won't care about real rape?

    I can say with some degree of confidence that someone with a harmful sexual deviancy who refuses to indulge themselves probably has an equal or higher level of intolerance for the harm it inflicts than the average joe.

    I can't make any sense of your second sentence; the wording confuses the hell out of me.

    The first, however, yes that's basically what I'm saying.

    The odds of a person using a "rape-droid" becoming a real rapist are about as high as the odds for a person who already jerks it to simulated rape porn.

    Which is to say, existent but also not really high enough to be worth real social consideration.

    However, what is far more disconcerting is the way accurate simulation impacts a person's ability to emotionally respond to the existence, presence of, or depiction of those acts for real.

    Here's a good example:

    In an episode of Penn & Teller's Bullshit! about violent video games, they took a 10-year-old suburban white kid who plays a motherfuckton of first-person shooters out to a shooting range to fire an assault rifle.

    Hardly a scientific experiment, but it was used to illustrate a point: the Jack Thompsons of the world say that FPS games turn young boys into spree-killers and prepare them for how to shoot guns at people.

    This kid fired the rifle, once, put it down, didn't want to touch it again, and (in a scene shown during the credits) started crying in his mother's arms.

    This child, who has never shot a real gun before in his life, was startled and frightened by the noise, power, and over-all fucked up nature of the circumstance.

    He wasn't desensitized to actual gunfire nor did his video gaming prepare him for what it feels like to fire a real gun.

    So, you can safely say that whatever (not infinitesimal) desensitizing to simulated violence he might be receiving via video games, there's certainly a very hard cap on the transfer of that to the real deal.

    Yet... if the same child was accustomed to shooting guns at robots that bled fake blood, cried out in pain, squirmed in agony from non-fatal (but still crippling, by the robot's design) gunshot wounds... well, he's far more desensitized to not just gunfire, but how it affects people.

    Is he going to turn into a spree-killer? No, I don't think so.

    But is he now far more desensitized to real gunfire, real gunshot wounds, and actual combat than a kid who just plays Xbox? Yes, yes he very much is.

    Now, apply the same kind of circumstance to something like the sex-droids I've been talking about... you should be able to see what I mean there. I'm not saying using a rape-bot is gonna make a man a rapist... but it does damage his ability to have a negative emotional response to real rape.

    There are some people who try to make the counter-point that "Well, in order to use that sort of thing, you'd have to be desensitized to it in the first place! Normal people wouldn't!"

    Which doesn't really disprove my point; it only underlines that you agree that an empathetic human being should be having a negative emotional reaction to doing that sort of thing, even to a machine.

    But, as long as we say "It's just a machine, it doesn't matter", we're societally reinforcing the idea that the impact is irrelevant or non-existent and I think that makes it easier for people to justify the usage to themselves... which leads to the blunting and... etc.

    You see where I'm going with this, yeah?

    Pony on
  • Arch Neat-o, mosquito! Registered User regular
    edited January 2010
    _J_ wrote: »
    Pony wrote: »
    And to react to horrified screaming from an accurate human facsimile subjected to something that (if done to humans) would constitute torture is a normal empathetic reaction to perceived suffering.

    The question to ask is whether one reacts to:

    A) Screaming
    B) The understood sensory experience of pain as occurring in another agent.


    My contention is that one reacts to the understanding of "That thing experiences as I do." So, for example, if a cat yelps, that to which one reacts is not "yelp" but rather the understanding of "yelp" as indicative of pain.

    If one can know that "yelp" is not "indicative of pain", then one would not react to the yelp.

    If you contend that what one reacts to is "yelp" and not "indicative of pain", then, ok. But that seems quite odd.

    I would think conditioning suggests that we can respond to both of these things, either independently or in tandem.

    Arch on
  • Kamar Registered User regular
    edited January 2010
    I'm going to tentatively agree with _J_ here; I know when I hear canned screams in a movie or whatever I don't even think about it, but when I hear the real thing on the news or in real life it has a strong impact, even though the sensory input is identical.


    To clarify my second sentence, I mean a guy who feels the urge to rape but doesn't because he knows it is wrong probably has stronger feelings on the subject of rape being wrong than the average person.

    Kamar on
  • Pony Registered User regular
    edited January 2010
    _J_ wrote: »
    Pony wrote: »
    And to react to horrified screaming from an accurate human facsimile subjected to something that (if done to humans) would constitute torture is a normal empathetic reaction to perceived suffering.

    The question to ask is whether one reacts to:

    A) Screaming
    B) The understood sensory experience of pain as occurring in another agent.


    My contention is that one reacts to the understanding of "That thing experiences as I do." So, for example, if a cat yelps, that to which one reacts is not "yelp" but rather the understanding of "yelp" as indicative of pain.

    If one can know that "yelp" is not "indicative of pain", then one would not react to the yelp.

    If you contend that what one reacts to is "yelp" and not "indicative of pain", then, ok. But that seems quite odd.

    J, I assure you that people with a neuroatypical emotional range do behave like this.

    I know you might not, but you're also the same person who can't understand how people stay depressed when they can just... choose not to be depressed.

    So, I think it's fair to say you don't have a typical understanding of how emotions work.

    Pony on
  • Kamar Registered User regular
    edited January 2010
    So Pony, you get all worked up over screams in movies? Or do you not feel anything when you see a news reel of a terrorist attack and hear the screams?

    Kamar on
  • Arch Neat-o, mosquito! Registered User regular
    edited January 2010
    Pony wrote: »
    Kamar wrote: »
    So you're not saying the people using these dolls will eventually become real rapists, just that they won't care about real rape?

    I can say with some degree of confidence that someone with a harmful sexual deviancy who refuses to indulge themselves probably has an equal or higher level of intolerance for the harm it inflicts than the average joe.

    There are some people who try to make the counter-point that "Well, in order to use that sort of thing, you'd have to be desensitized to it in the first place! Normal people wouldn't!"

    Which doesn't really disprove my point; it only underlines that you agree that an empathetic human being should be having a negative emotional reaction to doing that sort of thing, even to a machine.

    I am pulling this out because I think it is the crux of my argument as well.

    My other point is that the phrase below, if adopted as a culture, is a fairly dangerous view
    pony wrote:
    as long as we say "It's just a machine, it doesn't matter"

    NOTE: I do not believe a Sawzall, for example, matters as much as a human, or indeed is even in the same CATEGORY. But there is a large difference between "It's just a machine, it doesn't matter as much as a human" and "It's just a machine, it doesn't matter AT ALL".

    Arch on
  • Casual Wiggle Wiggle Wiggle Flap Flap Flap Registered User regular
    edited January 2010
    Violent video games have had a strange effect on me. They seem to have given me a bloodthirsty love of fantasy violence but an utter horror of the real thing.

    It's very fun to disembowel electronic people while shouting "skulls for the skull throne!!", but for me it's just reinforced how horrible and alien it would be to translate that into real life. For me it doesn't seem like behaviour that belongs in the real world any more than jet packs and aliens.

    Pony raised the interesting point of the line being blurred by a situation of pseudo-violence that is near as makes no difference real. Bottom line, after mulling it over, I think for me the underlying knowledge that it's fake would allow me to compartmentalise the experience into my "not in real life" head space.

    But as ever, it depends on the person in question. The argument I always make to your average Jack Thompson is: "A video game won't make someone violent; it could make someone who is already violent worse, but then it is only one of a million stimuli that could set off that behavior." In other words, simulated violence doesn't make people killers; the enjoyment of the violence has to be there in the first place.

    I think the same argument holds true here. You would have to be a fairly messed-up person to buy a robot to rip its eyes out and hear it scream to start with. Any normal person who for whatever reason tried it out would probably be turned off by the experience.

    Casual on
  • DasUberEdward Registered User regular
    edited January 2010
    In what way does a machine matter that is not directly related to its ability to impact life?

    DasUberEdward on
  • Pony Registered User regular
    edited January 2010
    Kamar wrote: »
    So Pony, you get all worked up over screams in movies? Or do you not feel anything when you see a news reel of a terrorist attack and hear the screams?

    See, there you go.

    You're acting like there's a giant thick binary line and not a gradient of simulation.

    How emotionally upsetting a simulation is capable of being is hinged on how accurate it is to the real event it is depicting (or, more importantly, how accurate it is to the expectation of the person observing or engaging in the simulation).

    Canned screams from a 1950s horror movie have no effect on me, because they are so many steps removed from anything my brain can process as an accurate depiction that my emotional reaction is minimal (if not sort of bemused in a quaint way! Yesteryear's horror movies are quaint by today's standards precisely because of how much we've progressed toward immersiveness and accuracy of simulation).

    But let's take something modern, like Hostel: Hostel made me sick. I stopped watching it around the mid-point of the movie, because I found it emotionally unpleasant (and, quite frankly, boring from a story standpoint. It sorta defines the "torture-porn" genre).

    Why was my emotional response so powerful? Because of the accuracy of the depiction. Even though I know it's fake, consciously, the screams and the depicted violence and the gleeful psychopathic behavior of the perpetrators were well outside my comfort levels.

    Yet, do I think people who enjoyed the movie are my ethical inferiors? Certainly not. Gradients are relative, and different for everyone. I'm sure there are movies I enjoyed that are deeply disturbing to other people with far different emotional allowances for what they will accept in film.

    Physical simulation via robotics is, however, a rather huge leap in that level of simulation and I think that general acceptance of that level of simulated horror is... probably not good for people.

    There's limits to how much fiction should simulate life, in my opinion.

    Pony on
  • Casual Wiggle Wiggle Wiggle Flap Flap Flap Registered User regular
    edited January 2010
    It depends on whether you believe the torture robot buyers were already psychotic to begin with or whether access to torture robots made them that way.

    There's also the argument that it's better for them to be getting it out of their system on robots rather than on their fellow humans. (Though I personally neither endorse nor reject that theory.)

    Casual on
  • Pony Registered User regular
    edited January 2010
    Casual wrote: »
    Violent video games have had a strange effect on me. They seem to have given me a bloodthirsty love of fantasy violence but an utter horror of the real thing.

    It's very fun to disembowel electronic people while shouting "skulls for the skull throne!!", but for me it's just reinforced how horrible and alien it would be to translate that into real life. For me it doesn't seem like behaviour that belongs in the real world any more than jet packs and aliens.

    Pony raised the interesting point of the line being blurred by a situation of pseudo-violence that is near as makes no difference real. Bottom line, after mulling it over, I think for me the underlying knowledge that it's fake would allow me to compartmentalise the experience into my "not in real life" head space.

    But as ever, it depends on the person in question. The argument I always make to your average Jack Thompson is: "A video game won't make someone violent; it could make someone who is already violent worse, but then it is only one of a million stimuli that could set off that behavior." In other words, simulated violence doesn't make people killers; the enjoyment of the violence has to be there in the first place.

    I think the same argument holds true here. You would have to be a fairly messed-up person to buy a robot to rip its eyes out and hear it scream to start with. Any normal person who for whatever reason tried it out would probably be turned off by the experience.

    That's only partially correct.

    It's like this: In order to enjoy a sex-droid set to "simulate rape", you'd have to enjoy simulated rape. You'd already have to have a rape fetish, going in, or at least the seed of curiosity that might lead you to want to try it out and explore that potential aspect of your sexuality.

    It doesn't make you a rapist, nor does it make you significantly more statistically likely to actually rape a person, and it may very well be that actual depictions of real rape disgust you and the idea of actual sexual assault is unpleasant.

    It's a fantasy.

    The fantasy, or at least the curiosity to try the fantasy, has to at least be there for you to want to try it.

    But, don't discount people's abilities to start to slide into extremes as they feel more comfortable in exploring something from a detached and non-real standpoint.

    Like the study that was posted a while back about how people who watch a lot of porn inevitably start getting into more "fringe" fetishes over time to satiate their curiosity and interest, people given the option to "safely" explore an aspect of themselves they might otherwise think horrible will often take it.

    And, having taken it, may find they enjoy it. That's generally how people get absorbed into the really bizarre fringes of pornography, for example.

    2Girls1Cup might be a shock video that results in revulsion for some folk... but there are dudes who saw that, were repulsed originally... and then sorta found themselves kinda turned on.

    That's just how people are, on some level or another.

    So, while you might want to tell yourself "Only an already fucked up person would jam a pneumatic drill into the bare thigh of a robotic woman to listen to her howl in simulated torment"....

    That's not entirely accurate, especially when a person can so confidently assert that "this is not real, it's no different than a toaster, it's totally okay".

    People are capable of self-justifying all sorts of things as long as they can tell themselves it's not real, it doesn't matter, or whatever they are doing a horrible thing to isn't a person and doesn't matter.

    It's basically the origin of almost every historical atrocity: our ability to overwhelm our empathetic emotional responses with a mental mantra of how divorced the subject is from ourselves.

    Pony on
  • CasualCasual Wiggle Wiggle Wiggle Flap Flap Flap Registered User regular
    edited January 2010
    Pony wrote: »
    Casual wrote: »
    I think the same argument holds true here. You would have to be a fairly messed-up person to buy a robot just to rip its eyes out and hear it scream in the first place. Any normal person who for whatever reason tried it out would probably be turned off by the experience.

    That's only partially correct.

    It's like this: In order to enjoy a sex-droid set to "simulate rape", you'd have to enjoy simulated rape. You'd already have to have a rape fetish, going in, or at least the seed of curiosity that might lead you to want to try it out and explore that potential aspect of your sexuality.

    It doesn't make you a rapist, nor does it make you significantly more likely to actually rape a person, and it may very well be that depictions of real rape disgust you and that the idea of actual sexual assault is repugnant to you.

    It's a fantasy.

    The fantasy, or at least the curiosity to try the fantasy, has to at least be there for you to want to try it.

    But, don't discount people's abilities to start to slide into extremes as they feel more comfortable in exploring something from a detached and non-real standpoint.

    Like the study that was posted a while back about how people who watch a lot of porn tend to drift into more "fringe" fetishes over time to satiate their curiosity and interest, people given the option to "safely" explore an aspect of themselves they might otherwise think horrible will often take it.

    And, having taken it, may find they enjoy it. That's generally how people get absorbed into the really bizarre fringes of pornography, for example.

    2Girls1Cup might be a shock video that results in revulsion for some folk... but there are dudes who saw that, were repulsed originally... and then sorta found themselves kinda turned on.

    That's just how people are, on some level or another.

    So, while you might want to tell yourself "Only an already fucked up person would jam a pneumatic drill into the bare thigh of a robotic woman to listen to her howl in simulated torment"....

    That's not entirely accurate, especially when a person can so confidently assert that "this is not real, it's no different than a toaster, it's totally okay".

    People are capable of self-justifying all sorts of things as long as they can tell themselves it's not real, it doesn't matter, or whatever they are doing a horrible thing to isn't a person and doesn't matter.

    It's basically the origin of almost every historical atrocity: our ability to overwhelm our empathetic emotional responses with a mental mantra of how divorced the subject is from ourselves.

    While your porn example is relevant, there are orders of magnitude of difference between that and robotic violence. Not to mention that sex and violence are two different things: I have a deep biological need for sex; I don't have a need for violence.

    Sure, I have a survival instinct and the desire to protect what is mine, and as I said in my last post it is even possible for me to enjoy violence. But I don't NEED it. The two things are on completely different levels.

    Also, the slow slide from soft into hard porn is not really comparable either. Being on a porn site looking at your usual porn, then noticing and following a link to bestiality or whatever, is not the same investment of time, money and effort as ordering a torture robot. That first step of deciding to get the robot (not to mention actually inflicting simulated harm on it) is on a completely different level. Following the link is free and simple; it's a momentary impulse action.

    I still think I'm justified in saying this is something you would have to have been heavily invested in from the start. You would have had to have thought and fantasised about it before doing it.

    Casual on
  • Eat it You Nasty Pig.Eat it You Nasty Pig. tell homeland security 'we are the bomb'Registered User regular
    edited January 2010
    It's also completely possible that we will invent rapesexbots only to discover that no one can make any money selling them, because what 99% of people actually want is an idealized, relatively removed fantasy.

    Eat it You Nasty Pig. on
    it was the smallest on the list but
    Pluto was a planet and I'll never forget
  • DasUberEdwardDasUberEdward Registered User regular
    edited January 2010
    Dyscord wrote: »
    It's also completely possible that we will invent rapesexbots only to discover that no one can make any money selling them, because what 99% of people actually want is an idealized, relatively removed fantasy.

    Hahaha. No.

    DasUberEdward on
  • PonyPony Registered User regular
    edited January 2010
    Dyscord wrote: »
    It's also completely possible that we will invent rapesexbots only to discover that no one can make any money selling them, because what 99% of people actually want is an idealized, relatively removed fantasy.

    Given that horrifyingly corpse-like real-dolls both exist and are profitable I think this idea is laughably wrong.

    Pony on
  • emnmnmeemnmnme Registered User regular
    edited January 2010
    *peers into crystal ball*
    The year is 2159. The world is a futuristic utopia complete with robot butlers and flying cars. And rape is a $50 fine instead of a felony.

    emnmnme on
  • CasualCasual Wiggle Wiggle Wiggle Flap Flap Flap Registered User regular
    edited January 2010
    Pony wrote: »
    Dyscord wrote: »
    It's also completely possible that we will invent rapesexbots only to discover that no one can make any money selling them, because what 99% of people actually want is an idealized, relatively removed fantasy.

    Given that horrifyingly corpse-like real-dolls both exist and are profitable I think this idea is laughably wrong.

    Are they profitable because lots of people buy them or because they're very expensive as an extreme niche product?

    I mean how many guys even buy a fleshlight let alone a full on sex doll?

    Casual on
  • PonyPony Registered User regular
    edited January 2010
    Casual wrote: »
    Pony wrote: »
    Dyscord wrote: »
    It's also completely possible that we will invent rapesexbots only to discover that no one can make any money selling them, because what 99% of people actually want is an idealized, relatively removed fantasy.

    Given that horrifyingly corpse-like real-dolls both exist and are profitable I think this idea is laughably wrong.

    Are they profitable because lots of people buy them or because they're very expensive as an extreme niche product?

    I mean how many guys even buy a fleshlight let alone a full on sex doll?

    Enough that it is a profitable industry that's existed for a few years now and looks like it's not going anywhere.

    Masturbatory aids, especially those intended for a male audience, have always been a niche market because the reality is the majority of dudes (for various reasons) just won't buy them.

    but they still exist.

    "It will always be too expensive" is a bad argument too.

    it'll probably stay about as commonplace as other kinds of masturbatory aids, even if it gets fairly inexpensive.

    but there will be a market for these things

    Pony on
  • Eat it You Nasty Pig.Eat it You Nasty Pig. tell homeland security 'we are the bomb'Registered User regular
    edited January 2010
    Pony wrote: »
    Dyscord wrote: »
    It's also completely possible that we will invent rapesexbots only to discover that no one can make any money selling them, because what 99% of people actually want is an idealized, relatively removed fantasy.

    Given that horrifyingly corpse-like real-dolls both exist and are profitable I think this idea is laughably wrong.

    Okay, so I'll modify my statement to "it's also completely possible that we will invent rapesexbots, only to discover that hardly anyone will buy them, because what 99% of people actually want is an idealized, relatively removed fantasy."

    I think you are wandering farther and farther from the idea that these robots will have some large social impact.

    Eat it You Nasty Pig. on
  • HamHamJHamHamJ Registered User regular
    edited January 2010
    This whole argument is dumb, because it is based on pure speculation. There's absolutely no sound empirical reason to think that this would have a net negative effect.

    You can make up all the scenarios and thought experiments you want, but without hard statistical evidence your opinion means jack shit.

    HamHamJ on
    While racing light mechs, your Urbanmech comes in second place, but only because it ran out of ammo.
  • DetharinDetharin Registered User regular
    edited January 2010
    The problem here is that people just are not thinking about the children. Will someone please think of the children!


    The sexy, sexy robot children programmed to act like they don't like it but secretly programmed to really like it.

    Detharin on
  • AmiguAmigu Registered User regular
    edited January 2010
    This topic is such a can of worms.

    Amigu on
    BitD PbP Character Volstrom
  • BarcardiBarcardi All the Wizards Under A Rock: AfganistanRegistered User regular
    edited January 2010
    Huh, I read this thread hoping to figure out the plot of Ghost in the Shell and whether anyone in that series is even human, or whether that psychologically/philosophically even matters... and I end up reading about robot rape. Oh, internet.

    Barcardi on
  • Kevin R BrownKevin R Brown __BANNED USERS regular
    edited January 2010
    HamHamJ wrote: »
    This whole argument is dumb, because it is based on pure speculation. There's absolutely no sound empirical reason to think that this would have a net negative effect.

    You can make up all the scenarios and thought experiments you want, but without hard statistical evidence your opinion means jack shit.

    This, pretty much.

    Pony, in order for your argument to be valid, you need to demonstrate that both:


    a) Desensitization occurs in humans as a result of using sex toy robotics to play out rape fantasies.

    and

    b) Desensitization of rape brings negative consequences.


    Science is about speculation and data, not just the former.

    Kevin R Brown on
    ' As always when their class interests are at stake, the capitalists can dispense with noble sentiments like the right to free speech or the struggle against tyranny.'
  • Bliss 101Bliss 101 Registered User regular
    edited January 2010
    HamHamJ wrote: »
    This whole argument is dumb, because it is based on pure speculation. There's absolutely no sound empirical reason to think that this would have a net negative effect.

    You can make up all the scenarios and thought experiments you want, but without hard statistical evidence your opinion means jack shit.

    This, pretty much.

    Pony, in order for your argument to be valid, you need to demonstrate that both:


    a) Desensitization occurs in humans as a result of using sex toy robotics to play out rape fantasies.

    and

    b) Desensitization of rape brings negative consequences.


    Science is about speculation and data, not just the former.

    If such irrefutable, unambiguous evidence existed, this thread would be pointless, wouldn't it? Why debate something if the answer is obvious from the outset? Or are you saying that nothing in our current knowledge of psychology can even touch on A and B, that there's nothing we can extrapolate from or even use as a basis for educated guesses? Gather all of the world's leading psychologists to examine the two claims above, and all they could do would be shrug and go home, defeated?

    Psychology and social psychology are notoriously complicated sciences, partly because for the most part they have to operate with statistical association rather than evidence of direct causality. I don't think the evidence you demand is going to be available until such robots have been on the market for a while. That doesn't mean there isn't any data to base an argument on.

    Bliss 101 on
  • PonyPony Registered User regular
    edited January 2010
    HamHamJ wrote: »
    This whole argument is dumb, because it is based on pure speculation. There's absolutely no sound empirical reason to think that this would have a net negative effect.

    You can make up all the scenarios and thought experiments you want, but without hard statistical evidence your opinion means jack shit.

    This, pretty much.

    Pony, in order for your argument to be valid, you need to demonstrate that both:


    a) Desensitization occurs in humans as a result of using sex toy robotics to play out rape fantasies.

    and

    b) Desensitization of rape brings negative consequences.


    Science is about speculation and data, not just the former.

    You fucking people. God forbid people have an ethical debate based on opinion and speculation! No, no, it has to be a pissing contest of statistical data and SCIENCE!

    Can't just have a discussion on what people think and feel on a subject, no, no, can't have that, the only valid debates are ones where things can be objectively proven instead of thought about and explored from a basis of what people think on the subject.

    If you think this thread is pointless and the discussion meaningless because it's purely speculative... don't fucking post in it.

    Pony on