So, this was linked to me by a buddy of mine:
It's nothing really amazing, in and of itself. It's a RealDoll with some upgrades that let it make noise, plus a few other features. It's certainly not a sex-droid on par with what you see in sci-fi.
But it's a step. It's a step towards things you and I are likely to see in our lifetime.
Ultimately, something like the robot in the above video is a masturbatory aid. It's on the same level as Fleshlights, RealDolls, and a watermelon lightly heated in the microwave with a hole cut in it.
Doesn't have any appeal to me, but some people are into that sort of thing in the privacy of their own bedroom, and that's fine. If you're the sort of dude who finds your own hand insufficient and needs a device or a machine, well, that's your business.
But here's the thing: This is the beginning of "progress" in a certain direction.
A direction of fuckable robots.
While the above "robot" is barely such, that won't always be the case. Progress marches on, and so on. People will build on ideas like this, advance them, refine them.
There's a market for sex-droids. We all know there is. Even if that market doesn't include you, and doesn't include me, it includes lots of people, and if there's a reasonable market for something, eventually people will come along and develop products for that market.
So, as progress marches on, eventually these kinds of sex-bots will become ambulatory. They will have movement, expression, and deeper and more complex capabilities for expression and programmable... "interests".
This is just a thing that's going to happen.
Here's my issue with this sort of thing: While most people are familiar with the "Uncanny Valley" as it relates to the appearance of not-human things trying to look human, there's also an intelligence Uncanny Valley.
Any kind of sex-robot is going to need some kind of programmed behavior: some kind of rudimentary "AI", even if at the most basic level that AI is just "play moan.wav once entered".
As these sorts of machines are developed to have more mechanical functions, they will have more software complexity as well.
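Just to make concrete how low that baseline really is, here's a hypothetical Python sketch (the names, events, and sound files are made up for illustration, not any real product's code) of the "play moan.wav once entered" level of "AI": a stateless lookup from sensor events to canned sound files.

```python
import random

# Hypothetical illustration: the most basic "AI" is just a lookup table
# from sensor events to canned responses. No awareness, no state, no feeling.
CANNED_RESPONSES = {
    "touch": ["sigh.wav"],
    "entry": ["moan.wav", "gasp.wav"],
    "idle": ["breathe.wav"],
}

def respond(sensor_event: str) -> str:
    """Pick a canned sound file for a sensor event; fall back to idle."""
    options = CANNED_RESPONSES.get(sensor_event, CANNED_RESPONSES["idle"])
    return random.choice(options)

print(respond("entry"))  # one of "moan.wav" / "gasp.wav"
```

Everything past that baseline — motion, expression, programmable "interests" — is just more tables, more sensors, more software, which is exactly the complexity curve described above.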
Where that intelligence Uncanny Valley comes in is when you have created a robotic facsimile of a human that can exhibit an eerily similar approximation of human behaviors, but without being aware of them or really "feeling" them in any sapient way.
It wouldn't be a person, it would be a machine, but it would appear to act and move like a person. It doesn't even have to pass a Turing Test to be creepy as fuck on a primal emotional level for a lot of people (myself included!), but what's more important to me is the ethical questions these things start to raise when you get to that level.
I'm not talking about "robotic people". I'm not talking about sentient and sapient AI that are capable of saying "Why do I exist?" I'm not chicken-littling over the idea of one day our sex-droids "going Skynet" or some damn thing.
That's sci-fi. The likelihood of us seeing human-equivalent AI in our lifetime is not exactly high, and isn't a component of what I am talking about here.
No, what I'm talking about is a much closer point we are rapidly approaching in artificial intelligence and robotics: machines that are capable of appearing sentient, sapient, emotional, or otherwise "feeling", regardless of whether such things are objectively "real".
Consider this: You have a robot that is capable of moving and vocally responding to sex as if it were a person. It's not going to pass any kind of Turing muster, but it's pretty close. Some dude owns, and fucks, this robot. Really, this isn't much different than an overly complex RealDoll or Fleshlight.
Now, given public sensibilities about privacy and not wanting to gross out your friends and neighbors, chances are most people who own a thing like this are going to keep it out of sight when it's not in use, and likely never speak of it to their friends.
Again, reasonable, and not much different than the masturbatory aids we have today.
There's not much in the way of ethical issues at work here. If the dude just sorta leaves his sex-bot lying out on the couch, naked with semen stains on it, when company comes over, or insists that his friends acknowledge his sex-droid as his "girlfriend", obviously reasonable people are going to find him an unpleasant creep. But that's no different than a guy leaving his sticky porn mags on the coffee table when he invites you over. Creeps are creeps. Owning a sex-droid wouldn't necessarily make him a creep, especially if you never know about it.
But, there's icky places this starts to go.
Say, for example, this fellow has his sex-droid designed to move and vocally respond as if it's being raped, instead of having simple consensual sex. That's a fetish for him, and he's had his robot programmed to respond and act in a way that fulfills it. People have rape fetishes. These people exist, and might be more numerous than you think. For the most part, they are more fascinated with the idea of rape than with the actual act; they won't actually become rapists, and wouldn't want to be raped for real.
But they'll watch simulated rape porn, and jerk it while fantasizing about rape. This might make you recoil in disgust (I know it makes me sick!), but the reality is that many of these people are just fantasizing, and are no more likely to actually rape someone than anyone else. They won't watch actual footage of rape, but things designed to emulate it certainly turn their crank.
Watching rape-simulation porn, or even fucking a sex-bot designed to emulate being raped, isn't the equivalent of raping a human being. The robot, in this example, isn't a person. It doesn't have sentient or sapient intelligence; it's not even comparable to an animal.
Yet, is fucking a sex-droid that is emulating being raped the same thing as watching simulated rape porn? The argument people make in defense of things like simulated rape porn is that it isn't real. And certainly, the sex-droid isn't really being raped.
However, we are creatures motivated by our sensory responses. If the robot can emulate the physical and vocal responses of a rape victim, at what point does the "it's not real!" defense become a little shaky? It's one thing to look at images of something, knowing it's actors performing and not real; it seems quite another to start directly emulating the act yourself on an object.
People's grip on the "real" is not as iron-clad as some folk might romantically like to think. While I'm not arguing that a person who jerks it to simulated rape porn, or "rapes" a sex robot, is de facto on the road to becoming a rapist... I think it would be a little naive to say it's impossible that this sort of line-blurring via machines could result in... something deplorable.
Human beings are empathetic. We empathize with each other. Despite what some folk might like to argue, the primary motivation for ethical human behavior isn't fear of divine retribution. People don't necessarily need churches or religions to be moral. They just need society and culture to reinforce their basic emotional capacity to feel compassion for others.
If a person surrounds themselves with sensory input that directly contradicts this, it is in fact very possible it will damage their ability to adequately empathize with others. This is the "desensitization" argument, essentially.
I'm not Jack Thompson or something, arguing that "murder simulators" are going to turn our children into spree-shooters. I'm not even arguing that people who jerk it to simulated rape porn are going to become rapists.
But, what I am saying is that the progression of robotics and AI, and more importantly how we treat them, is potentially damaging to basic human empathy.
The closer robots get to humans, the more important it becomes that we stay aware of this fact. It's hard to argue that a man is going to lose his grip on the separation between reality and fantasy and go from having a rape fetish to becoming a rapist when all he does is jerk it in front of a computer monitor. However, when he's fully engrossing himself in the physical experience of the act, it's hard to argue that it's not at least starting to desensitize him to the real deal. It's one thing to watch something, but if you find yourself able to grab, restrain, and physically overpower your synthetic "victim", ignore its extremely accurate vocal protestations, and still maintain arousal and enjoy the act... I think that sensory input can damage your empathy in the long term.
You might find yourself saying, "But that's no different than a couple consensually simulating a rape scenario. If a man is having a rape fantasy with his wife, and they've got safe words and other elements in place that keep it just a fantasy, how is the robot any different?"
Because the robot can't give consent; it's not a person. Even as a husband is acting out a rape fantasy with his wife, he still has a level of conscious detachment from actual rape, because he knows his wife is consenting, that she's still a person who wants it and is capable of actually stopping it if she doesn't.
The reality is, things like rape aren't just about sex, they're about power, and actual rapists are usually seeking power over their victims, not just sexual gratification. Rapists don't generally see their victims as people. They don't see what they did as wrong, because "she really wanted it", or because "it doesn't matter, she's just a whore". It's power through objectifying your victim. When your "victim" is already an object, that's a different thing entirely from a consensual participant you know and acknowledge as a person.
So, a person emulating the act in such a way is getting all the sensory input of the act, the sensation and feeling of it, but removing that crucial element of consent from the fantasy. While they aren't equivalent to people committing the real crime, they're exposing themselves to an essentially identical mindset and stimuli, and enjoying it. I think that is probably not a good thing for a person's psychological state and empathetic relationships to others.
This doesn't just apply to rape fetishists, though. It also applies to people interacting with robots and AI on that human-simulation level in other socially and emotionally reprehensible ways.
Some people play a game like the Sims only to torture the little digital folk. They build houses without exits and watch the Sims go mad with isolation. They starve them to death. They watch in amusement as the Sims accidentally set fires, panic, and burn to death. They'll trap them in a pool they can't climb out of, and watch them eventually fatigue themselves into drowning.
This is sorta morbid, but really isn't all that ethically repugnant and is hardly going to turn the person into a sociopath. It might lead to them torturing goldfish or some other kind of similar creature, but that's unlikely.
But, what's important to note here is that people are watching the Sims emulate suffering because the Sims are designed to emulate suffering. Even if a Sim's AI is programmed to acknowledge that "if my Hunger bar empties, I will starve to death and die, and that is bad", it won't automatically display the visible signs of starvation as we recognize them... unless we program it to do so.
A Sim that is starving will clutch its stomach, moan in pain, and even refuse to obey commands to do anything. It will emulate suffering in a way we humans recognize, even though it doesn't have to in order to function. We've made it look like it is suffering. It's not "really" suffering on a level we can equate to a living thing, but golly... it looks like it.
Torturing Sims isn't going to lead to you torturing people... but torturing human-facsimile robots sure as fuck might.
Because Sims are little clusters of polygons on your computer screen, while a robot is a physical being right in front of you. We are sense-oriented creatures, and when something can emulate nearly all the sensory input of the real thing, we can in fact lose touch with it "not being real".
A person watching an actor have a special-FX facsimile of an eyeball ripped out of their head in the movie "Hostel" knows what they are seeing is fake. It might be a little morbid to enjoy such a thing, but it's certainly not the same thing as doing it, and it isn't likely to lead to you acting it out on folk.
However, the same person ripping the eyeball out of a very human-appearing robot (complete with fake blood!) while it screams in (simulated) agony is probably not doing something healthy to their psychological state. Even though the person knows the robot isn't real, even though they know consciously that the robot isn't a person, is only emulating these responses, and isn't "really" feeling them... it's still desensitizing them to the act. They are still going through the identical motions of the act.
Simulation can be dangerous when it's so real it desensitizes you to the real thing. That doesn't mean you will lose sight of reality entirely and see the simulation as what's real, but it does mean your emotional and empathetic reaction to the real thing will be blunted.
I don't think that's healthy!
Human-simulation robotics, in my opinion, is dangerous territory in this regard. As it becomes more and more advanced and makes human simulation more convincing, I think it can be ultimately damaging to the societal value of human life.
So, I see videos like the one at the top of this post, and I become very uncomfortable.
How do you feel?