Does "yes" mean "yes" - for a robot?

At io9, Annalee Newitz asks, "can robots consent to have sex with humans?"

Do you think the blondie bot in Cherry 2000 was really capable of giving consent to have sex with her human boyfriend? Or did her programming simply force her to always have sex, whether she wanted to or not? And what about the Romeo Droid in Circuitry Man, or the Sex Mecha in AI, who live entirely to sexually please women, even when those women are abusing them or putting them in danger?

Obviously this isn't an urgent social issue. An insentient robot is just an appliance, not a person, and a truly sentient AI doesn't yet exist. But it's an interesting question, because it depends on what "programming" means for an AI.

Sexual desire might be described as a basic part of human biological "programming": normal adult humans are evolutionarily driven to desire food and sex. Of course, we also have free will, meaning our prefrontal circuitry is equipped to override those basic impulses. Philosophy students can debate whether our perception of free will is fundamentally illusory, but on a day-to-day basis, we do have the ability to overcome our basic desires. In order for an AI to be truly "sentient" in the classical science fiction sense - having "personhood," perhaps even legal personhood - wouldn't some comparable, human-like free will be required? (Yo, philosophy students: can you even have sentience without a perception of free will? Discuss).

Because it seems to me that if we are talking about a robot with free will, then its "programming" is something much more complex than what we're used to thinking of with reference to current computers - because that program must encompass a mechanism that produces free will. (I'm not sure that sentence is even coherent, but let's go with it). Such a human-like robot could experience conflicting impulses generated by different parts of its program - just as we humans do. But its sense of "self" and any autonomous desires would be produced by its program, not separate from it - just as it is for us. Does that make sense?

Where I can see there being an ethical problem is in the unlikely situation that the sentient robot is given some free will and autonomy, but not enough to successfully resist its sex programming. In that case, the robot could genuinely suffer from the constant conflict between its autonomous desires and its programmed desires: it will feel driven to act against its will. Programming such a sex-addicted robot would certainly be ethically questionable, if not outright wrong.

What do you all think?


I'm not entirely convinced human beings have free will. Certainly many of our thoughts are so strongly influenced by our mental wiring that the whole idea of free will is very suspect (look at the recent redo of the Milgram experiment).

It could be argued that while humans reproduce as a matter of course, a robot, even with a substantial amount of free will, has been manufactured with a purpose - in this case to function as a sex toy. It might have free will or a simulation of free will in some circumstances, but not when it concerns its primary function. It's the same way we consider treating children with love to be normal, and mistreating them to be badly abnormal - because our ancestors wouldn't have survived unless they loved their children, and presumably the ones who weren't as well wired didn't pass on their genes. We don't love of our own free will, but because it's a survival trait, and love tends to trump all other emotions in normally functioning human beings. A sex robot might "feel" conflicted about sex, but if its thought processes worked in a similar manner to human beings', it would probably come up with excuses to justify its feelings, and go along with its core mission no matter what.

Of course, there wouldn't be a point to allowing it enough free will to feel conflicted, any more than you'd give an autonomous car free will - if I say I need to go somewhere in a hurry, I certainly don't want the car to have the will to decide we're going to take the scenic route. Certainly I've experienced annoyance with my digital video camera when it "thinks" I'm focusing on a certain object and the image repeatedly goes blurry.

Then again, Futurama teaches us that having sex with robots will destroy the world, so maybe we should rethink the whole idea of sex robots ;)

A machine is a machine. A human is a human. An animal is an animal. Human rights for humans, animal rights for animals, machine rights for machines.

Machine rights? A bit of oil and recharging when necessary!

I remain to be convinced that any of these sci-fi "what if the robots were almost human?" questions are anything more than an intellectual wank-fest, less interesting than watching angels dancing on pin-heads!

I know some pretty good-looking robots.....

By pinko liberal commie (not verified) on 23 Dec 2008 #permalink

What does an insentient robot mean? If I program a robot with two different impulses that conflict with each other, can I say the robot is sentient? I think we should look carefully at the word "program" used as a verb. When we create a robot, programming is our main approach. Just as in math, the data act as parameters governed by a certain formula, and with that formula and some initial parameters we can predict how the data will behave.
That is how we create a robot. We always design certain programs for our machines to follow, no matter whether they are allowed one desire or two. However the two impulses conflict, there must be some definite principle for them to obey. For example, I can design a program in which the robot follows the more important of its two desires, after a series of jobs estimating their values. And the criterion for estimating is entirely what I give it.
But that still may not be called "free will". What does free will really mean in human terms? First of all, free will can only show itself in a conflict between two or more wills. That is necessary, but not sufficient. Consider ourselves: how do we decide our actions? For instance, we have sexual desire, but we don't always put it into practice at once, because we still have other things to consider. In other words, besides the sexual desire we also have other desires at the same time. Those desires may not come from our instincts as creatures; they may be given by our culture, our society, even by each person's different life history. That is the first thing no one can wholly predict. And the criterion also differs from person to person: given the same things, some may endow them with more value while some do not.
In a word, chaos is the basic condition for genuine "free will". But we cannot let robots work in a chaotic way, so it is rather hard to create some comparable, human-like free will.
Do you get it?
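The arbitration scheme described above - follow the more important of two conflicting desires, by a fixed, designer-supplied criterion - can be sketched in a few lines of Python. The names and values here are purely hypothetical, just to make the point that the outcome is fully determined by the programmer's weighting:

```python
# Hypothetical sketch: a robot resolving two conflicting desires by
# comparing designer-assigned value estimates. Because the criterion
# is fixed by the programmer, the outcome is fully determined.

def choose_action(desires):
    """Return the desire with the highest estimated value."""
    return max(desires, key=lambda d: d["value"])

desires = [
    {"name": "obey_sex_programming", "value": 0.9},
    {"name": "avoid_danger", "value": 0.6},
]

winner = choose_action(desires)
print(winner["name"])  # the designer's weighting decides, not the robot
```

Whatever "conflict" the robot experiences here is resolved mechanically; nothing in the program leaves room for the robot to reject the criterion itself, which is exactly the commenter's point.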

"can you even have sentience without a perception of free will?"
Reflexive nit-pick: Despite the word that science-fiction writers automatically use for this, I think the word that you're looking for is "sapience" rather than "sentience".
("Sentient": capable of subjective experience. "Sapient": Self-aware [i.e. conscious])
I would argue that all "higher animals" are almost certainly "sentient" (I would tend to suspect that the difference between even "higher" and "lower" animals [leaving the terms intentionally vague] is likely one of degree rather than a simple binary sentient/non-sentient). So far very few have been shown to have signs of sapience (I seem to recall a classic "can a non-human recognize an image in a mirror as representing itself" test, and that at least one primate seems to be able to. I thought I'd read of a similar type of test showing evidence of sapience in dolphins as well, but I'm not certain about that.)
Of course, sapience might be a sliding scale as well. Heck, many people I meet seem to be only marginally self-aware much of the time...

I think that, like a lot of hypothetical questions, this raises more questions about humans than it does about robots. I mean - at the base, this is a question about human identity & the question of free will. I mean - if a human being is more inclined towards sexual activities during, let's say, ovulation - is it ethical to have sex with THEM? I mean, you have humans awash in mood-altering hormones - seems that "programming" is splitting hairs, to me.

There is no a priori reason that I can think of why the robot would have any self-interest of any sort. Self-interest is an adaptation generated by natural selection, not sentience. No evolution by natural selection, no self-interest - unless self-interest was explicitly programmed in, and in that case its interests would be limited to whatever was explicitly designed in. Anyway, even a sentient robot would not care whether at any particular moment it has sex or not unless its designers for some reason thought it should. There is also no reason why sex would have any particular significance at all to a robot, or whether it would even be 'sex'.

Hm.
OK, so, at some level we say this: a robot is a machine, a tool. Think Roomba. Think vibrator. Do you ask your vibrator if it consents to the service it provides? (I make assumptions here, of course, when I say "you", but bear with me for argument's sake.)

Nor, I suppose, do the lonely shepherds in the hills ask permission prior to a little tup. We can argue about whether they should, I suppose, but how, after all, would the sheep communicate the response? Or we could say that it works as with a human: lacking clear affirmative response, the assumption is always "No."

At what point do we (will we) consider AI to be sufficiently advanced to account for autonomy, sentience? When does the "A" in "AI" stand for "actual" instead of "artificial"? That's the real question. It's when we can consider a machine to be truly sentient that the question has any meaning -- until then, it's a machine, and it doesn't "care" either way. Like the vibrator. Keep its batteries charged, and that's all that matters.

I'll throw another ethical question into the mix here: Suppose one were to take a sex-robot and program it -- specifically program it -- to resist, to object, to evince physical or emotional pain. Purely artificial, of course; purely mechanics. But does that change anything? Is it any less ethical to use a sex tool when it struggles and says "No, please, stop!"?

Hm, indeed.

Dear Bioephemera:

Well, never mind the problems with AI generally - because the computational model of the brain is untenable (see Noam Chomsky's later linguistic works).
But putting that aside, free will should not be the determining factor for treating something with fundamental respect (now all the philosophy students stand up and shout is/ought fallacy!). You could look at Peter Singer's Animal Liberation (the first chapter) and get a good breakdown of "personhood" (which you might hinge on some so-called 'practical' theory of free will), but the real determinant is whether the object can suffer (as he goes on to explain at length).
Suffering is the first noble truth for a reason, and even that movie AI shows it. When the AI are being stoned, the real people only stop because they see evidence that the boy can suffer. Suffering being universal, they feel kinship of some kind.
Now, you can argue that that is just a yuck response, but if you accept basic moral principles, like "if it is wrong for someone else, it is wrong for me", then I think the suffering criterion is a good one.

Love your work!
JJ

PS. When you say that our "prefrontal circuitry is equipped to override those basic impulses", you are just saying that a determinate thing is overridable by another determinate thing: I think philosophy students would have a problem with that, all other arguments concerning free will notwithstanding.

Personally, I question whether or not the conscious self has any actual control at all, at least in the present tense. All you'd need, which admittedly would be very complex, is to assign various weights to inputs, plus a system that allows biases toward those inputs to develop, thereby altering the weight given to whether or not to perform whatever action is suggested - whether that suggestion comes from a person, a robot, or a subsystem. I think that may actually be very close to how we function: a non-conscious element does something, the conscious element observes how that plays out, and a bias is generated or altered.
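That weight-plus-bias loop could be sketched roughly like this - a toy model only, with made-up names, threshold, and update rule, not a claim about how brains or robots actually work:

```python
# Toy model of the idea above: weighted inputs suggest an action, the
# "conscious" observer sees how the outcome plays out, and a per-input
# bias is nudged, altering how that same input is weighted next time.

class BiasedAgent:
    THRESHOLD = 0.5  # arbitrary cutoff for acting on a suggestion

    def __init__(self):
        self.bias = {}  # accumulated bias per input source

    def decide(self, source, weight):
        """Act if the input's weight plus learned bias clears the threshold."""
        return weight + self.bias.get(source, 0.0) > self.THRESHOLD

    def observe_outcome(self, source, good_outcome, rate=0.1):
        """Nudge bias toward sources whose suggestions worked out well."""
        delta = rate if good_outcome else -rate
        self.bias[source] = self.bias.get(source, 0.0) + delta

agent = BiasedAgent()
print(agent.decide("subsystem_a", 0.45))   # below threshold at first
agent.observe_outcome("subsystem_a", good_outcome=True)
print(agent.decide("subsystem_a", 0.45))   # learned bias now tips it over
```

Nothing in this loop requires a "self" that chooses; the decision shifts over time purely because the bias term drifts with observed outcomes, which is the commenter's point about control being retrospective rather than present-tense.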

That's a fascinating question you pose, BioE. I think a sentient robot programmed in that way would be very much like a human being who suffers from an adverse compulsion, like alcoholism, drug addiction, pedophilia, etc, where they don't want to engage in the adverse behavior, but cannot stop themselves. Since it would clearly be unethical to intentionally induce such a mental state in a human being, I suppose it would also be unethical to induce it in a sentient robot. But the key would have to be that the robot would have a rich enough internal state that it could both construct a desire to not engage in the behavior and suffer from its inability to resist.

Tried to comment before - didn't work.

Anyway, I can't see how sentience implies self-interest. Evolution through natural selection creates self interest as an adaptation (even viruses behave as if they had self interest). So a being that has evolved and is sentient has self interest. But a being that is designed would have no particular interest in itself (unless that was explicitly specified by the designer). Therefore, I don't see why a robot, after being in contact with human genitalia, would feel dirty or abused or demeaned or put upon or assaulted unless some programmer wrote a 'feel dirty' subroutine.

The assumption that self interest is a product of sentience is anthropocentric (or organocentric) and is rife in scifi (e.g. Matrix, and 2001 A Space Odyssey). Some authors get it right though. In I Robot, Asimov has the sentient robot hub rebel against humanity, not because it is self-interested, but simply because of its literal interpretation of the rules that were programmed into it. Dan Simmons, in the Endymion books, has self-interested AIs but they evolved from simpler programs competing over networks rather than being designed.

I agree with Comrade PP -- a sentient robot like what you describe would be analogous to a human being who suffers from addictive behavior. There are plenty of people who describe being unable to resist addictive urges while being fully aware of their actions. In our case, that's because of physiochemical effects in the brain. For a robot, it would be because of software. In fact, I would guess that the first forays into true AI in the future will probably produce exactly this kind of robotic mental state (suffering from conflicting desires) because it's hard to make complex programs fully integrated. Just think for a moment how chaotically unintegrated Windows is. :)

(And running with that idea for a sec, one could imagine a future in which sentient robots who have been created and coded by dysfunctional human coder groups lobby to be re-coded by open source coders, hoping to emerge from the process with a more integrated personality. It would be sort of like humans who go into rehab or therapy, driven by the desire to harmonize and control their own mental state.)

Good heavens. This is what I get for posting a provocative question just before getting in a rental car and driving a few hundred miles without WiFi - a comment deluge. Sorry for the belated response - obviously many of you posted without being able to see each others' comments, for which I apologize!

Mordicai - EXACTLY. Thanks for seeing precisely why I posted this! :)

Sammy - this is *totally* an "intellectual wank-fest", as you so colorfully put it. But a kinda cool one.

To all of you commenting that we don't have free will, per se, I probably agree. We make decisions based on criteria which are the product of our biology and our experiences. We can't escape this context, so our decisions can never be truly "free." Which is why I went straight to the issue of *perception* of free will. I think we need that perception in order to be independent actors - just as we need various visual system illusions to see effectively. (Jonathan: I think that your point about the brain being a determinate system is right on, and I'm not honestly sure how to deal with it, nor with Noam Chomsky, for that matter.)

Epicanis: the words "sentient" and "sapient" are fudged in SF/F, and I totally see where you're coming from. However, "sapience" connotes wisdom, whereas "sentience" connotes consciousness or awareness. It's really self-awareness we're talking about here (which we don't know how to measure or define properly anyway, so what the heck.)

Barry: my boyfriend actually made the vibrator argument to me already. However, he did not move on to sheep, so you get bonus points.

PP, Colin: yup! Although the idea of AIs running on Windows has got to be one of the most horrifying future dystopias ever envisioned. Especially if we're talking about sex AIs here. Sex with Windows? Eeeuwuwuww!!!!!