Nine days of 9 (part 2): Does AI require emotional understanding?

It's time to win another prize! Every day from now until the 9th, someone will win a fabulous limited edition companion book to the upcoming animated feature 9! It's a fabulous addition to your coffee table. Only 999 copies made! Marvel at glorious stills from the film and read comments from the film crew. Be the envy of your friends, etc etc.

Yesterday's winner, selected by the cool capriciousness of random.org, was Simon. Simon wins the prize!

If you want to win today's prize, read on...

If you're a winner, this is what the delivery will look like:

[Photo: the delivery box]

It's bigger than it looks. That's a big box.

So for a chance to win today's prize, cast your mind back to the exclusive essay by Ray Kurzweil featured two weeks ago. Ray posited that it was our ability to feel empathy with one another that enabled complex societies to form and led to the dominance of our species. Intelligence could not be measured solely by abstract reasoning or processing power, but included emotional intelligence. My question for you is this: is emotional understanding a necessary prerequisite for artificial intelligence? Or could a race of autonomous, self-fabricating robots live without it?

I think they could get along for a bit without it; maybe not forever tho

I wouldn't say that emotion is a prerequisite of artificial intelligence so much as an inevitable consequence of it. In order to function, an AI would require certain drives: to survive, to solve differential equations, to clean up trash, whatever it was programmed for. The influence of these drives on the mind of an AI sophisticated enough to think about whether or not it has emotions would be pretty analogous to what we humans describe as emotion, wouldn't it?

Of course, these AIs wouldn't necessarily experience the same kinds of emotion that we do. Humans may never understand the sweet melancholy that results from contemplating the Riemann Zeta Function, just as AIs might never understand the appeal behind Adam Sandler movies.

I'd say no.

My argument would basically be as follows: If empathy is not necessary, then obviously hypothetical AI-bots could do without it. If, however, empathy is necessary, its necessity would be a demonstrable matter: possibly a really hairy mass of game theory, but not some unapproachable mystical truth. And if the necessity is demonstrable, an unempathetic but suitably intelligent AI could grasp the demonstration.

Assuming the AI was aiming at survival, it would (merely on unemotional, rational grounds) exhibit the necessary behaviors. The universe has no "motives test". Exhibiting the correct behaviors is all it would take.

It is quite possible that human empathy is a convenient shortcut to what are ultimately highly adaptive behaviors; but the very fact that we can say what those adaptive behaviors are, and explore why they are adaptive with reference to function, suggests that any entity capable of following that logic could emulate those behaviors and achieve exactly the same adaptive ends. (In fact, it would probably be easier for an emotionless AI, since it wouldn't have to deal with the unpleasant human tendencies of spite, sadism, envy, hatred, and so forth.)

Let's begin with this: empathy requires a certain level of intelligence. It requires the ability to imagine -- to model -- others as similar to oneself and oneself as similar to others in important ways.

The converse seems to me false. There are individuals who lack empathy. We regard them as psychologically warped and socially dangerous, but not necessarily as unintelligent. That is the counter-example. The more general statement is that intelligence isn't a well-defined, cohesive faculty, and that it's easy to imagine some kinds of intelligence that don't require empathy.

I've known people with extremely limited empathic understanding who otherwise functioned in society (Asperger's sufferers, I'm sure, though I didn't ask). There are sociopaths who function well in society without running afoul of many social norms (not all sociopaths are also psychopaths).

It's also my opinion that everyone who does have any empathic understanding approaches it from a different direction and has a different scope; for example, compassions I find important can be meaningless to someone else, and social forms someone else thinks are essential seem trivial to me. How often does someone seriously mis-predict another's response, because the other's emotional priorities are different? Isn't that an essential element of society? If we all had the same emotional bases, we'd already BE robots.

So no, AIs probably don't need social intelligence, emotional understanding of each other and humans, to work in a reasonable social group. And hey, AIs have a few huge benefits we don't: if they're working from the same dictionaries, all words will mean the same to them, without additional flavors imparted by connotation. They can hook directly to each other--as long as there's software--and get a look at what the other is really thinking, complete with text and graphics. The very sameness which I referenced in my previous paragraph can work *for* a whole social group--where, in humans, the lack thereof is a defining feature.

By Galadriel (not verified) on 01 Sep 2009 #permalink

I agree

Emotions in some form are necessary for intelligence. Motivation is emotion, after all. Without motivation the entity will be only an automaton, or a catatonic.
Another typical feature of intelligence is generalizing. I strongly suspect that when a mind-like construct capable of generalizing is given motivational emotivity, this emotive facility will generalize to comprehension of motivations of others -- which is getting pretty close to empathic ability.

The key word is not 'emotional' but rather 'understanding'. Until AI manages to produce something akin to the understanding of anything at all, then whether it needs emotional understanding is a moot point.

It is clear that understanding is not required in order to be autonomous and self replicating at the microscopic level. However, at the human scale the world is sufficiently varied and complex that any mechanism incapable of understanding its environment will not be able to survive unaided. Once we have cracked that one, we can start to worry if emotional imperatives are a good idea.

In my view, emotion is our perception of electro-chemical signals in our nervous system, which we perceive as motivational drives towards our goals. Does AI need this? Given multiple goals, an AI system would need some incentive mechanism for selecting which goal to pursue at a given time. An emotion-like system seems like a good bet, but there are other options.
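
(Just to make that idea concrete: here's a toy Python sketch, entirely my own invention rather than anyone's actual design, of an emotion-like incentive mechanism. Each goal has a drive whose urgency builds over time and resets when serviced, and the agent simply pursues whichever drive is currently loudest.)

```python
# Toy sketch (invented for illustration): competing drives as an incentive mechanism.

class Drive:
    def __init__(self, name, growth):
        self.name = name
        self.level = 0.0      # current urgency of this goal
        self.growth = growth  # how fast urgency builds each tick

    def tick(self):
        self.level += self.growth

    def satisfy(self):
        self.level = 0.0      # servicing the goal resets its urgency


drives = [Drive("recharge", 0.3), Drive("clean up trash", 0.5), Drive("self-repair", 0.1)]

for step in range(6):
    for d in drives:
        d.tick()
    current = max(drives, key=lambda d: d.level)  # pursue the most urgent drive
    print(f"step {step}: pursuing '{current.name}' (urgency {current.level:.1f})")
    current.satisfy()
```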

For a complex society like ours, understanding of one another's needs and drives (and hence emotions) is necessary for cooperating. But complex cooperation doesn't necessarily require this; I'd argue that slime mould doesn't emote.

So, I'd say yes a bit, but mostly no.

AI would need something that would both drive it to complete certain goals and also hinder it from doing things that would be harmful to its environment, people, and other AI. Emotions serve this goal in people, or at least typically do. However, if a system could be implemented that served a similar purpose, then emotions as people feel them and have them may not really be necessary. Of course, if the AI self-replicates, and if it could even modify its actions and abilities, then something similar to emotion would certainly develop to deal with problems such as motivation and cooperation. So if it were necessary in a self-replicating AI, it would inevitably develop. It could take years or more to do so, but something to handle these problems would certainly show up.

It all depends on what we're trying to achieve.

An AI could rely on a group of sensors to detect some of our emotions. However, as long as they can't "understand" the emotion, they will probably not understand all our intentions. We must take it with a grain of salt, however, because once we achieve strong AI, it's highly possible that it will understand how our bodies work. They would understand how emotions are generated and interpret our goals that way.

There are still emotions that would be important for a strong AI. Regret and guilt are two emotions that could help keep an AI on the right path. This could be represented in tons of other ways; the goal is only to help the AI learn by experience and learn from its errors.

Empathy and respect go well together. If, for the AI, our lives are worthless, they won't have any empathy or respect for us. They will go their way and never come back, unless we're a threat to their survival, which I seriously hope we won't be. I doubt it would happen, however. A strong AI would probably respect us for our achievement in creating them. We'd be their fathers, their creators. With respect would come empathy.

Is emotional understanding necessary for artificial intelligence? Plainly, no. A lot of animals are doing pretty well without it. Is it necessary for communication with us? Absolutely. However, there's a good chance that a strong AI would develop this skill by itself in only a few versions (generations). That's my opinion.

By Simon Dufour (not verified) on 02 Sep 2009 #permalink

@Mike

"..emotions as people feel them and have them may not really be necessary"

You make an interesting point. It makes me wonder: if emotions drive us regardless of whether or not we 'feel' them, what's the point of feeling them at all? Evolutionarily speaking, what's the adaptive advantage of self-awareness? Or is sentience just a spandrel?

I think that if you want to call an AI really "intelligent" it will need some emotional (or affective) component to its programming.

If all it does is execute a set run of commands, I fail to see how this is any different from many of today's machines, which are capable of running commands with very little human input.

In order to become a race of autonomous, self-replicating robots, they need to have motivation: motivation to be autonomous (otherwise all humans would have to do is reprogram them to stop; not very intelligent, that) and motivation to continue to self-replicate (otherwise why are they doing that, if they are allegedly intelligent?).

If you look at good AIs from sci-fi, they all have emotive components (however much they want to deny it). HAL had his pride. V.I.K.I. and a number of other 'evil' AIs have true concern for humanity. Others (Terminator) have pure malice.

Is emotional understanding a necessary prerequisite for artificial intelligence?

No. We have plenty of limited AIs today that have no concept of emotional understanding. But as to the larger question...

Or could a race of autonomous, self-fabricating robots live without it?

Yes, I think they can live without it, but with it, they can increase their quality of "life" and vary their experiences. This variance will allow them to be more adaptable when something major happens to their way of life. An analogous situation for humans would be the Amish having the upper-hand when close-range cell-phone emissions cause mutations in all other humans, turning them into zombies.

I think, as has already been mentioned, there is a scale to empathy even within the human race, so I think it would be entirely possible to have an AI society without that trait. It might be hard to imagine, but there could be functional interactions between a group of programmed individuals based entirely on assessments of probability.

Let's say there were cities established and populated almost entirely by AI, set to task providing energy or refining resources for other, more organic societies. An outsider might observe that the daily schedule of such a city would be entirely mechanical and unchanging, but this wouldn't be quite right, as the city will make small adjustments due to environment and changes in probabilities. Given stronger directives, again without necessarily requiring empathy, this change over time might be even more apparent. The more adaptable the programming, being an acceptable random element, the less harsh this AI society might appear. Novel collaborations, arguments and social institutions could conceivably appear. Internal society might still break down in this system, but that's entirely possible in an empathetic society as well. Breakdowns like this have occurred and have been contained, so I don't think we have to fear destruction of the human race as in 9.

Interactions with outsiders might be tougher to gauge, as those with empathy expect to see empathy returned. Perhaps we'll let the AI societies function autonomously, as without sufficient understanding of motives and emotion, interactions will be awkward at best. A socially-competent probability-based AI would need to be ready for multiple reactions to an action, and especially be ready for unexpected reactions from non-AI. This could be translated to a sort of an emotional intelligence without necessarily having the same wiring as empathy. Still, something like this could lead to events as emotionally disturbing as an AI fawning over a human child because it assesses that such activity is the most acceptable in the situation, but who's to say we'll hold AI to the same standards as we do ourselves? Emotional robots may be our goal, but there will be another uncanny valley between here and there.
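
(For what a "probability-based" social choice like that might look like, here's a small sketch in Python; the actions, probabilities and acceptability scores are all invented purely for illustration. The AI weighs each candidate action by the expected acceptability of the reactions it predicts, then picks the best.)

```python
# Toy sketch (all names and numbers invented): pick the action with the best
# expected acceptability over the reactions the AI predicts it might provoke.

# predicted reactions per action: list of (probability, acceptability score)
predicted = {
    "offer help":   [(0.7, 0.9), (0.3, 0.2)],   # usually welcomed, sometimes resented
    "stay silent":  [(1.0, 0.5)],               # safe but unremarkable
    "crack a joke": [(0.4, 1.0), (0.6, -0.5)],  # high reward, high risk
}

def expected_acceptability(outcomes):
    return sum(p * score for p, score in outcomes)

best = max(predicted, key=lambda action: expected_acceptability(predicted[action]))
print(best)  # -> offer help (expected score 0.69, vs 0.5 and 0.1)
```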

By ABradford (not verified) on 02 Sep 2009 #permalink

I don't think that emotional understanding is a necessary prerequisite for artificial intelligence.

And certainly, there are some pretty simple autonomous, self-fabricating robots that live without it: the Game of Life bots are all quite neutral, even when they die. It's fairly easy to imagine that more complex AI could work happily together too.
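
(For anyone who hasn't met them: the Game of Life really is that neutral. Here's a minimal Python sketch of its rules; note that a cell lives or dies purely by neighbour count, with no emotional state anywhere in the system. The glider pattern is the standard one, used here just for illustration.)

```python
# Conway's Game of Life in a few lines: cells live or die by neighbour count alone.
from collections import Counter

def step(live_cells):
    """Advance one generation; live_cells is a set of (x, y) coordinates."""
    # count live neighbours of every cell adjacent to a live cell
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live_cells
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # a cell is alive next tick if it has exactly 3 neighbours,
    # or 2 neighbours and was already alive
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live_cells)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    glider = step(glider)
print(sorted(glider))  # the glider has drifted one cell diagonally, feeling nothing
```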

But I'm with Simon on the empathy and respect angle - if the AIs didn't have some sense of respect for each other, then perhaps the basic "moral" rules that most humans live by would have to be hard-coded: killing others is bad; if you do bad to others, bad things will happen to you - which are quite practical things really.

But if we want these AIs to live with us, then yes, I think emotional understanding would be needed - for us, if not them.

The capriciousness of random.org has selected
#7 Juuro
as today's winner! Juuro, email your details to winner@sciencepunk.com to collect your prize, and CONGRATULATIONS!

Next chance to win begins at midnight!