How computers can make humans like them

More and more human conversations are taking place online. While I don't do instant messaging the way my kids like to, I'm much more likely to contact a friend via e-mail than to pick up the phone. Here at Cognitive Daily and at other online discussion forums, I've built relationships with commenters whom I've never seen or even e-mailed.

While the next leap in online communications—videoconferencing—is in its infancy, an intermediate form is beginning to show promise. Called a Collaborative Virtual Environment (CVE), it enables people to have a virtual online conference by creating digital representations of everyone they're meeting with. Instead of sending video images across the Internet, only voice and a little data about movements are transmitted. Computers on either side of the connection translate the motion data into a realistic animation of an avatar—an electronic image of each conferee. Even if fast Internet connections eventually allow widespread true videoconferencing, CVEs will still be necessary for situations when a fast connection isn't available, such as when one conferee is using a cell phone.
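To make the bandwidth savings concrete, here's a minimal sketch (in Python; the field names and sizes are invented for illustration, not taken from any real CVE protocol) of the kind of motion update a CVE might transmit instead of video frames:

```python
import json

# Hypothetical per-frame motion update: a handful of numbers instead of a
# full video frame. Field names are made up for this example.
def motion_packet(t, head_yaw, head_pitch, head_roll):
    """Bundle one frame of head-tracking data for transmission."""
    return {"t": t, "yaw": head_yaw, "pitch": head_pitch, "roll": head_roll}

packet = motion_packet(0.033, 12.5, -3.0, 0.8)
encoded = json.dumps(packet).encode("utf-8")

# An uncompressed 640x480 video frame runs to hundreds of kilobytes;
# this packet is under a hundred bytes.
print(len(encoded))
```

The receiving computer would feed a stream of packets like this into its animation engine to drive the avatar, which is why a voice-grade connection suffices.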

This brings up a serious issue: could one or more members of a CVE hack the network, sending motion data designed to win over the other participants? It's not as far-fetched as it sounds. Research on face-to-face interaction has shown that people who mimic the gestures of the people they talk to are judged to be more likeable than those who don't. In an online setting, a conferee could program his avatar to behave differently for each person viewing the conference, custom-mimicking the other conferees for maximum likeability.

But perhaps likeability doesn't work the same way in a virtual environment. Jeremy Bailenson and Nick Yee developed an experiment to test whether people respond to computers the same way they respond to humans. They gave students course credit to watch a persuasive presentation "read" to them by a computerized virtual reality embodied agent. The agent looked like either a man or a woman, and read the same script (a persuasive speech advocating requiring students to carry ID on campus at all times) in a corresponding male or female voice. The VR equipment allowed the researchers to monitor the body movements of the students participating in the experiment. For half the participants, the head and body movements of the agent mimicked the motions of the viewer, delayed by four seconds so participants didn't notice. For the other half, the agent used the motions of another viewer, recorded during a separate session. After the session, viewers rated the agent on three different dimensions: social presence (how realistic the agent appeared); whether they agreed with the agent's proposal; and their overall impression of the agent—how positively they viewed the agent. Here are the results:

[Figure: ratings of the mimicking agent versus the recorded-motion agent on social presence, agreement with the proposal, and overall impression]

For all three measures, participants viewed the mimicking agent as more effective than the agent that displayed recorded movements. The viewers were all aware that this was simply a virtual reality presentation—that there was no real person behind the avatar—and yet they still found the mimicking agent to be more effective. So it does appear that a simple computer program can manipulate complex social behavior. Perhaps as people get accustomed to CVEs, they will become more aware of the possibility of social manipulation, but in the short run, this experiment shows the potential for danger in computer-mediated communication.
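Bailenson and Yee's delayed-mimicry trick is simple enough to sketch. Here's a minimal illustration (in Python; the four-second delay comes from the study's description, but the class, its interface, and the pose representation are invented for this example):

```python
from collections import deque

class MimicAgent:
    """Replay a viewer's poses after a fixed delay (4 s in the study)."""

    def __init__(self, delay=4.0):
        self.delay = delay
        self.buffer = deque()  # (timestamp, pose) pairs, oldest first

    def observe(self, t, pose):
        """Record the viewer's pose at time t."""
        self.buffer.append((t, pose))

    def pose_at(self, t):
        """Return the viewer's pose from `delay` seconds ago, if any."""
        target = t - self.delay
        latest = None
        # Drain every recorded pose that is old enough to replay now.
        while self.buffer and self.buffer[0][0] <= target:
            latest = self.buffer.popleft()[1]
        return latest

agent = MimicAgent(delay=4.0)
agent.observe(0.0, "nod")
agent.observe(1.0, "tilt-left")
print(agent.pose_at(4.0))  # prints "nod": the pose recorded 4 seconds earlier
```

The unsettling part is how little machinery this takes: a buffer and a clock are enough to produce the likeability boost the study measured.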

Bailenson, J.N., & Yee, N. (2005). Digital chameleons: Automatic assimilation of nonverbal gestures in immersive virtual environments. Psychological Science, 16(10), 814-819.


"this experiment shows the potential for danger in computer-mediated communication."

It also shows the potential for helpfulness. Does the original study focus solely on the danger? Doesn't the same mimicking influence perceptions when real people do it?

dunno, if the CVE is being used in business, is "gaming the CVE" any different than, say, using a program to manage eBay auctions? or manage a portfolio?

but the experiments are critical to know about, nonetheless. i'd get nervous if political parties began communicating using CVEs with small groups of voters.

Hmm. Seems to me that if two humans are conversing and one mimics the body language of the other, that's a very clear indication of which one is dominant and which subordinate. Most of us like being at least a little dominant, so submissive behavior is appreciated - something that lower-ranking individuals, salespeople, & others have long learned to manipulate.

It would be interesting to find some habitual "follower" types and see if their reactions differ from "normals"; maybe even more interesting (& potentially scarier) to model a CVE avatar's movements after the body language of a habitual "alpha" to find whether test subjects display mimicry to what they see.

By Pierce R. Butler on 06 Nov 2005

[...] I stumbled across an interesting post on the Cognitive Daily this morning, about likeability in computer-mediated environments. I've long maintained that mimicry is a major aspect of socialization in online settings, from mimicry in terms of lingo to mimicry of assholic behavior on some forums I've been a part of, and this study gives some empirical support to such claims. A small study, but interesting. [...]

[...] The Washington Post has an article about the growing popularity of avatars. Gives some good information about this growing trend, but doesn't really touch on research suggesting that it's easy to deceive people using avatars. [...]