In Star Wars, the real hero might be R2-D2 -- the only character who makes it through all six episodes without falling to the "dark side" of the Force. R2-D2 is a robot, but everyone in the film treats "him" like a person, even commending him for "bravery." As viewers, we don't have a problem with that. R2-D2 was Jim's favorite character -- he even had a stuffed-animal version of the robot to sleep with.
But as real robots become more a part of society, will we form human bonds with them? It's already happening; the Washington Post has the details. U.S. soldiers who regularly use robots as minesweepers or for other battle duties do form connections with the bots:
"Sometimes they get a little emotional over it," Bogosh says. "Like having a pet dog. It attacks the IEDs, comes back, and attacks again. It becomes part of the team, gets a name. They get upset when anything happens to one of the team. They identify with the little robot quickly. They count on it a lot in a mission."
The bots even show elements of "personality," Bogosh says. "Every robot has its own little quirks. You sort of get used to them. Sometimes you get a robot that comes in and it does a little dance, or a karate chop, instead of doing what it's supposed to do." The operators "talk about them a lot, about the robot doing its mission and getting everything accomplished." He remembers the time "one of the robots happened to get its tracks destroyed while doing a mission." The operators "duct-taped them back on, finished the mission and then brought the robot back" to a hero's welcome.
Fascinating stuff, and even more fascinating when we realize where robotics is headed. We discussed some of the implications on CogDaily last year. As robots become more involved in personal care -- serving as nurses, maids, and nannies -- it will almost certainly be difficult for us not to see them as human. The implications are staggering, and we'll continue to follow this thread on CogDaily as it develops.
My housemate has a Roomba robot vacuum cleaner, and we've definitely assigned a bit of personality to it. We often tell it what a good job it does cleaning the floor. (It really does. I was surprised and impressed at how well it works.)
Other types of robots, though, don't seem to be as easily anthropomorphized. I read blogs with Google Reader, which is essentially a robot that goes out and collects blogs for me. I haven't assigned personality to it yet, though.
I am no war expert, but doesn't this sound like the same attitude servicemen had toward their bombers, when they would name them, paint special graphics on them, etc.?
Or how about any car aficionado who names their car and treats it like a "pet"?
In this case, the robot is an essential remote tool. Without a functioning robot, a bomb tech would have to approach the IED directly to disarm or disrupt it -- certainly not a desirable outcome given that funded alternatives exist. Second, some team members become invested in a particular skill -- the skillful control of the robot in situations requiring finesse and gentle maneuvering; if the robot fails to do "its" job through malfunction, the human team members must take on the alternative risk.
The colonel just could not stand the pathos of watching the burned, scarred and crippled machine drag itself forward on its last leg.
This test, he charged, was inhumane.
Pathos or pathetic? I never knew our military was so sensitive over hardware.
He took a couple of detonations in front of his face and didn't stop working. One time, he actually did break down in a mission, and we sent another robot in and it got blown to pieces. It's like he shut down because he knew something bad would happen.
I've heard that EOD techs can get superstitious about some aspects of their work.
This sort of life- or mind-attribution to robots is really fascinating, and I think it sheds a bit of light on some of the issues surrounding AI. It is interesting how low our threshold of "like-us-ness" is (above which we begin to anthropomorphize things, as we do with dogs and cats and even other people), and also how important embodiment seems to be in this empathetic anthropomorphizing (as the Google Reader vs. Roomba example shows quite well).
We have two iRobot products. However, we do not have a Roomba and a Scooba. We have Roomba and Scooba.
A good term for this phenomenon might be "Turing Bonding."
Bonding with machines is nothing new. I've bonded with the various cars I've had over the years, including giving them names and feeling a sense of loss when I have to get rid of one. So why not bond with a robot? However, my son keeps reminding me that "it's JUST A MACHINE, so quit getting emotional about it, Mom." Intellectually I certainly agree with him, but emotionally there is a feeling of "attachment." Strange!
Soldiers become so machine-like that I wonder if they don't have a harder time distinguishing between actual machines and human beings.
This is reminiscent of many examples throughout cognitive science of attributing agency, or at least intentionality (and thus developing concepts of personality and attachment), to inanimate or unconscious entities (there is some research on whether god concepts form this way). A favorite example is the set of experiments in which participants watch shapes move around a screen and appear to interact, and then report "his" and "her" interactions. A more interactive version is the robot simulation software BugWorks, which I recall having to work with before attempting more elaborate physical implementations. When multiple bugs were interacting, it was almost inevitable that we would attribute intentionality to the interactions, especially behaviors such as chasing (which seemed like mating to some).
I think this raises a question -- is the difference between object anthropomorphization (e.g., cars, ships, etc.) and bonding with robots the fact that robots display an increasingly apparent degree of goal-directed (and, depending on the definitions and implementations, perhaps intentional) behavior? Such behaviors appear to have special resonance for humans, and are probably among the basic elements of human social bonding. As robot movement and appearance become more lifelike, I think it will be inevitable that people treat them more like equal agents.
And this is a very big deal, as even iRobot is getting into developing military-grade field robots that may be deployed in very large numbers -- the "robot stock news" blog has some very interesting information about the development of, and soldier interactions with, the iRobot PackBot. Looks like some very interesting times ahead.
p.s. got my Scooba today. My grandfather, who is 83, is so thrilled to say "I have a robot", and he can't wait to brag about it to all of his friends back in Boca. "Remember the Jetsons?"
This discussion reminds me of a piece that Paul Bloom wrote for the Atlantic Monthly titled (provocatively) "Is God an Accident?", in which he presents the idea that people may be hardwired to understand things in two very different ways: as objects (like a rock) and as agents (like a person). Here's a link to the piece.
Giving a robot a name or assigning it a false sense of personality does not make the robot social. The reality is that robots are nowhere near as social as pets; they are still just products. Yes, some people will get attached to robots, but only because they have reason to interact with them on a regular basis. With the exception of the programmer, the only real attachment comes when people can associate the robot with another pet or human. Thus the attachment is based not on the sophistication of the robot but on the memory association -- the same as being attached to a smell or a song that reminds you of another person, such as a friend or a significant other.
In the case of the colonel, he associated the robot with his crew -- hence his comment that the test was inhumane. That does not make him a case of pathos, or pathetic; it just means he is capable of empathy. How is that a bad trait?