The BBC has an article about the latest computer "mind-reading" technology. It's not as sinister as it sounds: the computer is programmed to monitor human facial expressions and attempt to recognize the corresponding emotion.
Peter Robinson, professor of computer technology at the University of Cambridge, said: "The system can already cope with the variation in people's facial composition; for example, if you have a round or thin face, or if you wear glasses or have a beard."
So what are the applications of this technology? Just follow the money:
Robinson added: "Our research could enable websites to tailor advertising or products to your mood."
For example, he explained, software linked to a webcam could process a person's image, encode the correct emotional state and transmit the information to a website, which could then display products or advertising.
Robinson's website has more details about the system. The researchers used videotapes of actors demonstrating the basic emotional states to train the computer to recognize emotions. The computer is 90 percent accurate when recognizing actors, but only 65 percent accurate with untrained individuals. (Regular CogDaily readers may recall the difficulty I had convincing them that the emotion I was trying to convey was actually pride.)
A more practical benefit of the device could be for autism / Asperger's patients: they could use a small video camera attached to a headset, or possibly their eyeglasses, to help read the emotions of others and enhance their ability to interact with them.
Pardon me, but I fail to see what kind of concrete, measurable benefits working on facial expression would bring.
Sure, it might make it easier to leave a better impression on a client or an HR interviewer, but there are a good number of things I need to think about in those situations, so my facial expression is the least of my concerns. Besides, sales is not the place for most (all?) autistic persons, regardless of the amount of training given.
As for HR interviewers, they need to keep one thing in mind: the facial expression might not reflect the state of mind of the autistic person, because he's busy processing other details (the decoding of the interviewer's FFA, for example; see "Impaired Face Processing in Autism: Fact or Artifact?" by Jemel, Mottron, and Dawson in the January 2006 issue of the Journal of Autism and Developmental Disorders).
I agree there can be other touchy situations like these (interacting with a cop?), but it would be a LOT less costly and more efficient to teach those people to put aside any judgment of our faces than to try to teach us how to look.
I think what he's saying is that it could help people with autism and Asperger's to read other people's expressions, not for other people to read the expressions on the faces of people with autism.
I think it's a frightening prospect. I don't like it when I get coupons based on the things I get at the grocery store. Being targeted because of my apparent mood is just creepy.
OK, I stand corrected; there are definitely a few occasions where having such a video camera would have helped me (when I need to give presentations, among others).
Dave, please excuse me for jumping the gun.
No problem, Alain -- I've been known to do that myself on occasion!
Sanbreakity -- yes, I agree it's scary. The whole thing reeks of a bad Tom Cruise movie.
Here's my idea for "Homeland Security":
At various kiosks (e.g., ATMs), have emotion recognition software throw a few kinks into normal tasks. These unexpected kinks would be used to test a user's emotional response. After multiple tests, at multiple kiosks, dates, and times, Homeland Security will have conveniently and automatically developed personality profiles for millions of people, which can be cross-linked against their phone, finance, travel, and health records. This data will then be stored on government laptops for convenient theft and use on the black market.
But would those laptops be kept in D.C. or Los Alamos?