Face recognition is a task that humans perform with little effort, even though it is in fact a tremendously difficult problem. To recognize a face, we need to ignore traits that change over time while focusing on details that remain constant. A simple computer program, for example, would have difficulty recognizing that Jim frowning (before his fries arrive) is the same person as Jim smiling (after his fries arrive).
The fact that we have little difficulty recognizing our friends and family regardless of their facial expressions has led researchers to speculate that recognizing faces is accomplished by an entirely different system from recognizing facial expressions.
But in 1998, S. R. Schweinberger and G. R. Soukup found that we take longer to recognize emotions in different individuals than in the same person, yet we recognize the individuals themselves at the same speed regardless of facial expression. They argued that recognizing emotions must not be entirely separate from recognizing individuals. However, an alternate explanation is possible: perhaps faster tasks can interfere with slower tasks, but the reverse does not occur. Since classifying identity takes less time than recognizing emotion, this speed difference alone might account for the asymmetry.
A team led by Anthony Atkinson developed a set of experiments to test whether the interaction between identity processing and emotion processing is related to processing speed. They asked 32 college students to look at a series of photos of faces and classify either the emotion (fearful or happy) or the gender of each face. For the emotion group, the gender of the faces remained the same half the time and changed from photo to photo the other half. The procedure was reversed for the gender group. Here are the results:
The results matched those of Schweinberger and Soukup: when classifying emotions, reaction times were slower when the gender of the face in the photos changed, but changes in facial expression did not slow gender classification. Again, this could be explained by the fact that gender classification is faster than emotion classification.
To rule out this alternative explanation, Atkinson's team developed a second experiment in which the difficulty of the gender and emotion tasks was systematically varied. Faces from each extreme of sex or emotion were blended together at different levels: in one condition, for example, viewers saw faces that were 75 percent male and 25 percent female, or vice versa. Reaction times for this condition turned out to be the same as when emotions were blended at the 75-25 level. Here are the results for this group:
Now, as before, reaction times were slower during emotion classification when the gender of the faces changed, while reaction times actually decreased during gender classification when the emotion changed. Atkinson et al. thus offer convincing evidence that face recognition and emotion recognition rely on at least some of the same mental processes.
The implication, of course, is that face and emotion recognition are extremely complex processes — and indeed that the two processes are to some extent intertwined. While it’s not true that Jim is literally a different person before he gets his French fries, there’s no doubt that he sometimes appears that way to others!
Atkinson, A.P., Tipples, J., Burt, D.M., & Young, A.W. (2005). Asymmetric interference between sex and emotion in face perception. Perception & Psychophysics, 67(7), 1199-1213.