Other-race faces: Why do they seem different?

This is a guest post by Rivka Ihejirika, one of Greta's top student writers for Spring 2007.


Do you find it harder to recognize the face of someone from a race other than your own? Does it take you longer to recall the face of someone from an unfamiliar race? Some researchers believe that we are born with a predisposition to process faces of our own race better than faces of other races. Other researchers believe that the own-race face bias is not innate, and that we instead develop a preference for the races present in our immediate environment. People of all ages show this bias toward faces of their own race. Yair Bar-Haim and his colleagues wanted to find out how much of the own-race face bias is due to nature and how much is due to nurture.

Bar-Haim and colleagues studied the face processing of infants around 14 weeks of age. The infants came from three different racial groups. The first group of infants were Caucasian Israelis who had a predominantly Caucasian upbringing. The next group were African Ethiopians from a predominantly African environment. The last group were African Israelis who were surrounded by a mixture of races in an immigration absorption center. The infants were seated on their mothers' laps and shown eight face pairs, each consisting of an African and a Caucasian face. The photo above shows an example of the African and Caucasian faces used in the experiment. The experimenters recorded the location and duration of each infant's gaze on each picture pair. Here are the results:


How long a child looks at a face indicates the child's preference for that particular face. The Caucasian Israeli infants looked longer at the Caucasian faces than at the African ones. The African Ethiopian infants looked longer at the African faces than at the Caucasian faces. If the African Israeli infants had exhibited a preference for African faces, that would have pointed to a role for nature in the own-race face bias. However, the African Israeli infants showed no preference for either the African or the Caucasian faces. These data show that nurture plays a significant role in race-based face perception.

This study shows that our environment greatly influences our perceptions. Even infants at 3 months of age demonstrate signs of racial preference, but this preference is limited to the race that mainly surrounds them. Increasing cross-racial contact mitigates the effects of the bias. Is the own-race face bias a problem? Perhaps: the bias signals a lack of diversity in our surroundings. The influence of the own-race face phenomenon may carry over into our daily perception and can cause some racial prejudice beyond our direct control.

Bar-Haim, Y., Ziv, T., Lamy, D., & Hodes, R.M. (2006). Nature and nurture in own-race face processing. Psychological Science, 17, 159-163.


Hmmm. I agree there is a predilection, but based on my own experience, I would be surprised if it were ingrained, rather than learned.

In the early 80s I served as a Peace Corps volunteer in Kenya, in the hill country north of Nairobi. Africans are much blacker than African-Americans, and at first I had a good deal of difficulty recognizing my students and neighbors. But it was not long before it came very easily, and about three to four months later, when I went to Nairobi for a Peace Corps training conference, it was a shock to me to see so many white people, and I had difficulty recognizing all the white faces I saw.

"The longer a child looks at a face indicates the child's preference for that particular face."

This seems like a pretty big assumption about the psychology of a 14-week-old. It could be that the baby tries to categorize unfamiliar faces and looks as long as it needs to find features that allow it to distinguish a face from other faces in its memory. The more novel the features of a face, the less time this would take. This would fit the data without making appeals to the infants' unknown "preferences".

By Matt Zimmerman (not verified) on 25 Jun 2007 #permalink

Anyone who has seen Steve Martin's The Jerk wouldn't be surprised by these results. ;-)

I'm sure the owners of this blog can address your concern better than I can, but it's not the big assumption that you assume. There are myriad studies showing the correlation between "stare time" and "interest" in infants.

It is true that "stare time" does correspond with "interest" but I believe, as Matt noted, it also corresponds to novelty. Since these are alternative explanations for this particular result, it would be interesting to know what the author's thoughts were about this alternative explanation.

"Preference" and "interest" are probably terms too subjective to account for the "stare time" of 3-month-olds. "Familiarity" is probably a more objective word to account for the data--infants stare longer at faces familiar to them due to the magnitude of prior exposure to such faces.

"Novelty" doesn't seem to be an appropriate label to explain "stare time" either because it would seem that novel faces would be those unfamiliar to 3-month olds (i.e., faces having features not connected to those for which they have been previously exposed). If novelty were indeed the underlying factor, then infants should be staring longer at faces not connected to their own race; the results, however, show that infants stare longer at faces connected to their own race--primarily due to greater exposure to faces from their own race, which would seem to imply familiarity rather than preference, interest, or novelty.

By Tony Jeremiah (not verified) on 27 Jun 2007 #permalink

There's some circular reasoning going on here. You can't posit that "stare time" is a measure of, say, familiarity, and that therefore, because the infants were more familiar with this or that group they stared longer at in-group members. In order for this study to mean anything, you have to _first_ establish--unambiguously, rigorously--the significance of stare time. It's not clear to me how or whether the researchers determined that longer look = greater preference, nor what "preference" even means in this case.

FYI: Andrew Sullivan, in his very popular blog, links to this post with the lead: "Bringing up a racist...it may be impossible to avoid, according to a new study" Yow!

It might be impossible to establish the significance of stare time without taking the contextual parameters of a study into consideration, since stare time might imply different mental/psychological states, again, depending on the parameters of a particular study. For example, Judith Langlois has conducted research showing infants stare longer at faces rated attractive by adults relative to faces rated as less attractive. Preference (rather than novelty, interest, or familiarity) seems a defensible psychological construct that explains the longer stare time in this instance because it is compatible with a psychological state adults are likely to be in when looking at an attractive face. This is relative to something like explaining the stare time as a result of familiarity, because it does not make sense in any direct way, why an attractive face would be more familiar than a less attractive one.

Familiarity seems to be an appropriate psychological construct to explain stare time for the study presented here, because it would be compatible with what an adult would do if they were in a crowd filled with mostly strangers, but then you run across a "familiar" face (e.g., an acquaintance from work). One does not have to have a preference or interest in such a person, nor is the face a novel one. It would essentially be the equivalent of saying, "Hey, there's someone I know".

However, one possible way to remove the seemingly tautological nature of stare time might be to examine the brain activity of infants while they stare. For example, greater activation of the amygdala (associated with emotional processing) during longer stare times might correspond with "preference", while greater activation of regions involved in face recognition might correspond with "familiarity".

In either case, one would still have to take the parameters of the study into consideration.

By Tony Jeremiah (not verified) on 27 Jun 2007 #permalink

To quickly address a point raised in a previous post: attractive faces are in fact more likely to be mistaken for familiar faces, as research on the mere-exposure and related phenomena (in this case something of a reverse mere-exposure effect) demonstrates (see work by Monin, for instance).

That said, staring time may be an index of novelty or ease of categorization, even in infants. For example, if an infant can assign a cross-race face to a category based on skin color, this process can likely be done more quickly than individuating an unfamiliar, yet same-race, face. Work by Levin (1996; 2000) presents evidence of this kind of "feature selection" process in adults, such that white participants are quicker to categorize black faces as black than they are white faces as white. Also, Rodin (1987) has proposed a "cognitive disregard" model of face processing. Although again this model has been tested on adults, if infants are simply less interested in cross-race faces, they may indeed spend less time looking at them as a result of essentially disregarding the stimulus.

In the end of course, extending any of these findings to infants is likely a huge stretch.


Interesting tidbit about the attractiveness/familiarity relationship. The mere-exposure effect did cross my mind and makes me wonder whether the research cited here represents a special instance of a mere-exposure phenomenon, such that constant exposure to specific stimuli enhances likability of the stimulus. One primary difference, though, is that mere-exposure research usually involves repeated, subliminal exposure to stimuli.

I also find interesting the comment about feature selection and the faster categorization of other-race faces--that seems consistent with the in-group/out-group homogeneity effect, related to the stereotype that persons from other races all seem the same (both in personality and appearance). So the study could suggest that "shallow" processing of other-race faces might develop early. As a side note, details of the study cited here also indicate that the infants are not attending to color because they showed no difference in staring time when they attended to pink and brown ovals, so there's most likely facial feature processing happening.

To directly test for novelty, I think the study would work better using a habituation procedure, which explicitly tests how new a stimulus seems to a child. In this instance, rather than showing a bunch of different faces randomly, the study should use a procedure where they show the same face repeatedly until the child habituates (no longer looks at) that face. Then show a different (novel) same-race or other-race face. If they show longer staring time for another same-race face, that would seem consistent with the disregard hypothesis and would suggest preferential processing of same-race faces. I have not read the studies you cited, but I guess they must be along these lines.

By Tony Jeremiah (not verified) on 28 Jun 2007 #permalink

"The longer a child looks at a face indicates the child's preference for that particular face. "

That's a pretty outrageous statement for a researcher to make, and I'm impressed that most of the posters realized it and seemed to question this sort of assumption.

By roseindigo (not verified) on 28 Jun 2007 #permalink

"As a side note, details of the study cited here also indicate that the infants are not attending to color because they showed no difference in staring time when they attended to pink and brown ovals, so there's most likely facial feature processing happening."

An interesting follow-up experiment would be to digitally alter photographs of people of different races such that their skin color is the same hue. Hmm.

Doesn't stare time have to do with novelty? I heard about this study a little while ago and it still puzzles me.

Considering the children were sitting on their mothers' laps, could their mothers have subconsciously influenced their stare time by leaning in a certain direction?

I'm puzzled about why American researchers refer to 'race' instead of 'ethnicity'. To me, a New Zealander, ethnicity is the more appropriate term because it describes the phenotypic characteristics of an individual (or at times, a group); race almost seems to imply a separately identified genus.

I am not sure about the experimental results, but children stare both for novelty and for sameness. In the case of sameness, the critical factor must be the time gap between subsequent viewings.

The preference for faces of their own kind is probably not unique to infants. But infants do have that kind of bias. In India, where skin tone varies within communities and between states, children seem not only to prefer adults of their own skin tone but also to get scared of people with darker tones or beards, even without any prior experience of harm from such face types. Must be something internal.

You can run similar experiments on adults and probably reach the same conclusions. We had a project execution team of various European nationalities down here in India. Some of them faced this problem. Exasperated, I pointed out to a French guy one day that one colleague of mine was a bit darker than another, and that this would be an easier way to identify him. His reply was, "All the faces here look the same to me."

Considering that Indian faces show a whole lot more variation than, say, Caucasian faces, particularly in skin tone and shape, I was surprised. I thought the problem was peculiar to him, but when I went to an East Asian country, all the faces there looked the same to me. So apparently the effect is common--or were the two of us simply different, lacking a knack for faces?