Facial sensations modulate speech perception

What sensory cues do we rely on during the perception of speech? Primarily, of course, speech perception involves auditory cues - we pay close attention to the sounds generated by the speaker. Less obviously, the brain also picks up subtle visual cues, such as the movements of the speaker's mouth and lips; the importance of these can be demonstrated by the McGurk effect, an auditory illusion in which the visual cues accompanying spoken words can alter one's perception of what is being said.

Integration of these auditory and visual cues is essential for speech perception. But according to a new study published in the Proceedings of the National Academy of Sciences, the process involves more than meets the eye and ear. The new research shows that the perception of speech sounds can be modified by stretching the skin on the face; it suggests that the brain integrates tactile sensations as well as audiovisual cues during speech perception, and leads to the surprising conclusion that the somatosensory system can influence the processing of speech sounds.

Lead researcher Takayuki Ito of Haskins Laboratories in New Haven, CT, and his colleagues recruited 75 native speakers of American English for the study. The participants were presented with a series of ten auditory stimuli consisting of a computer-generated continuum between the words head and had. In each trial, the words were presented one at a time in a random order, and the participants were asked to identify which word they heard. During the task, the researchers used a robotic device (below) to stretch the skin on the participants' faces. This was done in three directions (up, down and backwards), in such a way that the stretching resembled the facial skin movements involved in producing one of the stimulus words.


[Image: the robotic device used to stretch the skin on the participants' faces]


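To make the design concrete, here is a minimal sketch (in Python) of how a randomized trial list for this kind of experiment might be put together. The pairing of every continuum step with every stretch direction and the number of repetitions are my own illustrative assumptions, not the authors' actual protocol.

```python
import random
from itertools import product

continuum_steps = range(1, 11)                    # 1 = most "head"-like ... 10 = most "had"-like
stretch_conditions = ["up", "down", "backward"]   # directions of facial skin stretch

# Cross every continuum step with every stretch condition, repeat, and shuffle.
trials = [{"step": s, "stretch": c} for s, c in product(continuum_steps, stretch_conditions)]
trials *= 3           # repetitions per combination: an illustrative guess
random.shuffle(trials)

print(trials[:5])     # first few trials of the randomized presentation order
```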
It was found that perturbing the skin altered the perceptual boundary between the two words. Stretching the skin at the sides of the mouth upwards increased the probability that the presented word was perceived as head, while stretching it downwards made the stimuli sound more like had. The effect was greatest for the intermediate, ambiguous words in the continuum, and negligible for the words at each end, which could be easily identified as one or the other. It was also found to be time-dependent: speech perception was only altered when the duration of the skin stretch was comparable to that of the movements involved in producing the words.
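The "perceptual boundary" here is the point on the continuum at which listeners are equally likely to report head or had. As a rough illustration of how such a boundary, and its shift between stretch conditions, can be quantified, the short sketch below fits a logistic psychometric function to made-up response proportions; the numbers and the fitting approach are illustrative assumptions, not the authors' data or analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

def psychometric(x, boundary, slope):
    """Probability of reporting 'head' at continuum step x; boundary is the 50% point."""
    return 1.0 / (1.0 + np.exp(slope * (x - boundary)))

steps = np.arange(1, 11)   # 10-step head-had continuum

# Hypothetical proportions of "head" responses under two skin-stretch conditions
p_head_up   = np.array([0.98, 0.97, 0.95, 0.90, 0.80, 0.62, 0.40, 0.20, 0.08, 0.03])
p_head_down = np.array([0.97, 0.95, 0.90, 0.78, 0.60, 0.40, 0.22, 0.10, 0.05, 0.02])

popt_up, _   = curve_fit(psychometric, steps, p_head_up,   p0=[5.5, 1.0])
popt_down, _ = curve_fit(psychometric, steps, p_head_down, p0=[5.5, 1.0])

print(f"Boundary with upward stretch:   step {popt_up[0]:.2f}")
print(f"Boundary with downward stretch: step {popt_down[0]:.2f}")
print(f"Shift in category boundary:     {popt_up[0] - popt_down[0]:.2f} steps")
```

With these made-up numbers, the boundary under upward stretch sits higher on the continuum - that is, more of the ambiguous stimuli are heard as head - which is the direction of the effect the study reports.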

Earlier work has suggested that the perception and production of speech are closely linked and may have a common neural basis. For example, one study showed that watching speech movements and listening to speech sounds facilitated responses in the lip area of the motor cortex, while in another, repeated disruption of the premotor cortex with transcranial magnetic stimulation was found to affect performance on a speech perception task. The brain is known to contain pathways connecting the somatosensory and auditory regions, but the significance of these connections was unclear. And, despite evidence that the somatosensory system is involved in speech production, its role in speech perception had not been explored.

This new study shows that speech-like patterns of facial skin deformation affect the way in which people perceive speech sounds, and therefore points to a significant involvement of the somatosensory system in speech perception. It extends the earlier findings by providing evidence that speech perception is associated not only with the motor cortex, but is also influenced by the tactile sensations that would normally accompany speech production. The observed effect may therefore be a consequence of inputs from the face region of the somatosensory cortex to the face region of the motor and premotor cortices. This is in line with the motor theory of speech perception, which holds that we perceive speech by referring to "motor images" of how phonemes (individual speech sounds) are articulated.

The authors use this theory to explain their findings. They note several recent studies reporting that stretching the skin on the limbs in a way that corresponded to normal movements elicited sensations of movement in the participants, and suggest that something similar happened in their study. Thus, stretching the skin at the side of the mouth activates receptors beneath the skin surface which are sensitive to mechanical pressure; the information transmitted from these receptors to the somatosensory cortex modifies the brain's representation of the position of the lips, and also the motor image of how the presented word is articulated, thus altering how it is perceived.


Ito, T., et al. (2009). Somatosensory function in speech perception. Proc. Natl. Acad. Sci. USA. DOI: 10.1073/pnas.0810063106


I take it you saw this study as well? I like the motor theory hypothesis, as it gives a neat evolutionary pathway from gestural to spoken language (and explains why signing is such an intuitive language!)

Mirror neuron involvement?

"The importance of these can be demonstrated by the McGurk effect, an auditory illusion in which the visual cues accompanying spoken words can alter one's perception of what is being said."

I'm one of the rare individuals who doesn't experience the McGurk illusion. Are there any explanations for why some people might not experience it?

By MKandefer on 04 Feb 2009

MKandefer: I can't find or think of an explanation, but it's very rare for any such phenomenon to be experienced by everyone, so it might just be due to individual differences.