There have been some interesting new developments in the study of the evolution of language. The idea that human language emerged from hand gestures rather than sounds has gained a lot of ground in recent years. Some scientists think that certain neurons in the brain played a crucial role in this gestural prehistory. Known as mirror neurons, they simulate the movement of other people's hands, among other things. In the October 14 issue of Science, a team of scientists showed that mirror neurons are even more sophisticated than previously thought. The researchers even speculate that some mirror neurons could have helped give rise to different components of sentences, such as subjects and verbs.
One reason that this gestural theory has gained ground is that it hasn't been clear that the sounds of apes refer to particular things. Random outbursts aren't a particularly good foundation for building up language. But don't rule out the sounds just yet. In the journal Current Biology, primatologists have shown that when chimpanzees make noises known as "rough grunts," different grunts signify different things.
We're still at the very early stages of this sort of research, and I wouldn't be surprised if future work shows that there's not an either/or choice when it comes to the precursor of full-blown language. I've written a piece about all this for a special section of Forbes.com about communication. It's great company to be in, with articles and interviews from the likes of Steven Pinker, James Surowiecki, and Arthur C. Clarke.
While I'm on the subject of the evolution of language, a non-biological step forward has occurred over at Language Log, a popular blog about linguistics. They've taken on my brother Ben Zimmer as a regular contributor. Here's his latest, on the wordplay in the new Stephen Colbert show on Comedy Central.
First, congrats on your Forbes article, and to your brother on his new venture.
As might be expected, I'm fascinated by the various theories of language evolution. One other thing to think about is that there appears to be a Broca's area correlate in the brains of some monkeys that "activates" when apes vocalize. My personal theory is that both gestures and vocalizations were involved in the birth of language. The REAL question, for me, is how language came to be located primarily in the left hemisphere, and how that fact may relate to handedness in hominids.
Since apes do have hand preferences, do they gesture with their "off" hand? Some folks believe that Homo sapiens used to be primarily left-handed, which would leave the right hand free to be used for gestures. If enough gesturing came to be associated with meaningful vocalizations, this COULD, POSSIBLY, be an explanation for the asymmetry.
Still, the grunts could denote the level of excitement rather than refer to the actual foodstuff, leading to the same chimp response, as the authors themselves point out.
If this link between hand gestures and verbal structure can be confirmed, what kind of implications does this have for evolution?
I agree with Carl's prediction that the precursors of human language are probably both vocal and gestural -- no need to dichotomize as some theorists do. One of the most interesting books on this question is still (after a decade) Armstrong, Stokoe and Wilcox's 'Gesture and the Nature of Language,' because it cogently explains how syntax may have evolved via gesture. I do think the subtlety and nuance of ape gestural interactions are still very much overlooked (The Dynamic Dance, Harvard University Press, 2004, is where I make this case).
Good article -- I agree entirely that this need not be an either/or question. In fact, I think that the key to understanding the evolution of language will be to resolve this excluded middle mentality and look at how both speech and 'gesture' are gestural -- one producing an acoustic signal, the other an optical signal. The neuroscience data clearly points in this direction. A recent report by Maurizio Gentilucci on how grasp observation influences speech production is one example of the deep neurological connections between hand gestures, visual perception, and speech.
Gestural phonologists obviously provide an important framework for conceptualizing speech as essentially complex motoric gestures. Cognitive linguistics is an important piece of the puzzle too, because it reveals the significance of visual perception and visual cognition in the grammar of all languages.
Two recent books explore gestural theories along these lines. David McNeill's "Gesture and Thought" (U. of Chicago Press, 2005) is the culmination of his decades of research on gesture and speech. In "Vision to Voice" (Oxford University Press, in press), David Armstrong and I present the case that visible gesture played a key role in the origin and evolution of the human capacity for language.
Five years ago I attended a talk by Sally Boysen (Ohio) in which she spoke about a study she was preparing to publish that was similar to the rough grunt study that recently came out of Scotland. Boysen said she had determined that the chimps in her lab could distinguish different classes of foods based on vocalizations of other chimps. I've looked for the published paper but have not seen it yet.
The theorisers of the first human "language" should listen to modern discourse among some Africans who use a series of clicks in their very old, retained languages. OEA (Our Early, Earlier, Earliest Ancestors) simply used the same manner (and later, form). Once the small evening fire was enjoyed, signs in the gloom gave way to more indicative phonemes, which HAD to have reciprocal meaning in order to maintain social cohesion, expand the imagination and tell lies.
But it was not "language"; it was communication very well suited to survival, and it expanded that greatest of listeners -- oneself.
If language evolves in the direction of ever increasing abstraction and independence from face-to-face communication (as happens with the development of writing, then print, etc.), it makes sense to think of early forms of language (which must be verbal, if it is ordinary human language we are referring to) as closely tied to specific and concrete situations and to face-to-face communication, in which gestures, sounds, and overall understanding of the situation at hand go together.
It's interesting to me that theorizing about the role of gestures in language development leads folks to apes instead of to Deaf communities and their sophisticated sign languages ... I know there has been neurological research at the Salk Institute demonstrating that the brain essentially "doesn't care" whether it gets verbal or visual language: "The capacity of brain systems to subserve language, regardless of modality, is a striking example of neuronal plasticity."