Earlier this month I wrote two posts about the evolution of the eye, a classic example of complexity in nature. (Parts one and two.) I'd like to write now about another case study in complexity that has fascinated me for some time, one that has sparked a debate that has been playing out for over fifteen years. The subject is language, and how it evolved.
In 1990, Steven Pinker (now at Harvard) and Paul Bloom (now at Yale) published a paper called "Natural Language and Natural Selection." They laid out a powerful argument for language as an adaptation produced by natural selection. In the 1980s some pretty prominent scientists, such as Stephen Jay Gould, had claimed the opposite--namely, that language was merely a side effect of other evolutionary forces, such as an increase in brain size. Pinker and Bloom argued that the features of language show that Gould must be wrong.
Instead, they maintained, language shows all the classic hallmarks of an adaptation produced by natural selection. Despite the superficial diversity of languages, they all share a basic underlying structure, first identified by Noam Chomsky of MIT in the 1960s. Babies have no trouble developing this structure, which is what you'd expect if it were an inborn capacity rather than a cultural artifact.
This faculty of language could not simply be a side effect of brain evolution, because it is so complex. Pinker and Bloom compared language to the eye. No physical process other than natural selection acting on genetic variation could have produced a set of parts that interact so closely to make vision possible. And you can recognize this adaptiveness by the eye's similarity--in some ways--to man-made cameras. Likewise, language is made up of syntax, the anatomy for producing complex speech, and many other features. Pinker and Bloom argued that natural selection favored the rise of language as a way for hominids to exchange information--whether that information was about how to dig up a tuber with a stick, or about how a neighboring band was planning a sneak attack. There was nothing unusual about the evolution of language in humans; the same biological concepts that explain the evolution of echolocation in bats could explain it.
Pinker and Bloom went on to publish a number of papers exploring this idea, as well as some popular books (The Language Instinct and How the Mind Works from Pinker, and Descartes' Baby from Bloom). But they by no means spoke for the entire community of linguists. And in 2002, one particularly important linguist weighed in: Noam Chomsky.
It was the first time Chomsky had tackled the evolution of language in a serious way, which is surprising when you consider how influential he had been on the likes of Pinker and Bloom. He had offered some vague musings in the past, but now he produced a long review in the journal Science, which he coauthored with two other scientists. One was Marc Hauser of Harvard, who has carried out a staggering amount of research on the mental life of primates, and the other was Tecumseh Fitch of the University of St Andrews in Scotland, who studies the production of sound by humans and other animals. (You can read more about Fitch's work in an essay I wrote for Natural History.)
The Hauser et al paper is not an easy read, but it has its rewards. The researchers argue that the only way to answer the question of how language emerged is to consider the parts that make it up. They see it as consisting of three systems. Our ability to perceive the sounds of speech and to produce speech ourselves is one (the input-output hardware, as it were). Another is a system for understanding concepts. And the final ingredient of language is the computation that allows the brain to map sounds to concepts.
Hauser et al see three possible explanations for how this three-part system evolved. One possibility is that all three parts had already evolved before our ancestors diverged from other apes. They introduce this hypothesis and then immediately abandon it like a junked car. The second possibility they introduce could be called the uniquely-human hypothesis: the language faculty, including all its components, has undergone intense natural selection in the human lineage. Pinker and Bloom's argument fits this description. The final hypothesis Hauser et al consider is that almost everything essential to human language can also be found in other animals. Perhaps only a minor addition to the mental toolkit was all that was necessary to produce full-blown language.
The authors point out that a lot of the data that would let them choose between the three hypotheses have yet to be gathered. Nevertheless, they devote most of their attention to the almost-everything hypothesis, and it's clearly the one they favor.
They argue that studies on animals already show that they have a lot of the ingredients required for language. Monkeys, for example, can comprehend some surprisingly abstract concepts, such as number and color. As for the input-output hardware for human language, it's not all that special either. Monkeys are so good at recognizing human speech sounds that they can tell the difference between two sentences spoken in different languages. And as for speech production, the researchers argue that the essential anatomy is not unique to humans, either.
Humans, for example, depend on a low larynx to give them the range of sounds necessary for speech. But did the larynx drop down as an adaptation for speech? In an earlier paper, Tecumseh Fitch showed that other species have lowered larynxes, including deer. What purpose does it serve for these nonhumans? Fitch suggests that it began as a way to deceive other animals, by making an individual sound larger than it really is. Human ancestors might have evolved a lower larynx for this function, and only later did this anatomy get co-opted for producing speech.
Hauser et al make a bold suggestion: perhaps only one thing makes human language unique. They call this special ingredient recursion. Roughly speaking, it's a process by which small units--such as words--can be combined into larger units--such as clauses--which can be combined into larger units still--sentences. Because units can be arranged in an infinite number of ways, they can form an infinite number of larger units. But because this construction follows certain rules, the larger units can be easily understood. With recursion, it's possible to organize simple concepts into much more complex ones, which can then be expressed with the speech-producing machinery of the mouth and throat.
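To make the idea concrete, here is a minimal sketch of recursion at work--my own toy illustration, not anything from the Hauser et al paper--in which a tiny grammar keeps re-using its own rules to build bigger and bigger phrases:

```python
import random

# A toy grammar: a noun phrase can contain a clause, and a clause contains a
# sentence, so the rules can apply to their own output--that is, recursively.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the dog"], ["the cat"], ["NP", "that", "S"]],    # recursive rule
    "VP": [["barked"], ["saw", "NP"], ["said", "that", "S"]], # recursive rule
}

def expand(symbol, depth=0, max_depth=4):
    """Recursively expand a grammar symbol into a string of words."""
    if symbol not in GRAMMAR:
        return symbol  # a plain word: nothing left to expand
    options = GRAMMAR[symbol]
    if depth >= max_depth:
        # Past the depth limit, prefer non-recursive rules so the sentence ends.
        options = [o for o in options if all(s not in GRAMMAR for s in o)] or options
    choice = random.choice(options)
    return " ".join(expand(s, depth + 1, max_depth) for s in choice)

print(expand("S"))  # e.g. "the cat said that the dog saw the cat"
```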
According to the almost-everything hypothesis, the components of language may not have all gradually evolved together as an adaptation. Instead, much of the system was already in place when recursion evolved. It's even possible, they suggest, that recursion didn't evolve as part of language at all, but for another function, such as navigation. By happenstance, it also fit together with the other elements of language and voila, we speak.
The Hauser et al paper got a lot of attention when it first came out, such as this long article in the New York Times. Steven Pinker offered a few cryptic comments about how Chomsky's huge reputation didn't leave much room for those who accepted some of his ideas but dismissed others.
But he would not be content with a couple of bite-size quotes. Working with Ray Jackendoff of Brandeis University, he began work on a long reply. It has only now appeared, over two years later, in the March issue of Cognition. (But you can grab it here, on Pinker's web site.) This 36-page retort is remarkable for the sustained force with which it blasts Hauser et al. It's not just a regurgitation of 15-year-old ideas; Pinker and Jackendoff marshal a lot of evidence that has only been gathered recently.
While Hauser et al may claim that speech perception is not all that special, Pinker and Jackendoff beg to differ. They point out that we use different brain circuits to perceive speech sounds and nonspeech, and that certain kinds of brain damage can cause "word deafness," which robs people of the ability to perceive speech but not other sounds. Babies also prefer speech to non-speech at an early age, and when they show this preference, language-related parts of their brain become active.
What about speech production? Again, Pinker and Jackendoff argue that humans show signs of adaptation specifically for producing speech. Humans learn to speak by imitation, and are astonishingly good at it. But humans are not good at imitating just any sound. A parrot, on the other hand, can do just as good a job at imitating a slamming door as at saying Polly. As for Fitch's ideas about the lowering of the larynx, Pinker and Jackendoff don't think they go against their hypothesis even if they are right. Even if the larynx had an earlier function, that doesn't mean natural selection couldn't have acted on it in the human lineage. Bird wings got their start as forelimbs that reptiles used for walking on the ground, but those limbs obviously underwent intense natural selection for flight.
Pinker and Jackendoff then explore some other aspects of language that Hauser et al didn't address at all. The first is the fact that language is built from a limited set of sounds, or phonemes. Phonemes are crucial to the infinite capacity of language, because they can be combined in so many ways. But they also require us to understand rules about how to pronounce them. Pinker and Jackendoff illustrate this with the phoneme -ed: in the words walked, jogged, and patted it has the same meaning but has three different pronunciations. As far as Pinker and Jackendoff can tell, primates have no capacity that can be compared to our ability to use phonemes. As for why phonemes might have evolved in the ancestors of humans, they point to some fascinating models produced by Martin Nowak of Harvard. Nowak argues that natural selection would favor just a few phonemes because they would be easy to distinguish from one another. Human language lets us say thousands of words without having to understand thousands of individual speech sounds.
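The combinatorial arithmetic behind that point is easy to sketch. This is just my own back-of-the-envelope illustration of why a small sound inventory goes a long way, not Nowak's actual model:

```python
def possible_words(num_phonemes: int, max_length: int) -> int:
    """Count every phoneme string of length 1 through max_length."""
    return sum(num_phonemes ** length for length in range(1, max_length + 1))

# Even a modest inventory yields far more strings than any vocabulary needs.
for p in (10, 20, 40):
    print(p, "phonemes, words up to 6 sounds long:", possible_words(p, 6))
# 10 phonemes ->     1,111,110 possible strings
# 20 phonemes ->    67,368,420
# 40 phonemes -> 4,201,025,640
```

A real model also has to weigh this explosion of possible words against how easily similar sounds get confused, which is where Nowak's argument for keeping the inventory small comes in.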
Research on language genes is also consistent with the uniquely-human hypothesis, according to Pinker and Jackendoff. A gene called FOXP2, for example, is essential for language, and mutations to it cause difficulties across the board, from articulating words to comprehending grammar. What's more, comparisons of the human FOXP2 gene with its counterparts in other animals show that it has been the target of strong natural selection, perhaps as recently as 100,000 years ago. If the only new feature of language to evolve in humans was recursion, then you would not expect FOXP2 mutations to do anything except interfere with recursion. They also point out that broad comparisons of the genes in humans, chimps, and mice suggest that some genes involved in hearing may have undergone intense natural selection in our lineage. It's possible that these genes are involved in speech perception.
Pinker and Jackendoff even take issue with the one part of language that Hauser et al granted as being unique to humans: recursion. Recursion, they note, is just a basic logical operation, which you can find not just in human language but in computer programs and mathematical notation. What all humans have, though, is a capacity for one special sort of recursion: the syntax of human language. Pinker and Jackendoff declare that the case for the almost-everything hypothesis is "extremely weak."
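To see what they mean by recursion turning up in ordinary computer programs, here is the standard textbook example (my illustration, not one from their paper): a function defined in terms of itself.

```python
def factorial(n: int) -> int:
    """n! defined recursively: the function calls itself on a smaller input."""
    if n <= 1:
        return 1  # base case: the recursion stops here
    return n * factorial(n - 1)

print(factorial(5))  # 120
```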
At this point, I might have expected their rebuttal to come to a close. But instead, it takes a sudden turn. Pinker and Jackendoff find it puzzling that Chomsky would offer the almost-everything hypothesis when the facts go against it and when Chomsky himself had laid the groundwork for the uniquely-human hypothesis. For an answer, they burrow into Chomsky's head. They offer a survey of Chomsky's last decade of research, which has been dedicated to finding the indispensable core of language. As Pinker and Jackendoff describe it, Chomsky's search has led him to a single operation that combines items, which he calls "Merge."
I won't go into all the details of their critique here, but the upshot is that Pinker and Jackendoff aren't buying it. By reducing the essence of language to repeated rounds of Merge, Chomsky has to push aside all the things about language that linguists have spent decades trying to figure out, such as phonemes and the order of words in sentences. The reason they bring up Chomsky's recent work (which Chomsky calls the Minimalist Program) is that they think it is the source of his views on the evolution of language. In this view, our pre-language ancestors may have simply been missing one thing: the Merge operation.
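For a sense of how spare that operation is, here is a toy sketch of Merge--my own caricature, not anything from the Minimalist Program literature. It just takes two objects and packages them into one, and because the output can be merged again, repeated rounds build nested structure of unbounded depth:

```python
def merge(a, b):
    """Combine two syntactic objects into a single new one."""
    return (a, b)

# Build "the dog saw the cat" by merging words into phrases,
# then merging the phrases into a sentence.
np1 = merge("the", "dog")   # ('the', 'dog')
np2 = merge("the", "cat")   # ('the', 'cat')
vp  = merge("saw", np2)     # ('saw', ('the', 'cat'))
s   = merge(np1, vp)        # (('the', 'dog'), ('saw', ('the', 'cat')))
print(s)
```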
Pinker and Jackendoff are appalled by this. In fact, they hint that some of Chomsky's ideas about language have a creationist ring to them. Chomsky has said in the past that in order for language to be useful at all, it has to be practically perfect. How then, he wonders, could it have evolved from simpler precursors? Chomsky even likens language to a bird's wing, writing that "a rudimentary wing, for example, is not 'useful' for motion but is more of an impediment. Why then should the organ develop in the early stages of evolution?"
"What good is half a wing?" is a question often asked by those who reject evolution. But it is a question with several possible answers, which scientists are currently investigating. Feathers may have evolved initially as insulation. Even stumpy wings could have helped feathered dinosaurs race up into trees, as it helps birds today. Likewise, Pinker and Jackendoff argue that language evolved gradually to its most elaborate form. In fact, imperfect languages still exist. Pidgins allow people to communicate but lack fixed word order, case, or subordinate clauses. Pinker and Jackendoff argue that modern language may have emerged from an ancient pidgin through evolutionary fine-tuning.
In sum, Pinker and Jackendoff conclude, their ideas about the origin of language fit with the evidence from both linguistics and biology, and those offered by Chomsky, Fitch, and Hauser don't.
Now what? Do we have to wait another two years to see whether Chomsky, Fitch, and Hauser crumble under this attack or have something to say in response?
As I'll explain in my next post, the answer, fortunately, is no.
Good stuff. Eagerly awaiting Gab II.
P.S. The working link to the Pinker & Jackendoff paper.
DNA-based adaptation occurs over generations of individuals. Language as the vector of meme transmission accelerates adaptation and, in fact, will obviate DNA-based adaptation. Once these memes are used to control DNA through genetic engineering (genetic CAD?), then a heuristic loop will have been closed that will allow genetic hyperadaptation and preadaptation to novel xenoenvironments. (Not to mention that memes will allow us to preadapt the environment to the DNA (e.g., terraforming), instead of vice versa.)
I don't really know what I'm talking about, but the argument that "recursion is just a basic logical operation, which you can find not just in human language but in computer programs and mathematical notation" doesn't make any sense to me (especially since these are things based on human logic). Now, arguing that monkeys were good at recursion would be one thing, but computers . . . ?
If a single genetic change could have allowed for registering the universality of the universal, recursiveness could have built up after that. One-word utterances, at the beginning, would have been useful, if they had been words with universal import, and not just names with associational learning appendages. Chimps can learn many names, and successfully associate them with their objects. They never generalize this to the point of conceptualization, because they lack the genetically coded capacity to proceed with the universality of the universal (concept). There is a leap from associational learning to the conceptual, and no indication of intermediate stages between the two. Modern science, or biology, unfortunately has a metaphysics to which it is absolutely committed: that associational and conceptual learning are not really different. The reason for this commitment, which conflates the human and the subhuman, might be that it allows public institutions to make propaganda for treating people like animals.
I think most people would agree that, compared to chimps, humans have a very advanced ability to concatenate and associate words. Humans also have a qualitatively greater ability for conceptualization.
However, the blogosphere, and in fact posts within this very blog, serve as delightful demonstrations of the human potential for associating words without necessarily producing clear and complete conceptualizations.
Computers are not just "good" at recursion; that's what they're best at. It's pretty much the basis of the usefulness of computers to begin with!
A direct link to the New York Times article is this.
The notion that language would have evolved because its capacity to exchange information confers enormous adaptive advantages is a bit too pat, even--dare I say it?--scientistic. There is a cognitivistic bias that runs through much of linguistics itself, as if thought and cognition must be separately and internally constituted and language is just their external expression. (Granted, the relation between thought and symbolic language is a complicated chicken-and-egg conundrum.)

But an alternative hypothesis or conjecture is possible. Humans did not develop language and thereby become sociable. Language emerged, rather, on the basis of a long evolution of biological sociability carried by analog-relational systems of communication that grew increasingly complex and intensive. The digital aspects of language were built over the analog-relational aspects, perhaps as a "solution" to a crisis of over-complexity that threatened breakdown--indeed, as that very breakdown--and the digital components were perhaps not organized all at once, but rather developed through a rather lengthy process. In other words, the primary function of language is not to transmit information, but rather to establish and maintain relationships, and the enormous adaptive advantages conferred by the communicative exchange of information were, relatively speaking, a by-product. (A glance at speech act theory, with its emphasis on the modal-relational and contextual aspects of language use, rather than linguistics proper, which tends to emphasize the digital components as distinctive, would be to the point here.) After all, while the adaptive effects of the emergence of fully linguistic communication must have been enormous, they would also have brought with them whole new levels of complexity and (mal)adaptive problems, such as self-consciousness, the awareness of death, the intentional relation to otherness, paranoid anxiety, etc.; in other words, the problem of binding or anchoring identities in a context of social integration, that is, the coordination of social action.

But, above all, the fallacy that language is one "thing", together with the correlated notion that its "cause" could be traced to a "blueprint", should be avoided. The variety of quite distinct things at different levels that feed into our language capacity would suggest a complex co-evolution of different features, with diverse selection pressures that were perhaps somewhat contradictory in their adaptive implications.
It seems to me that writing developed very recently, within only the last few thousand years. Is writing part of, or independent of, the language ability discussed here? Tracing the development and evolution of writing may yield more definitive conclusions.
Another linguist's comment on the recursion hypothesis is here.
Sorry, wrong link -- I meant this.
Prokaryotes are the most successful organisms on the planet: there are more of them, in mass and in number, than any other organisms, and they are the sole inhabitants of many extreme environments (hot, cold, high and low pressure, etc.).
So why would they get together to form Eukaryotes? And why would Eukaryotes form multi-cellular organisms?
There is probably no answer that covers all the coalitions and symbiotic groupings that exist on the planet today (not all are advantageous to survival, as the better survivability of the independent organisms attests), but it is certainly a common thread and a common trend in the evolution of life.
So why would it suddenly stop at independent organisms such as humans? Certainly any further form of coalition is as unique in form as the Eukaryote is compared to the Prokaryote, or the multicelled to the single-celled. But when we ask about the utility of language and of the coalition so formed, we may see the same structure-forming imperative applied once more.
Humans are quite capable of surviving without language, and numbers did not immediately swell with the success of verbose tribes. But individuals could share parts of their mind that were formerly uniquely theirs and entirely personal.
Experience, such as what one has seen over the horizon, or one's encounter with a particular animal or another tribe, can be shared. One's technique for making tools, their usage, the reason for keeping and not discarding some items ("in my experience, it will be useful later on")--all allow single minds to coalesce and become what we call culture, a culture far beyond anything seen in other animals.
The energy and time saving is enormous: not every individual must learn everything by personal experience, invent every tool afresh, or become familiar with every individual personally, and advantageous habits do not have to wait for serendipitous mutations of genetic material to be included in the behaviour pattern of humans.
Non-language human precursors can do everything we can, except form the intellectual corpus that lies at the centre of ever-growing human coalitions. Without language, a tribe of a few individuals, no more than would be found among chimps, would be the limit.
Language does in nature what it does for nature: independent calls are gathered together to form cognitive objects of a greater scale than any single word can possibly achieve; independent individuals gather together their thoughts and form a corpus of knowledge that no single individual could possibly think by personal experience alone. The syntax of human coalition is the moral and behavioural protocol that allows the meshing of independent wills for any common purpose.
Kind regards,
Robert Karl Stonjek
There's a confusion in the above summary between phonemes (sounds, often spelled in English with a single letter like 's' or 'd'), and morphemes. The latter include roots (like 'cat') and affixes (like the '-s' in 'cats' or the different affix '-s' in 'walks'). Morphemes are usually composed of one or more phonemes (although rarely, affixes may consist of something less, like a modification of a sound on the word they attach to). And a single morpheme may be pronounced in different ways.
The particular example in the text above, the suffix -ed (mislabeled there as a phoneme), is composed of a single phoneme in words like 'walked' (where the suffix is a sort of 't') or 'banged' (where the suffix is a 'd'), or two phonemes in a word like 'waited' (where it is a schwa vowel plus a 'd').