There's been a whole lot of hype around the Hawks et al. paper describing a recent burst of adaptive evolution in the human genome. The problem is a lot of people are conflating accelerated adaptive evolution with accelerated evolution. Take this for example:
12/11: Accelerated Human Evolution
In recent years, humans have evolved at a much higher rate than previously thought, according to a new paper in PNAS. By analyzing genome variations, researchers found that the rate of human evolution was fairly stagnant until about 50,000 years ago, and then -- because of larger populations, climate changes, and disease -- accelerated until at least 10,000 years ago. It probably continues at a high rate today.
That blurb was taken from the ScienceBlogs buzz from December 11. I'm not picking on the ScienceBlogs editors, but I just happened to have that quote handy. I'm sure you can find others floating around the internet. The point is, there's a fair bit of confusion regarding adaptive evolution. This is nothing new, but it warrants a blog post explaining why accelerated adaptive evolution does not mean faster evolution.
Imagine a new mutation arising in a population. That mutation, in very simple terms, will be either deleterious, neutral, or beneficial with regard to its fitness effect. If it's deleterious, it will very likely be lost within a few generations -- although some deleterious mutations do overcome the odds (that's why they're odds, not fates) and increase in frequency over many generations. If it's neutral, the mutation has a low probability of eventually fixing (i.e., being found in every individual after many generations). The probability of fixation of a neutral mutation is inversely proportional to the population size; the average time to fixation, in generations, is proportional to the population size -- based on a very simple model of binomial sampling. And, finally, you have beneficial mutations, which have a probability of fixation proportional to the fitness benefit they confer.
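To make the binomial-sampling picture concrete, here's a minimal Wright-Fisher sketch -- my own toy code, not anything from the paper -- that follows a single new mutation until it's lost or fixed. The population size, selection coefficient, and replicate count are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(42)

def fixes(N, s=0.0):
    """Follow one new mutation (1 copy out of 2N) until loss or fixation."""
    two_n = 2 * N
    count = 1
    while 0 < count < two_n:
        p = count / two_n
        # Selection (if any) tilts the sampling probability toward the allele.
        p = p * (1 + s) / (p * (1 + s) + (1 - p))
        # Binomial sampling: draw the 2N gene copies of the next generation.
        count = rng.binomial(two_n, p)
    return count == two_n

N, reps = 100, 20_000
neutral = sum(fixes(N) for _ in range(reps)) / reps
print(f"neutral fixation prob: {neutral:.4f} (expect 1/(2N) = {1 / (2 * N):.4f})")
selected = sum(fixes(N, s=0.05) for _ in range(reps)) / reps
print(f"s = 0.05 fixation prob: {selected:.4f} (expect roughly 2s = 0.1)")
```

Running the same loop with s > 0 also illustrates the classic result that a new beneficial mutation fixes with probability of roughly 2s once the population is reasonably large.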
If you add up all three classes of mutations -- deleterious, neutral, and beneficial -- and figure out how many have fixed over the time scale you're looking at, you get the amount of evolutionary change along the lineage in question. So, to say that there was increased evolution along the human lineage in recent history implies that there was an increase in the total number of genetic changes. However, an increase in the amount of adaptive evolution (or an increase in the number of mutations fixed by positive selection) means only that there was an increase in the number of beneficial changes along the human lineage in recent history.
Why does this matter? Well, it's a pretty well-accepted fact (thanks to work by Kimura, Jukes, King, etc.) that evolution at the DNA sequence level is dominated by neutral changes. Therefore, accelerated evolution would mean that the neutral rate of evolution increased. A change in the rate of neutral fixations would, most likely, require a change in the mutation rate. There are some other scenarios that could explain a change in the rate of neutral evolution, but those would require me to delve into the nearly neutral theory, which is beyond the scope of this discussion (see here and here).
How did they infer adaptive evolution? They didn't count up changes and assign them to each class, if that's what you're wondering. Instead, they used some common statistical techniques developed over the past couple of decades to look for signatures of positive selection in publicly available data on human genetic variation. The data set, known as HapMap, contains the genotypes of hundreds of individuals at thousands of sites in the genome that are known to be variable in human populations. Basically, which allele (or version of a site) you carry at a particular site is expected to be correlated with the alleles found at nearby sites -- i.e., if I have an A at site 1000, you can predict I'll probably have a G at site 1050 -- based on data gathered from many individuals. This is known as linkage disequilibrium (the alleles at nearby sites are non-randomly associated).
There is a baseline level of linkage disequilibrium (LD) in any genome, and an excess of LD in a particular region needs to be explained by invoking some evolutionary force. One possibility is that recombination, the process that shuffles alleles and breaks down these associations, is low in that particular region. Another scenario is that natural selection has recently driven a beneficial mutation to (or near) fixation in that region, dragging the closely linked sites along with it. That leaves a large block of highly correlated alleles and, therefore, high LD in that region. The authors used this approach to identify regions of the genome with signatures of recent positive selection (i.e., LD well above the baseline level). They found many more recent events than older events, as if there were a burst of adaptive evolution in recent history.
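For the curious, here's what "correlated alleles" looks like as a calculation. This is my own toy illustration of the standard pairwise LD statistic, r^2, computed from phased haplotypes coded as 0/1 alleles; the authors' actual LD-based test is more involved than this.

```python
import numpy as np

def r_squared(a, b):
    """Pairwise r^2 between two biallelic sites, given 0/1 haplotype vectors."""
    pa, pb = a.mean(), b.mean()  # allele frequencies at each site
    pab = np.mean(a * b)         # frequency of the (1, 1) haplotype
    d = pab - pa * pb            # D, the classic LD coefficient
    return d**2 / (pa * (1 - pa) * pb * (1 - pb))

# Toy phased haplotypes (rows) at three sites (columns): the first two sites
# travel together; the third varies independently of them.
haps = np.array([[1, 1, 0],
                 [1, 1, 1],
                 [0, 0, 0],
                 [0, 0, 1],
                 [1, 1, 0],
                 [0, 0, 1]])

print(r_squared(haps[:, 0], haps[:, 1]))  # 1.0   -> perfect LD
print(r_squared(haps[:, 0], haps[:, 2]))  # ~0.11 -> little LD
```

Perfect or near-perfect LD stretching over an unusually long region is the kind of pattern a recent selective sweep leaves behind.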
The criticisms of the paper center on the power to detect adaptive evolution using LD. Some people have argued that this approach is much better at detecting recent events, leading to a significant bias in the number of recent events reported. That's because LD decays over time, causing older events to be masked. Additionally, some have argued that the authors did not adequately consider the possibility that the apparent burst of recent adaptive evolution was due to chance. There's a lot of stochasticity in evolutionary systems (i.e., a lot of background noise, or "this ain't physics, bitches"), and signatures of positive selection can often be explained by purely neutral processes. To test this hypothesis, it's standard practice to run simulations using realistic parameters (mostly population size and changes therein) to determine an expected distribution of your result (or test statistic) under non-adaptive scenarios. You'd be amazed at the range of oddities possible under neutrality (population geneticists usually test for these oddities using coalescent simulations). Anyway, some people think the authors failed to adequately consider the possibility that their result is not just possible but reasonably likely (i.e., p > 0.05) under a purely neutral model, without invoking adaptive evolution.
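Here's the flavor of that logic in toy form (my own construction; real analyses use coalescent machinery and realistic demographic models): simulate a summary statistic under pure neutral drift many times, then ask how often drift alone produces something as extreme as the observed value. The "observed" value and all parameters below are made up.

```python
import numpy as np

rng = np.random.default_rng(1)

def final_freq_under_drift(N=500, p0=0.2, generations=200):
    """Allele frequency after some generations of pure binomial drift."""
    two_n, p = 2 * N, p0
    for _ in range(generations):
        p = rng.binomial(two_n, p) / two_n
    return p

# Null distribution of the "statistic" (final frequency) under neutrality.
null = np.array([final_freq_under_drift() for _ in range(5_000)])

observed = 0.5  # a hypothetical observed frequency shift (made-up number)
p_value = np.mean(null >= observed)
print(f"P(drift alone reaches {observed} or more) = {p_value:.3f}")
```

If that empirical p-value is large, you can't rule out plain drift; the complaint is that the paper's null model wasn't explored thoroughly enough.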
Why would there be an increase in adaptive evolution? This has been the point that John Hawks has been hammering most on his blog (if you haven't been reading his updates, you should). Here is the theory, in a nutshell. The larger a population, the more mutations you'll find in that population (the per-individual, per-generation mutation rate won't change, but you have more individuals, so you have more total mutations). The more mutations, the more beneficial mutations (once again, not per individual, but in total). The more beneficial mutations, the more likely it is that, within that population, a beneficial mutation will arise that offers an adaptive solution to something the population encounters in its environment. Therefore, larger populations will have more beneficial mutations upon which natural selection can act, and more adaptive evolution. Furthermore, natural selection is more efficient in large populations because the stochastic factors leading to the random loss of new mutations (whether they're beneficial or neutral) are counterbalanced by more alleles being visible to natural selection (this is, once again, the realm of the nearly neutral theory, which I'm avoiding in this post).
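The arithmetic behind this argument is simple enough to sketch. The numbers below are invented for illustration, not estimates from the paper: with a beneficial mutation rate u_b per genome copy, about 2N*u_b beneficial mutations arise per generation, and each fixes with probability of roughly 2s, so the adaptive substitution rate scales directly with N.

```python
# Illustrative numbers only (not from the paper): the adaptive substitution
# rate, roughly 2N * u_b * 2s, grows linearly with population size N.
u_b = 1e-9  # assumed beneficial mutation rate per genome copy per generation
s = 0.01    # assumed selective advantage of each beneficial mutation

for N in (10_000, 100_000, 1_000_000):
    rate = 2 * N * u_b * 2 * s  # mutations arising * probability each fixes
    print(f"N = {N:>9,}: ~{rate:.1e} adaptive substitutions per generation")
```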
Why won't the absolute rate of evolution change? As I mentioned before, evolution at the DNA sequence level is dominated by neutral changes, and the rate of neutral evolution (and the number of neutral changes) is independent of population size. That's because the number of neutral mutations arising each generation is equal to the neutral mutation rate (u) multiplied by the number of haploid genomes in the population (2N, or 2 times the number of individuals, because each individual has two copies of the genome, one from mom and one from dad). The probability that any single neutral mutation fixes is equal to its frequency, which is 1/(2N) for any new mutation (it's found in a single genome out of the 2N genomes in the population). If, in each generation, you have 2Nu neutral mutations and each one fixes with probability 1/(2N), then the rate of fixation of neutral mutations is just 2Nu * 1/(2N) = u.
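That cancellation is worth seeing explicitly. Here's the algebra from the paragraph above in code form, with u set to a placeholder value:

```python
# 2N*u new neutral mutations per generation, each with fixation probability
# 1/(2N), so the 2N's cancel and the neutral substitution rate is just u.
u = 1e-8  # placeholder neutral mutation rate per site per generation

for N in (1_000, 100_000, 10_000_000):
    new_mutations_per_gen = 2 * N * u
    p_fix = 1 / (2 * N)
    rate = new_mutations_per_gen * p_fix
    print(f"N = {N:>10,}: substitution rate = {rate:.1e} (= u)")
```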
So, Hawks and colleagues found more signatures of recent adaptive events than of older ones. They did not, contrary to some reports, find that human evolution as a whole has accelerated in recent history. I'm not saying that some aspects of human evolution have not accelerated (perhaps phenotypic changes have). But it's misleading to say that "evolution" accelerated, because that implies that all aspects of evolution have accelerated. Hawks et al. were working with DNA sequence data, so they could only say whether certain aspects of molecular evolution were accelerated. And even there, only adaptive evolution experienced a recent speed-up, not all of molecular evolution.
This is a very interesting and thoughtful perspective. One problem, in my view, is that phenotypic evolution and genetic evolution may operate at seemingly almost independent rates, with changes in rates of genetic evolution most likely behaving as you describe, but phenotypic evolution perhaps working more in fits and starts. I think there is evidence for more rapid phenotypic evolution in connection with agriculture, as there is probably also evidence for periods of more rapid phenotypic change at other points in human history (such as with the origin of cooking, possibly about 2 million years ago).
Thank you for working to make this clear. I've seen enormous confusion about this. I've tried to explain it as being like the way a science (or any other language) accumulates new words as new concepts are discovered. For any given problem (in plants, any particular year or growing season), a lot of the genes just get copied and carried along but aren't the ones that make the difference between making new seed or not that year. But when an extreme event occurs, it's the presence of some of those that matters. Nature is a tinkerer and doesn't throw away anything easily. Nope, might need that someday again. Like a writer with a vocabulary or a science with a dictionary.
Even if the authors are correct that adaptive genetic evolution has accelerated, clearly phenotypic evolution has not accelerated as well. That is, most of these selected alleles must be of very little phenotypic effect indeed. For example, consider human-lineage skulls over the past 2 million years. Does this sequence indicate accelerated morphological evolution? Of course not. (I'm not impressed by "accelerated" changes in a handful of cranial characters.) One could argue that the accelerated adaptive evolution is in physiology rather than morphology. Maybe. But as I said, these are probably mostly small-effect variants. With increased population size and diversified environments, there could be a lot of superficial selective churning in small-effect physiological variants. (Sure, some of these, like HLA, are large-effect.) But these are probably not the kind of genetic changes that lead to speciation or radical anagenetic change. New superficially variant "subspecies" maybe -- ones that readily hybridize. And that's what we've seen in the last 500 years.
What are the most salient differences between "races"? Skin color and facial bone structure, both neural crest derivatives. The neural crest is known for having a high degree of developmental freedom (Gerhart and Kirschner).
It's a nice little paper in its own right; however:
Accelerated morphological evolution? No, except in a few characters.
Speciation trend? Not at all.
Overhype? I think so.
Obviously, there is an upper bound on adaptive phenotypic evolution (beyond tiny-effect variants that don't add up to much) because there is an upper bound on beneficial mutations of moderate or large effect, and that bound is far lower than the background mutation rate. So at some point, the faster the rate of adaptive genetic evolution, the smaller the mean effect of those allelic variants.
Adaptive mutations increase in frequency precisely because of one phenotypic effect: an increase in fitness. At least some of the other phenotypic effects of an adaptive mutation must be greater than the effect on fitness, measured in standard deviations, because the correlation between these other effects and fitness cannot be greater than unity. Therefore, we must be considering large phenotypic effects of some kind.
It is quite correct to say that some phenotypic changes are self-limiting in the long run because of stabilizing selection. But there is no reason to think that humans have reached the point where stabilizing selection stops any further change. And for many phenotypic characters there is no such limit, because there is no stabilizing selection. What stabilizing selection stops the increase in frequency of lactase persistence? Or Duffy Fy*O?
The rapid evolution of many human skeletal characters in the last 10,000 years appears to have been faster than in any other 10,000-year period in the Late Pleistocene. We cannot rule out that some earlier period of human evolution -- for instance, the origin of Homo -- had an equally rapid rate, although for at least some phenotypic characters they could not possibly have evolved so fast, because the absolute change was not nearly as great as the Holocene change.
The problem of comparing "rates" of phenotypic evolution has a long history in paleontology, and there is no good solution to the basic problem, which is time-averaging over long intervals. But one of the important parts of our story is that the skeletal changes are only a minor part of our recent evolution -- the majority of the genes under strong recent selection affect non-skeletal phenotypes, ranging from olfaction to sperm production, digestion, and neurotransmitter reception.
Rate of phenotypic evolution: if a character fluctuates sharply over short time periods, with a periodicity of, say, several thousand years, while still undergoing a subtle long-term trend over hundreds of thousands of years, then a short time period with lots of samples will appear to show much faster evolution than a long time period represented by fewer samples. It's an artifact of a fluctuating character that could even occur within long-term stasis. (I'm not a huge Gould fan, but he did point this out with respect to the currently superfast rate of morphological evolution in Darwin's finches.)
John Hawks: "At least some of the other phenotypic effects of an adaptive mutation must be greater than the effect on fitness"
But these effects are not necessarily at the organismic level. For example, segregation distorters.