Competitive Release and Antibiotic Resistance

While much of the research in evolutionary biology is purely academic in nature -- designed to understand the biology of a system rather than to yield immediate human benefit -- some research does have immediate practical uses. One area that is particularly fruitful in this regard involves applying evolutionary theory to combating infectious disease agents. The metaphor of an arms race is often used to describe the evolutionary dynamics of parasites and their hosts. This is true for the naturally evolved responses of hosts, but also for human-engineered treatments such as antibiotics.


The evolutionary approach toward fighting infectious microbes with antibiotics has led to a few accepted paradigms in the public consciousness. For example, in industrialized Western nations, doctors are advised to prescribe antibiotics for only a small, select group of patients. Those patients receive high doses of antibiotics and are told to finish the prescribed bottle of pills even if they begin to feel better. We'll get to the theory behind this in a moment, but the basic idea is that this approach is intended to prevent the spread of antibiotic resistance among pathogenic microbes.

The advantage microbes gain from their innate adaptability is augmented by the widespread and sometimes inappropriate use of antibiotics. A physician, wishing to placate an insistent patient who has a virus or an as-yet undiagnosed condition, sometimes inappropriately prescribes antibiotics. Also, when a patient does not finish taking a prescription for antibiotics, some bacteria may remain. These bacterial survivors are more likely to develop resistance and spread. Hospitals also provide a fertile environment for antibiotic-resistant germs as close contact among sick patients and extensive use of antibiotics select for resistant bacteria. Scientists also believe that the practice of adding antibiotics to agricultural feed promotes drug resistance. [The Problem of Antimicrobial Resistance, NIAID Fact Sheet]

The point in the quote above about patients not finishing their prescriptions is the focus of this essay, and you can find such advice in many other places. Other precautions are also employed to prevent the spread of antibiotic resistance -- the unnecessary use of antibiotics (i.e., for viral infections) and the strong selective pressures in hospitals are both major public health concerns. But does evolutionary theory support the idea that patients should finish long antibiotic prescriptions as they are currently advised? Does this approach prevent the spread of antibiotic resistance, or does it encourage it? A recent paper in PNAS addresses this issue using a mouse model of malaria, and the authors come to a conclusion that goes against the accepted paradigm.

Microbes can be classified as either resistant to an antibiotic or sensitive to it. Resistant individuals can tolerate the presence of the antibiotic, while sensitive ones are unable to reproduce in its presence. The current approach to antibiotic treatment is designed to purge a host of a large swath of microbes. This results in the near total elimination of sensitive strains, leaving an environment devoid of competitors for any resistant strains. The theory behind this approach focuses on preventing new resistant strains from arising by removing any lineages that might give rise to them. However, antibiotic resistance often arises via very simple mutational processes (sometimes even by a single mutation, and in the case of malaria, by a series of mutations that are quite accessible).

In their new paper, Wargo et al. argue that the removal of sensitive pathogens from a host releases the resistant individuals from competition for various intra-host resources. They support their claim with data from experiments in which mice were infected with Plasmodium chabaudi, a relative of the parasite that causes malaria in humans, P. falciparum. The mice were infected with a strain of P. chabaudi resistant to an antimalarial drug, with a strain sensitive to the drug, or co-infected with both strains. The number of parasites from each strain within an individual was inferred using quantitative PCR with primers specific to each strain.

[Figure 2 from Wargo et al. (2007)]

In the two graphs shown above, the number of resistant parasites is plotted for four experiments (the two graphs represent two different life stages of the parasite). In each graph, the solid line represents data taken from mice infected with only the resistant strain, while the dotted line is from mice infected with both the sensitive and resistant strains. The two data points are for mice that were not treated with the antimalarial drug ("0 Days drug") and those treated with the drug for four days ("4 Days drug").

As you can see, among the mice that were not treated with the drug, those infected with only the resistant strain harbor at least as many resistant clones as the mice infected with both the resistant and sensitive strains. That's because, in the co-infected individuals, the two strains are competing for limited resources. This competition limits the total number of resistant parasites present. However, when the inter-strain competition is removed (via treatment with the drug), the number of resistant clones in the co-infected individuals surpasses that found in the individuals infected with only the resistant strain. Therefore, competitive release (i.e., the removal of competition with the sensitive strain) increases the number of resistant pathogens in the host relative to hosts that were not treated with the drug.
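To make the logic of competitive release concrete, here is a minimal sketch (in Python) of a toy within-host model -- not the authors' model -- in which a resistant and a sensitive strain grow logistically while sharing a single resource pool, and the drug kills only the sensitive strain. The function name `simulate_infection` and all parameter values are invented for illustration, and the sketch only reproduces the qualitative pattern of suppression by competition and release when the competitor is removed, not the facilitation effect reported in the paper.

```python
# A minimal sketch (not the authors' model): discrete-time logistic competition
# between a drug-resistant strain (R) and a drug-sensitive strain (S) that
# share one within-host resource pool. All names and parameter values are
# invented for illustration.

def simulate_infection(r0, s0, drug_days, days=20, growth=1.0, K=1e6, kill=0.9):
    """Return the final density of the resistant strain.

    r0, s0    -- starting densities of the resistant and sensitive strains
    drug_days -- number of days the drug (which kills only sensitive parasites) is given
    growth    -- per-day intrinsic growth rate of both strains
    K         -- shared carrying capacity (total parasite load the host supports)
    kill      -- fraction of sensitive parasites killed on each treated day
    """
    R, S = float(r0), float(s0)
    for day in range(days):
        crowding = max(0.0, 1.0 - (R + S) / K)  # competition for shared resources
        R += growth * R * crowding
        S += growth * S * crowding
        if day < drug_days:                     # drug removes only sensitive parasites
            S *= (1.0 - kill)
    return R

for drug_days in (0, 4):
    r_alone = simulate_infection(r0=100, s0=0, drug_days=drug_days)
    r_mixed = simulate_infection(r0=100, s0=100, drug_days=drug_days)
    print(f"{drug_days} days of drug: R alone = {r_alone:10.0f}, "
          f"R with competitor = {r_mixed:10.0f}")
```

In this toy model, the untreated co-infected host ends up with roughly half as many resistant parasites as the untreated singly infected host, because the two strains split the shared carrying capacity; once the drug removes the sensitive competitor, the resistant strain expands to fill the host.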

This raises a concern about antibiotic treatments -- you can actually increase the number of resistant pathogens within a co-infected host by administering antibiotics. But Wargo et al. were also interested in how different treatment regimens affect parasite density within the host. They performed an experiment similar to the one described above, but treated the mice with the antimalarial drug for either 0 days (no treatment), 1 day, or 2 days.

[Figure 4 from Wargo et al. (2007)]

One day of treatment increased the number of resistant parasites in co-infected mice relative to untreated individuals. However, co-infected hosts and hosts infected with only the resistant parasite had equal numbers of resistant pathogens after a single day of treatment. It was only after two days of antimalarial treatment that competitive release became apparent.
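The same toy model can illustrate the duration effect by sweeping the number of treated days: the competitive suppression of the resistant strain is progressively relieved as treatment lengthens. This is purely illustrative -- `simulate_infection` is the hypothetical function defined in the sketch above, and the toy model does not reproduce the exact one-day versus two-day pattern in the data.

```python
# Reusing the hypothetical simulate_infection() defined in the sketch above:
# compare resistant-parasite numbers after 0, 1, or 2 days of treatment.
for drug_days in (0, 1, 2):
    r_alone = simulate_infection(r0=100, s0=0, drug_days=drug_days)
    r_mixed = simulate_infection(r0=100, s0=100, drug_days=drug_days)
    print(f"{drug_days} day(s) of drug: R alone = {r_alone:10.0f}, "
          f"R with competitor = {r_mixed:10.0f}")
```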

These results suggest that heavier antibiotic treatment can actually increase the number of resistant pathogens relative to more moderate treatment. This goes against the current paradigm of giving lots of antibiotics to an individual who comes down with an infection and urging them to complete the prescription regardless of whether they begin to feel better. That approach is designed to prevent resistant alleles from arising in a population, but it also gives resistant strains, once they've arisen, a very large selective advantage over sensitive strains.

Population genetic theory predicts that the greater the selection pressure in favor of a new beneficial mutation, the faster that new allele will increase in frequency, eventually becoming fixed in the population. The two graphs below show how fast an allele will fix in a population depending on the strength of selection. With strong selection (top graph) the resistance allele fixes faster than with weak selection (bottom graph). In the malaria example, strong selection corresponds to giving out lots of antibiotics and having patients take them for a long time.

[Simulated allele frequency trajectories under strong selection (top) and weak selection (bottom)]
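For readers who want to see this numerically, here is a minimal sketch of the standard haploid selection recursion, p' = p(1 + s)/(1 + ps), where p is the frequency of the resistance allele and s its selective advantage. This is textbook population genetics rather than anything from the Wargo et al. paper, and the values of s are made up simply to contrast strong selection (heavy, prolonged drug use) with weak selection (more moderate use).

```python
# A minimal sketch of the standard haploid selection recursion:
# p' = p(1 + s) / (1 + p*s), where p is the frequency of the resistance
# allele and s its selective advantage. The values of s are invented to
# contrast "strong" selection with "weak" selection.

def generations_to_fixation(p0=0.01, s=0.1, threshold=0.99):
    """Count generations until the resistance allele frequency exceeds threshold."""
    p, gens = p0, 0
    while p < threshold:
        p = p * (1 + s) / (1 + p * s)   # one generation of selection
        gens += 1
    return gens

for label, s in (("strong selection", 0.5), ("weak selection", 0.05)):
    print(f"{label} (s = {s}): ~{generations_to_fixation(s=s)} generations "
          f"for the resistance allele to go from 1% to 99%")
```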

The current paradigm doesn't apply the strongest selection pressure possible. For example, doctors are discouraged from prescribing antibiotics for viral infections, and other forms of over-prescription are frowned upon. However, the large doses given to patients apply a stronger selection pressure than more moderate doses would. These large doses are designed to prevent the appearance of mutations conferring antibiotic resistance, but such mutations arise regardless. Rather than attempting to extend the period until a resistance mutation arises, the authors argue that it would be more fruitful to prevent the dramatic increase in the frequency of the resistance allele that occurs when large doses of antibiotics are prescribed.

Here's where you're supposed to ask: that's great, but what about the illness experienced by the infected individuals? In other words, so what if there is a lower frequency of resistance -- what if more people get sick when you give smaller doses? Well, it turns out that there isn't a difference in mortality between the mice given the shorter treatment (1 day) and those given the longer treatment (2 days). That means, in this model, you can prevent the spread of a resistance allele without causing undue suffering in the infected individuals. If you keep the treatment moderate, the antibiotic remains effective for much longer without a cost to the health of the host.

Should this approach be implemented for human malaria and other non-viral infections? Hell no! But this approach toward antibiotic use should be examined in humans. If we can extend the life of an antibiotic without health consequences, much money will be saved and human suffering will be reduced. That's because once the resistance allele reaches fixation, the antibiotic is no longer useful. This can't be examined in humans using the same controlled experimentation as in mice, but I wonder if some clever epidemiologists can come up with a way to determine whether this approach is feasible.


Wargo AR, Huijben S, de Roode JC, Shepherd J, Read AF. 2007. Competitive release and facilitation of drug-resistant parasites after therapeutic chemotherapy in a rodent malaria model. PNAS 104: 19914-19919 doi:10.1073/pnas.0707766104


I'm not convinced at all by the experimental design. It's simple common sense that if a patient *does* have resistant strains in them, then antibiotic treatment will not clear the infection. And in fact, competitive release may even make the infection worse. If you have a gut infection with a resistant bacterium, then clearing out all the normal flora with the "wrong" antibiotic will make the patient more severely ill.

As I understand it, the argument for giving long courses of antibiotics is to prevent *new* resistant strains arising over the course of treatment. Even within a so-called "sensitive" strain, there will be some bacteria that are more or less sensitive than others. You need to treat for long enough to kill off the "slightly sensitive" bacteria as well as the "very sensitive" bacteria. Otherwise, the infection is not cleared and will return once you stop the antibiotics. That means you need to treat *again*, which with repeated cycles is hypothesised to lead to yet further selection for higher levels of resistance.

The experimental design should not have looked at strains that are known to be completely insensitive to the antibiotic, but should instead have examined strains that are both sensitive, but differentially so.

By Peter Ellis on 14 Jan 2008