Over at Evolgen, RPM notes an interesting study in PNAS looking at antibiotic use and how it drives the emergence and maintenance of antibiotic-resistant strains. The current paradigm for antibiotic use is to prescribe relatively high doses of drugs for a few days to a few weeks (or months, in the case of tuberculosis), and patients are cautioned to stay on them until all the doses are finished. However, the study RPM describes, which examined Plasmodium species treated with antimalarials in a mouse model, suggests this practice may be doing more harm than good.
Do their results overturn the paradigm? I’m not convinced. First, RPM frames antibiotic resistance as a dichotomy: strains are either sensitive or resistant. But that’s not the case; those are simply the extremes of a spectrum, and many organisms fall somewhere in between, partially resistant to various antibiotics. (For example, while penicillin resistance is rare in Streptococcus pyogenes, it takes much higher doses of the drug to kill these bacteria today than it did, say, 40 years ago; they have “intermediate” resistance or susceptibility.) As noted in the comments, it’s not only the fully resistant organisms we worry about when it comes to antibiotic resistance: it’s also those that require high drug concentrations to kill but will die eventually (or at least have their growth inhibited). This study doesn’t take those into account, which is a limitation. Then again, the paper seems designed more to get fellow scientists thinking about these ideas in general than to exhaustively test every hypothesis stemming from them.
Either way, antibiotic resistance is certainly a huge problem, and we need to find better ways to preserve the drugs we do have. Reducing drug exposure in this manner (lower doses and shorter courses) is worth a second look.