…or at least, I was mentioned in Cell. Currently, we have very few new antibiotics in the pipeline, particularly for Gram-negative bacteria such as Acinetobacter. One of the things I’m involved with is a project to develop a not-for-profit screening library to find new antibiotics. From Cell:
Resistance of microbial pathogens to an increasing number of antibiotics is a serious problem. In the US alone, 90,000 people die every year from infections acquired while in the hospital. According to the Infectious Disease Society of America (IDSA), 70% of these deaths have been attributed to infection with drug-resistant bacteria, in particular methicillin-resistant Staphylococcus aureus (MRSA). Compounding the problem, the World Health Organization warned in September of a new form of tuberculosis, XDR-TB, caused by a multidrug-resistant strain of Mycobacterium that leaves patients virtually untreatable with current anti-TB drugs.
Despite this threat, the pipeline of new antibiotics approved by the US Food and Drug Administration (FDA) is running dry. The number of new antibiotics is now about 60% lower than in the mid-1980s, says Brad Spellberg, an infectious disease specialist at Harbor-UCLA Medical Center in Torrance, CA. “It’s a straight line down,” he says. Since the 1960s, only two new classes of antibiotics have been introduced in the clinic, linezolid in 2000 and daptomycin in 2003, says Jun Wang, a senior biochemist at Merck Research Laboratories in Rahway, NJ. The IDSA estimates that about a dozen new antibiotics are in late clinical testing. But most of them, says Spellberg, are “me-too” drugs comprising modifications to existing compounds or members of known classes of antibiotics. “That doesn’t help us treat drug resistant bacteria,” Spellberg points out, because they are often not sufficiently different to overcome resistance.
Pharmaceutical companies are less interested in developing antibiotics than drugs that treat lifelong diseases because people only take antibiotics for a short time, notes Spellberg. “There have been at least as many drugs developed over the last 12 to 13 years for HIV as compared to all bacterial infections put together,” he says. “It’s all about money.” It takes 250-500 patients treated with an antibiotic, for every patient on a medication for a chronic disease, to get the same return on investment, says Christopher Spivey, manager of business development at the nonprofit Alliance for the Prudent Use of Antibiotics (APUA) in Boston, MA. “That’s why companies have been walking away.”
The antibiotics pipeline is running dry not only because of money but also because the search has become more challenging. The first antibiotic, discovered by the British microbiologist Alexander Fleming in the 1920s, came from a mold, Penicillium notatum. Since then, soil-dwelling microorganisms have been the traditional source of antibiotics. But searching for antibiotics the old way, culturing soil bacteria and screening the compounds they produce for activity against bacterial pathogens, means that the same antibiotics are discovered over and over again, in part because those already identified are potent and highly concentrated, says Merck’s Wang. We have run out of soil bacteria that are easy to culture, says Kim Lewis, a microbiologist at Northeastern University in Boston, MA: “As with a gold mine, you mine it out and it ends.”
Screening Goes Up a Notch
Some companies are moving away from a dependence on soil bacteria, instead screening libraries of synthetic compounds for their antimicrobial properties. Pfizer researchers are using the genomic sequence information of different bacterial strains to identify bacterial survival genes. They then screen millions of synthetic chemicals to find those that interfere with the products of these essential genes. This approach has yielded three new compounds that are now in clinical trials and a few more that will be soon, says Paul Miller, head of Therapeutic Area Research for Antibacterials at Pfizer.
But this strategy is not always successful. Between 1995 and 2001, GlaxoSmithKline (GSK) ran 70 high-throughput screens of synthetic chemical and other libraries for inhibitors of essential bacterial targets. The success rate was 4-5 times lower than with mammalian cell targets, says David Payne, director of microbiology at GSK in Collegeville, PA. One reason, he says, is that bacterial enzymes are harder to inhibit because they have evolved for longer and are well suited to harsh conditions. Wyeth had a similar experience. “Having had a similar degree of futility to GSK using high-throughput screening, we are certainly not going to do that ourselves in the future,” says Steven Projan, vice president for biological technologies at Wyeth Research in Cambridge, MA. “Right now we are doing very little antibacterial drug discovery.” Screening efforts may fail because many compounds cannot get into bacterial cells, are toxic to mammalian cells as well as bacteria, or are pumped back out by bacterial transporter proteins.
Given the mixed success of large screening efforts, it may be difficult to get big pharmaceutical companies interested. That’s where APUA plans to help by developing a not-for-profit screening library that would be funded by corporate sponsors and public money. This would spread the risk, and companies would have to pay less for the initial screening effort, says Michael Feldgarden, APUA’s research director.
The rest of the article is in the Dec. 1 issue of Cell.