Emerging disease and zoonoses #7--reporting on emerging diseases

In addition to all the science of H5N1, several presentations were given discussing communication between scientists and the public (or those who more often communicate with the public--science journalists). As I've written on here before, it's not an easy dance to figure out, for a variety of reasons discussed below.

As anyone who follows science reporting in the mass media knows, it can seem at times that each new study contradicts the last. Something is good for you; yesterday it was bad for you. Something is the new treatment of tomorrow based on early studies; but sorry, it was shown not to work in larger ones (or had unexpected side effects). So it is with infectious disease reporting. Certainly there is much argument about "hype" over H5N1, and over any infectious disease like this--where there's the potential for a great loss of human life, and a lot of concern amongst professional scientists, but not yet a large outbreak of human disease. Unfortunately, even when we have much information about the epidemiology of a microbe (which, as I mentioned, we don't with H5N1), we lack a crystal ball. Our predictions can never be 100% accurate--and obviously, the accuracy will decrease for organisms where information is sparse. That's why you see predictions as low as 10 million deaths worldwide should H5N1 become a human pandemic, and as high as 10 times that or more.

This isn't a problem only with H5N1--it's pretty universal to emerging infectious diseases in general. Because they are either new to us, or old diseases that are in a state of flux due to acquisition of novel virulence factors or antimicrobial resistance genes, our knowledge about their epidemiology and ecology is minimal. Accurate predictions are difficult or impossible to make at such a point in the game--all we can do is rely on historical events that parallel the disease as closely as possible, and extrapolate from there. Scientists realize this--we know that we'll be asked for numbers, and the best estimates will be provided, but those estimates are only as good as the data we have available. These data may, unfortunately, be very sparse.

A drawback of such predictions and soundbites in the mainstream media is that they have led to some (somewhat understandable) mistrust on the part of the public. There are a number of problems with reporting, however, and often it can be a lose-lose situation for scientists. For example, what if the CDC receives a report that H5N1 has surfaced in the U.S.? Should it be reported immediately to the public--which would provide transparency--or should the agency wait until it's been confirmed, which would provide more certainty but potentially evoke an outcry of cover-up in the time between initial discovery and eventual report? Heads I win, tails you lose.

There can also be different expectations and assumptions on the part of the public versus scientists. For instance, it sometimes seems as if many in the public--and indeed, even many journalists--assume that the CDC and others in public health do real-time tracking of, well, every disease on the planet, and that good, 100% accurate diagnostic tests exist for every infectious agent so that their identity can be confirmed at a moment's notice. Of course, this is not the case. It can take weeks or longer for reports of an outbreak to filter up to the CDC, or even to a state or local public health organization. Many local departments are short-staffed and underfunded, and many outbreaks go uninvestigated. This can lead to frustration on the part of the public--especially those who feel public health authorities "ignored" their illness. One place they may turn for assistance is the media, resulting in further criticism heaped upon scientists.

As this demonstrates, it can be difficult to balance the needs of journalists and the media with the interests of public health. The media needs something new and newsworthy, whereas public health reports are more thoughtful and considered. (Just read almost any journal article and then the media write-up on it. Although some are very good, other times I wonder if they're covering the correct story.) And as I've mentioned, aargh, the headlines...sometimes it's enough to make you want to poke your eye out. To pick on poor Seed recently (hey, they're local), contrast this article with this one from EffectMeasure on the same topic--recent papers published in Science and Nature suggesting that one reason H5N1 hasn't become readily human-to-human transmissible is that it binds to sugars on cells located in the lower portion of the lung. It's not that Seed's article is bad--it's to-the-point and I didn't see any glaring errors (though the headline isn't one I'd have chosen). It's just that it's necessarily simplistic, and uncritical of the research, while Revere goes to great lengths to point out that, while the new papers are interesting, we don't really know yet if that's the main reason or not--we need follow-up studies.

And this is the problem with much of journalism. The scientific literature is much less certain than the lay press. We use lots of qualification, especially for preliminary findings. We emphasize that more research is needed. We point out the weaknesses of our study design or other methods. This doesn't happen very often in write-ups that appear in newspapers or magazines, for understandable reasons--word limitations, different target audiences, etc. But nevertheless, it makes journalists who write about science both the friends and foes of scientists--a good journalist can be very helpful in communicating science to the public, but it only takes one poor one in a high-profile venue to cause a lot of confusion and misunderstanding.

The challenge regarding emerging diseases, as it was called, is "two-footed driving." In other words, "be concerned, but not too concerned." Raise awareness and educate without scaring--it's not an easy task, as I've mentioned before; but I do think it's a worthy one for those involved, even if the correct chord is a difficult one to strike.


As a person who covers science (albeit mostly for a local public radio station) and who is a (soon to be ex) scientist as well, I straddle this issue.

It would be great if everyone who reported on science and medicine had a background in those fields, but that's an unrealistic expectation when so many other factors determine success in journalism.

Most of the information in your average science story comes from a one-sided press release written by the university public relations office (which often employs trained science writers) and a brief interview with someone who conducted the study, often pared down to a quote along the lines of "this study is important because [insert last sentence from discussion section here]." This doesn't necessarily produce a bad story, but it produces a story lacking nuance, which is by and large what good science journalism captures.

But not every American is interested in reading the Science section of the NY Times or listening to NPR's Science Friday.

Most people are interested in science, especially as it relates to their health, and science journalists must walk a tightrope, maintaining people's interest while simultaneously explaining science without dumbing it down to the point of obfuscation.

To reinforce your point about the reporting, just compare the headlines for the recent vaccine announcement, which seem to run the gamut from "we have a vaccine" or "promising vaccine" to "vaccine a failure."

http://news.google.com/news?tab=wn&ned=us&hl=en&ie=UTF-8&q=flu+vaccine&…

The problem is, people trying to interpret such noise are given the opportunity to draw their own conclusions, which in this case will be consistent with whatever their assumptions already are.

If you assume that bird flu is not a threat because there will be a vaccine, you'll see this story and say "see, we have a vaccine."

If you assume bird flu is a threat because an effective vaccine will be prohibitively difficult to develop, you'll see this story and say "no vaccine, I told you so."

Nuance has a way of getting lost in the mix.

Science is about constantly challenging assumptions, but getting the public to reconsider cherished assumptions is next to impossible.

By Michael Donnelly (not verified) on 04 Apr 2006

Hello, I would like to ask all of the brilliant people reading this to please provide some feedback concerning the images and statements being made in a video provided on the site www.silentsuperbug.com. Any and all thoughts concerning this would be greatly appreciated.

Thank you
Dan Smith