This post is written by a special guest - Ivan Oransky, executive editor at Reuters Health, who I had the pleasure of meeting in person at Science Online 2010. I was delighted when Ivan accepted my invitation to follow up a recent Twitter exchange with a guest-post, and shocked that he even turned down my generous honorarium of some magic beans. Here, he expounds on the tricky issues of journalistic balance and how journalists can choose their sources to avoid "he-said-she-said" journalism. Over to him:
The other day, a tweet by Maggie Koerth-Baker, a freelance science journalist in Minneapolis, caught my eye. In it, she bemoaned the fact that editors and producers often encourage their reporters to go find an "opposing viewpoint" to make a story balanced. She said her journalism school professors -- she graduated in 2004 -- always told her the same thing.
That troubled me.
I've been teaching medical journalism at New York University's Science, Health, and Environmental Reporting Program since 2002, and I taught a similar course at the City University of New York's Graduate School of Journalism for three years. As I told Maggie and the others having the conversation on Twitter, I never tell my students to get an "opposing viewpoint" but to get an outside perspective -- one that may agree with the study or the main idea being put forward by a source.
Koerth-Baker said she liked that way of framing it, and evidently so did Ed Yong, so the idea for this guest post was born.
It's easy to see why opposing viewpoints often rule the day. People like tension, and good journalists like skeptics. People who feel strongly about something are often media-savvy. They know how to give soundbites. They're often telegenic -- think Jenny McCarthy.
But I don't have to tell you how this can lead to false balance. Others have written convincingly on this before, notably my NYU colleague Jay Rosen. In science and health reporting, you can end up with this.
Clearly, if the only sources you can find to "oppose" a study's findings are from a scientific fringe, the best "opposing" viewpoint may be one that agrees!
However, just saying that isn't always enough, especially when a reporter is on deadline. So here's how it works, after which I'll offer some tips on finding those sources.
Two recent examples: About a month ago, I wrote a story based on a study of whether Arcoxia -- a cousin of Vioxx -- could treat "Yom Kippur headache," which something like 40 percent of observant Jews suffer every year.
Now, having written a weekly column in the Jewish newspaper The Forward ten years ago, I had some aces in the hole. The one I went to immediately was Edward Reichman, a triple threat: rabbi, emergency room doctor, and medical ethicist. As always, Reichman was happy to talk to me, and gave me some comments for the story. He thought the study was well-done and interesting -- hardly an "opposing viewpoint," but one that added important context for the study.
A few weeks ago, I found a study in Circulation that said maybe being religious wasn't that good for your health after all. I can hear many readers rolling their eyes at the idea that we needed to study this, but there's actually a small but significant literature on the subject, much of which says religious people tend to live longer.
The studies are hardly perfect, so when a study like the Circulation one appears, it seems worth reporting on. Basically, the authors drilled down into data to look at the specific link between "religiosity" and clogged arteries.
Since I had written about this for the Boston Globe, Salon, and -- not surprisingly -- The Forward, I knew the perfect source: Duke's Harold G. Koenig, who has studied the potential religion-health link for decades. This time, Koenig gave me a scientific critique of the paper that you probably would consider an "opposing viewpoint." Here's my report.
But, you're saying, of course it's easy to find those sources when you've got more than a decade's worth of reporting experience. So let's say I had come to both stories as a novice. How would I have tracked down outside comment?
The first place I would have looked is in some of my favorite sentences in a study's introduction -- the ones that start something like "Previous studies have shown..." Like all of the good reporters I know, I have a healthy appreciation for the fact that research is an ongoing process. A study, or two, or 100, came before the one I'm reporting on. The nice thing about the medical literature is that authors cite those studies, and -- imagine this -- those studies have authors. (That's another reason to always read papers you're reporting on, instead of relying solely on press releases.)
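For anyone who wants to script that citation-chasing step, here's a minimal sketch using NCBI's free E-utilities API, which sits on top of PubMed/Medline. The function name and example usage are illustrative, not part of anyone's newsroom workflow, and the "pubmed_pubmed_refs" link only returns results when a paper's reference list has been deposited with PubMed Central:

```python
# Minimal sketch: given a paper's PubMed ID, list the PubMed IDs of the
# papers it cites -- each of those has authors who might serve as outside
# sources. Uses NCBI's free E-utilities elink endpoint.
import requests

ELINK = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/elink.fcgi"

def cited_paper_ids(pmid):
    resp = requests.get(ELINK, params={
        "dbfrom": "pubmed",
        "db": "pubmed",
        "id": pmid,
        # "pubmed_pubmed_refs" = papers cited BY this paper; it is only
        # populated when the reference list is on deposit with PubMed Central.
        "linkname": "pubmed_pubmed_refs",
        "retmode": "json",
    }).json()
    linksets = resp.get("linksets", [])
    if not linksets:
        return []
    # In the JSON response, "links" is a plain list of PubMed ID strings.
    return [pid
            for db in linksets[0].get("linksetdbs", [])
            for pid in db.get("links", [])]

# Usage (replace with the PMID of the paper you're covering):
# print(cited_paper_ids("12345678"))
```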
Take Yom Kippur headache. There isn't a huge literature on the subject, but it turns out the phenomenon was first described in a 1995 paper in Neurology. I could have called the authors of that study. (The paper also cited the Bible. Admittedly, it would have been tough to call the author of that "study.")
(The only solid research on treating Yom Kippur headache -- using Vioxx, before it was pulled from the market -- was done by the same group that tried Arcoxia, so that wouldn't exactly have been an "outside perspective." Imagined interview: "Yes, this work is fantastic! It is some of the best I've seen on the subject. Well, second only to that study we already published.")
Sometimes, of course, those studies don't agree with the paper in your hand. Great -- in that case, it's probably appropriate to quote an outside source who happens to have an "opposing viewpoint." That would have been the case if I had come to the Circulation paper on religion and health de novo. The paper dutifully noted the large body of literature that had found religious people tend to live longer, and cited more than 10 of those studies. Koenig's name shows up.
If you strike out in the paper itself, go to Medline to find others who've written papers on the subject. I would have found this review of Yom Kippur headache, which wasn't cited by the paper I was writing about.
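The Medline search can be scripted too. Here's a minimal sketch using the same E-utilities API, this time the esearch and esummary endpoints; the query string and function name are illustrative, and the JSON fields are those of NCBI's public interface:

```python
# Minimal sketch: search PubMed (Medline's public interface) for papers
# on a topic, and pull out their authors as candidate outside sources.
# Uses NCBI's free E-utilities; the query below is just an illustration.
import requests

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

def find_outside_sources(topic, max_results=10):
    # esearch returns the PubMed IDs matching a query.
    search = requests.get(f"{EUTILS}/esearch.fcgi", params={
        "db": "pubmed", "term": topic,
        "retmax": max_results, "retmode": "json",
    }).json()
    ids = search["esearchresult"]["idlist"]
    if not ids:
        return []

    # esummary returns title, journal, date and author names for each ID.
    summary = requests.get(f"{EUTILS}/esummary.fcgi", params={
        "db": "pubmed", "id": ",".join(ids), "retmode": "json",
    }).json()

    papers = []
    for uid in ids:
        rec = summary["result"][uid]
        papers.append({
            "title": rec.get("title", ""),
            "journal": rec.get("source", ""),
            "year": rec.get("pubdate", "")[:4],
            # Each author entry is a dict with a "name" key, e.g. "Smith J".
            "authors": [a["name"] for a in rec.get("authors", [])],
        })
    return papers

# Illustrative query -- swap in your own topic:
for p in find_outside_sources('"yom kippur" AND headache'):
    print(p["year"], p["journal"], "--", p["title"])
    print("  possible sources:", ", ".join(p["authors"][:3]))
```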
Strike out there? Try specialty medical societies or foundations, many of which have someone with a title like "vice president for research" who is highly qualified to read a paper for you and comment. One of the most helpful when I've been on deadline is the Leukemia & Lymphoma Society.
A caveat: Groups like that have an agenda, if an obvious and transparent one. They want more funding to treat and cure their diseases. But they're still excellent sources.
Sure, there's Google, too, which is certainly useful. But these approaches have a much higher signal-to-noise ratio.
Other reporters tell me they often ask study authors whom else they could call for comment. That's not a bad idea if you're in a pinch on deadline, but for me it's often a choice of last resort. As I tell my students, it's like asking profile subjects for people to comment on their life or work. It's just human nature to name people who are likely to be complimentary, and scientists are no exception.
The nice thing about the search for a source is that it also provides data. At the end of the day, if it's hard to find a source that thinks a study is deeply flawed, it's probably not deeply flawed. That knowledge informs how you write the story, and makes you more confident in quoting sources who think there's something worth reporting.
There are other tips, of course, some of which I hope show up in comments on this post. I hope journalists don't think I'm giving away any "trade secrets" here. But since I'm sure many someones taught me how to do this along the way -- starting at my college paper -- I don't own the approach. To me, most of this is just common sense. And since I think part of the future of science and medical journalism is teaching its methods and principles to smart, skeptical experts who aren't full-time journalists -- many of whom already know these techniques -- I think this is a perfect forum for this kind of discussion.
Ivan Oransky is executive editor of Reuters Health and treasurer of the Association of Health Care Journalists. He also teaches medical journalism at NYU's Science, Health, and Environmental Reporting Program. You can follow him on Twitter at @ivanoransky.
More on journalism at Not Exactly Rocket Science:
- Rebooting science journalism - on blurring boundaries, money, audiences and duck sex
- Rebooting science journalism - thoughts from Timmer
- Adapting to the new ecosystem of science journalism
- Who are the science journalists?
- Breaking the inverted pyramid - placing news in context
- On cheerleaders and watchdogs - the role of science journalism
- Does science journalism falter or flourish under embargo?
Ivan provides a lesson that all journalists could benefit from. The media here in the US has a problem with science in particular, but that, I feel, is a symptom of a larger problem.
Great article, Ivan! I'm passing it around...
Thanks for the post. I wonder if the idea of including an opposing viewpoint comes from training young journalists to write local stories, basic political conflict-of-interest pieces that often involve government bureaucracies, politicians and their motivations. In these cases, perspective naturally comes from including opposing views.
I never went to j-school, but I learned quickly that in covering science it's important to get perspective by looking through past research and talking to others in the field, and even those in neighboring fields. I've found my share of over-hyped stories this way.
Something that's missing from a lot of science stories (but almost always found in press releases from a university or institution) is the funding. If it appears in an article, it's usually a throwaway line. However, you can often find an interesting angle to a story if you follow the money: it's where the conflict of interest almost always lies, and it brings us right back to politics.
I think that Mr. Oransky is an outlier on the scale of journalism quality, as in 99th percentile. He is suggesting that science reporters actually do research on the topic of their reporting. I suggest that many, if not most, so-called science reporters have little to no knowledge of the science on which they are reporting. They report in soundbites, and have no ability to critically evaluate what is being said.
Then again, readers of such reporting are oftentimes the lowest common denominator in their ability to understand much of anything, especially science.
Interesting. I thought at first that this was just going to be someone else pointing out that not all views are of equal worth. But Mr Oransky has actually offered some constructive advice - quite a rare thing!
Well he's certainly not mean...
In addition to Ivan's list, I'd also put forward the use of Twitter. This approach is entirely dependent on having useful and knowledgeable people following you (which in turn depends on your own participation). But if you amass such a legion, they can be incredibly valuable. I've crowdsourced sources for a couple of articles recently. For the autism story I covered last week, I put out a tweet asking for reputable people in the field. Within about 15 minutes, I had eight or so replies and a list of around 15 names. I still checked these people out, looking at their publication record, research interests and so on, as Ivan suggests, but it was a good head start.
As an aspiring freelance science writer who does not have the advantage of corporate/university subscription for scientific journals, how can one read the full text of research articles rather than rely on press releases?
Easy. Ask the scientist. This works for me around 99% of the time.
An alternative, following on from my earlier comment, is to ask your Twitter network.
Totally agree. There was an article about a year ago in Skeptical Inquirer that addressed a similar question, namely how to spot pseudoscientific claims. As I recall, anecdotes, hedged claims, and references to "scientific" studies without citations all featured in it.
Two advantages that pseudoscience has over science journalism are the personalizing of the story -- there seems to be some evidence that humans are predisposed to understand stories better than, say, statistics -- and the fact that it takes a lot less work and organization to write a bad article than one that is researched and cited.
Many thanks for all of the constructive and flattering comments. I'm glad the post has sparked some discussion -- just as I'd hoped, there are already several great ideas that add to it.
@Kate: Couldn't agree more that the opposing viewpoint idea has at least some of its roots in covering politics. Also couldn't agree more about following the money -- that's one of the first rules of journalism, and doing so leads to lots of great stories, investigative and otherwise.
@Rob: You're very kind. I'd give my colleagues, and our audience, more credit, though!
@Ed and @Deepa: Twitter is a great place to find sources, and to become one. As far as getting papers, asking the authors usually works. If it doesn't, try the journal or the authors' institution. Most are quite happy to help people who are going to write about their research.
In case any followers of this thread are interested, today (Friday) on National Public Radio's Science Friday, they were in San Diego at the meeting of the American Association for the Advancement of Science talking about this very thing -- and they also had some good points and advice. The link to the page is http://www.sciencefriday.com/program/archives/201002191
One difficulty for non-scientists in trying to understand and embrace new ideas and theories is that some in the scientific community speak with arrogance, and some simply lack communication skills. Both can be very off-putting and/or confusing to people who are genuinely interested in the work (read: potential investors!).
In the absence of good communicators, it falls to journalists to fill in the blanks, and that's not always easy given the time pressures most journalists work under these days.
The folks on NPR really did a good job of addressing this today.
Ivan, I'm embarrassed to get to this so late but I really love the distinction you make about other viewpoints vs. false balance.
Your Reuters article on the Circulation paper is a superb example and I encourage students here to read it (link). Yes, not everyone knows Hal Koenig, but I argue that it's also important to think critically about an "opposing viewpoint." Koenig has done an incredible job in studying religiosity and health, but he gets criticized sometimes (ok, by some at ScienceBlogs) for being funded by The Templeton Foundation, thereby causing any of his opinions to be thrown out with the bathwater.
You knew how to include his caveats that reflected true limitations of the study design and alternative conclusions because they are scientifically valid. Confounding variables such as racial and ethnic groups and access to health care are real issues in interpreting the findings, and he put them forth because he is a scientist, not a dogmatic religious zealot. Other less seasoned reporters might have sought a different religiosity scholar and quoted their objections without considering their scientific validity.
This is my long way of saying that selecting reliable sources is certainly important and the list grows the longer you are in this business. But even early career journalists should have already cultivated a good enough bullshit detector to know when the opposing viewpoint is not founded in reason.
A great discussion - thank you for your perspective and many thanks to Ed for guest hosting you.
Good post that really ought to be taught.
One thing I thought I might add is that you always have to work hard to expand your list of trusted outside sources, and to talk to them not just for quotes in your articles but also for better background on the issues. A nice big rolodex is a reporter's friend -- not everyone in it will rise to the status of your Reichman, but it's always good to have a rich candidate pool.
For writers on deadline, this can be very time-consuming, but it pays off.
Thank you for your considered article (and to Ed for hosting it). I find it very frustrating to see many scientific issues portrayed as two opposing sides, each of equal weight, when there is so often a scientific consensus. ("Teach the controversy", anyone?)
In the UK you can currently see some very clear - I won't say good - examples of this, with responses to a recent summary showing the lack of effect of homeopathy (above placebo, of course). Adherents often quote the number of studies they say support it, and there's so rarely the opportunity to point out the low number and quality of these compared with those showing a lack of effect.
Would it be okay to quote parts of your post in a worksheet I'm working on? I'm producing an activity for my students where they research online a science subject for debate and this could be a useful starting point...
No need to feel embarrassed Abel, I'm coming in to this much, much later! Here, let me take your embarrassment from you...
@7&8: If the paper has already been around, I'd just ask, as Ed and Ivan suggested. Scientists do this too, often simply because their institution doesn't have a subscription to the particular journal in question. If you're trying to report something that's "brand new", embargoes can effectively hide the article until after others have had their shot at it. (Ed's more recent post on the post-embargo delay is part of the trouble here, but there's also the fact that not everyone has access to embargoed papers, even if you are writing for print, etc.)
I agree it's important to seek the right kind of alternative views, ones that favour discussion of the substance of the work, not classic political-style debate. I think the idea of trying to look outside the paper is a good one, even if it takes more time, skill and some background. There is a tendency, even if subconscious, to favour citing previous research that supports the viewpoint put forward by the researcher, and casting a wider net can avoid that. (In an ideal world this wouldn't happen, but we live in a real world...)
I've written a little about related issues (as far too many people do!), particularly "Media thought: ask what is known, not the experts opinion" ( http://sciblogs.co.nz/code-for-life/2009/12/31/media-thought-ask-what-i… ) and "Science journalism -- critical analysis not debate" ( http://sciblogs.co.nz/code-for-life/2009/10/23/science-journalismâcriti… )
Many thanks for all of the additional responses. Glad this sparked a discussion.
@IanH: Apologies for only responding now to your request. By all means, please quote from the post -- I'm very flattered.