What does it mean to assess the credibility of science reporting? (And can we expect students to do so?)

I'm almost done grading a massive mound of papers by my freshman. There's the usual assortment of dismal writing, hilarious colloquialisms, and insight. It's been an exhausting task (minnow's teething hasn't helped), but also a useful one, because the papers have exposed the continuing misconceptions my students have about -ology.

The assignment was to select a recent news article that was relevant to -ology, summarize it, and analyze it. Specifically, I wanted them to put the article in context, offer an opinion on the topic, and assess the credibility of the piece. Most students had no problem offering opinions and almost everyone could place the article in some sort of context (global, personal, or within the class). But very few even attempted to discuss credibility, and of those who did, many simply said things like "[Magazine] is a credible source."

Maybe that's my fault. Maybe it's too much to ask of freshman. Or maybe I wasn't clear enough about what I expected. Maybe the students didn't know what credibility means. So the rest of this post is intended to be a guide for my current and future students. I would greatly appreciate any feedback my readers have for me.

What does "credibility" even mean? For the record, Merriam-Webster defines credibility as "the quality or power of inspiring belief." As far as science reporting, I think that credibile means that the science was reported accurately without distortion for political or social reasons.

Here are some questions to keep in mind when you read a science news article.

Where is the article published?
If an article is published in the "mainstream" media, it intrinsically has more credibility than something self-published by an individual or organization. That's because we understand that media organizations have internal processes to ensure that at least minimal standards of accuracy and ethics are met. Newspapers and magazines hire professional journalists who have had formal training in writing, and they may have fact-checkers to help them out. This is not to say that individuals or advocacy groups can't publish factual articles, just that the underpinnings of bias and error prevention aren't always known to be there. Thus, a science article published in the New Yorker has higher credibility than an article published by Greenpeace.

Who wrote the article?
If the article is written by someone with a degree related to the general topic of the research, they probably understood the science well enough to report it correctly. Most science reporters do not have science degrees, and some do excellent jobs, but without strong science backgrounds they may rely heavily on press releases or fail to notice inaccuracies. There are also people and organizations that view the news as a tool to advance their political or social agenda, and they will selectively report or distort the science so that it supports their cause. Sometimes these people and organizations will be open about their agenda, but other times you may have to do a little digging to discover where the money is coming from. For example, the Greening Earth Society was created by the coal industry to cast doubt on the scientific consensus about global warming.

Who funded the research? Again, this comes back to whether the research and the resulting news story might be biased because the organization funding the research was pushing a particular agenda. Not all research funded by industries or non-governmental organizations is untrustworthy, but you need to evaluate the conclusions. For example, if research funded by the fishing industry concludes that pregnant and breast-feeding women should eat more fish despite concerns about mercury, I'd think twice before loading up on tuna. Unfortunately, in the general media the funder's name often goes unreported, but sometimes a little internet sleuthing can turn it up.

Was the research published in the scientific literature? This is a really important thing to look for in the news report. Articles should have a sentence that says something like "The paper will be published in next week's issue of the Journal of -ology." If the research is published in a scientific journal, it means that other scientists in the same specialty, typically two or three reviewers, thought that the research was sound enough to be published and accepted as part of the body of scientific knowledge.

Were there corroborative sources in the article? Has more than one person seen the elusive ivory-billed woodpecker? Did the reporter get comments from researchers at other universities or government agencies? When other scientists can verify the results, or at least comment on their importance, that's an indication that the news is credible.

Where was the research done and who did it? Was the research done at a well-known university, medical institution, or government lab? If so, there were procedures in place to ensure the well-being of the research subjects. Also, the researchers had to be competent enough to be hired at the institution in the first place.

How does the reported science fit in with what you know about how the world works? While some scientific discoveries come out of the blue and revolutionize the way we think about the world, most science is incremental and builds on what was already known. Based on what you've learned in your classes, what you've read in other newspaper and magazine articles, and what you have observed by looking at the world around you, you can start to evaluate whether the reported science makes sense.


One thing I found helpful as both a student and an educator is to discuss the different levels of literature.
Primary is the first publication, such as a journal article.
Secondary might be a textbook or a general-science magazine that reports directly from the primary literature.
Tertiary is newspapers and magazines written for a general audience. These typically report from a secondary source while making it sound like they are reporting from the primary.

Showing examples of each sort was a great help.

Also, if this was their first time doing this kind of exercise, see if they improve after reading your comments on their essays. Some of them might not understand what it is you want the first time, especially if they are freshmen. Make sure you save examples of good, bad, and indifferent work for the next time you teach the class, too.

I have other ideas but this might give your students another perspective.

Yeah, I hate to say it, but I'm not surprised the freshmen flopped on this. I've seen a significant number of 1st-year master's students flop on a similar assignment. Ideas about credibility and audience seem to go right past them.

I am going to say that freshmen should be able to do it, but it requires 10 minutes of lecture beforehand, discussing levels of literature and what credibility means.

As someone who's much closer to the freshmen than many of your readers (I was a freshman only 5 years ago), I would also agree with many of your comments. I think it's a great lesson, and your descriptions are really helpful.

One of the hardest things about teaching, especially teaching freshmen, is figuring out where students start. At my R1, especially when you first enter the classes for your major (for us this was junior year), there is often a big disconnect between the base level that professors assume students have and the level students are actually at. This often comes out in the assumption that just because someone has taken 4+ semesters of math, they will remember all the concepts they learned and won't need a refresher.

In the case of your lesson, when I was a freshman I had no idea how any of academia or scientific publishing worked. I had to learn those things gradually as I got involved in lab research, and so most of that knowledge came after my sophomore year was completed. While a freshman might have something to say about the last point you mention (does it fit with the science they know), most of the rest would likely have been things they wouldn't have known about. If you gave them a handout of what you put in this post before asking for the assignment, you could expect much better results. Still, many of them wouldn't know yet how to do the sleuthing to find out who funded the research - they probably aren't even comfortable finding a specific paper in their library's system yet, let alone looking through it for lab affiliations, funding, and the credibility of the journal.

I think that your expectations are probably too much for them. It kind of tells you where they are at, which tells you what to teach them to bring them up to your expectations. Usually, they will only rise to the level that you expect.
Ciao

Since you're trying to help students incorporate these ideas into a written product, I wonder if it would help to not only spend a bit of time in class discussing how scientists determine credibility, but also to provide a brief handout. With freshmen in particular, who may not have the world's best note-taking skills, I've found that providing a written reference is helpful.

By Plant girl on 11 Oct 2007

I think it's very important to teach ALL science students about the peer review process in addition to pointing them to the primary sources, including the names of credible journals. I'm something of an activist for the teaching of evolution, and I've found that many, many people -- some of whom you'd assume are educated -- know absolutely nothing about how new research is vetted and then accepted in the scientific community. It amazes me how many people think that anything published on the Internet, written by "Dr." so-and-so, is Science. The wildest claims can be written up in "sciency" language, and these poor credulous folks take them as Gospel. (And yes, that was a bad joke, since a lot of what you see is from religious fundamentalists.)

By Leigh Williams on 11 Oct 2007

And by the way, Tylenol and frozen teething rings always seemed to work (insofar as anything does) with my little ones. I had a teether with liquid in it that worked wonders -- the cold and the texture of it were just right.

This is SO important to teach your students. I'm sure the majority of freshmen have no idea about any of this. I'm sure most Americans have no idea!

It would be cool to give them this handout with some discussion and then have them do the assignment again with a different news article.

By ecogeofemme on 12 Oct 2007

Just to make explicit something that Leigh implied: just because a publication is called "Journal of Blah" doesn't mean that it's peer-reviewed! (Or that the peer review means a damn thing - Journal of Creation, anyone?)

Nice ideas in your handout, and it's one I have pointed my students toward on my blog. For the last 2 years I have been using a workbook to help my students learn to evaluate the source, credibility, and logical structure of psychology research, and of secondary sources in particular. I think it might work in any -ology class. Here's the reference:

Bell, J. (2004). Evaluating psychological information (4th ed.). Boston: Allyn and Bacon.

ISBN: 0-205-43511-4

I know this is nitpicking, but since you did single out 'dismal writing' as one of three hallmarks of freshman essays, I am compelled to point out that 'Freshman' is singular and 'Freshmen' is plural.