You may be aware that one of my recent tasks at work is to monitor media coverage of PLoS ONE articles. This is necessary for our own archives and monthly/annual reports, but also so that I can highlight some of the best media coverage on the everyONE blog for everyone to see. As PLoS ONE publishes a large number of articles every week, we presume that many of you would appreciate having your attention drawn to the subset of articles that the media found most interesting.
So, for example, as I missed last week due to my trip to AAAS, I posted a two-week summary of media coverage this Monday. And that took far more time and effort (and some silent cursing) than one would expect. Why?
I don't think I am a slouch at googling stuff. Some people joke that the entire Internet passes through my brain before it goes to the final audience. After all, I have been monitoring the Web for mentions of 'PLoS' and 'Public Library of Science' on blogs, Twitter, FriendFeed, Facebook and elsewhere for a few years now. If I don't catch a mention within minutes of it being posted, you can bet one of my many online friends/followers/subscribers is bound to quickly let me know by e-mail or Direct Messaging somewhere. If someone says something nice about PLoS, I am quick to post a ThankYou note. If someone asks a question, I try to answer or to connect the person with the appropriate member of the PLoS staff. If someone is publicly musing about submitting a manuscript to one of our journals, I am right there to give encouragement. If someone makes a factual error, I gently correct it. It is very, very rare that I need to raise the Immense Online Armies because someone is wrong on the Internet ;-)
So, why is it difficult then to compile a collection of weekly media coverage? Let me walk you through the process....
First, as you probably already know, PLoS makes no distinction between Old and New media. We have bloggers on our press list who apply/sign up in the same way and abide by the same rules as traditional journalists (and, unlike the mainstream media, bloggers NEVER break embargoes - not once in the three years since we started adding bloggers to our press list). For the kind of coverage we prefer to see, we point bloggers to the ResearchBlogging.org criteria. In return, bloggers can send trackbacks to our articles, their work is showcased side-by-side with the traditional outlets in our weekly posts, they can be discovered via the Google Blogsearch, Postgenomic and ResearchBlogging.org links directly from each article, and one blogger per month wins a t-shirt and special recognition.
So, I start with blog posts first. The first thing I do is take a look at ResearchBlogging.org. Those are the best of the best posts - not merely mentioning our articles, but adding analysis, commentary, critique, context and additional information. How do I find them? I just search the site for the phrase 'journal.pone'. That search brings up every single post that mentions a PLoS ONE article because that phrase is a part of every possible form of the URL of the article (including the shortest one, which includes just the DOI). If a post links to our article (and that is the only way to get aggregated on ResearchBlogging.org) I will find it this way. Needless to say, this process takes just a few minutes per week.
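(For the curious: this step is simple enough to script. Below is a minimal sketch in Python, assuming a search service that exposes an RSS/Atom feed for a query - the feed URL and function name are my own illustration, not something we actually run at PLoS.)

```python
# Minimal sketch of the weekly 'journal.pone' blog search.
# Assumption: the search service (ResearchBlogging.org, Google Blogsearch, etc.)
# exposes an RSS/Atom feed for a query; the URL below is illustrative only.
import feedparser

SEARCH_FEED = "http://blogsearch.google.com/blogsearch_feeds?q=%22journal.pone%22&output=rss"

def find_pone_posts(feed_url=SEARCH_FEED):
    """Return (title, link) pairs for posts mentioning a PLoS ONE article.

    Every form of a PLoS ONE article URL contains the string 'journal.pone',
    so a plain substring check is enough to catch them.
    """
    feed = feedparser.parse(feed_url)
    hits = []
    for entry in feed.entries:
        text = " ".join([entry.get("title", ""),
                         entry.get("summary", ""),
                         entry.get("link", "")])
        if "journal.pone" in text:
            hits.append((entry.get("title", ""), entry.get("link", "")))
    return hits

if __name__ == "__main__":
    for title, link in find_pone_posts():
        print(title, "-", link)
```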
Knowing that there are some good blogs out there that are not registered at ResearchBlogging.org (why they are not is strange and unfathomable - RB.org is a 'stamp-of-approval' place for science blogs, recognized by the outside world of journals and media, as well as a nice way to get extra recognition and traffic, and even awards), I then repeat the same search - for 'journal.pone' - on Google Blogsearch. This may bring up a few more posts that I have not caught yet. Occasionally, some of these are good. Another couple of minutes. Blogs are now done. Move on to traditional media....
And this is where the Hell starts. Try searching Google News for 'journal.pone'...?! All I get are a couple of prominent blogs that I have already counted, e.g., those blogs that are listed by Google News (scienceblogs.com blogs, Ars Technica, Wired blogs, etc.). Where are the others?
The problem is, nobody in the mainstream media links to papers.
So I have to search for 'PLoS' and for 'Public Library of Science' and then figure out which ones are covering specifically PLoS ONE articles (sometimes they don't specify, sometimes they name the wrong journal - last week an article in PLoS Currents: Influenza was reported to be in PLoS ONE by a number of outlets copying the error from each other). Then I have to search for keywords for individual articles I suspect may have received some coverage. Last week, for example, I searched for "swallows+antioxidants" and "St. Birgitta", among many others. This lasts for hours! And at the end I am still not 100% sure I caught everything. How frustrating!
Not only is there a big difference in the time and effort spent between finding blog posts and finding media articles, but there is an even bigger disparity when one considers what results come out of these searches. I have been doing this for a month now. I expected that there would be poor blog posts and poor media articles, that there would be good blog posts and good media articles, and that there would occasionally be some excellent blog posts and excellent media articles. So far, that is true.... except I have yet to discover an excellent media article. As a rule, the very best coverage of every paper in the past month was done by a blogger or two or three. Then there are some other, good pieces of coverage in both the New and Old media, and then there are some really bad pieces in both realms as well (not all blog posts I count here are really bad - they may just be too detailed, technical and dry for a lay audience because the blogger is intentionally targeting scientific peers, which is a fair thing to acknowledge).
So, every week, it takes me a few minutes to find the very best coverage (which is on blogs, usually those aggregated on ResearchBlogging.org). And then I spend hours looking for the rest, in the traditional media, which turns out to be so-so: some OK, some not so good, some horrible. If I weren't paid to do this, I would not do it - it cannot be good for my long-term mental health.
The resistance to posting links is an atavism, a remnant of the age before the Web. I know (because I have asked many times) that many good science journalists keep trying to add links, but their editors say no. The traditional media still has not caught on to the Ethic of the Link, which is an essential aspect of the ethics of online communication.
I can think, off the top of my head, of three good reasons why everyone who publishes online should include a link to the scientific paper described in the article (just post the DOI link that comes with the press release if you are on the press list - if it does not resolve immediately, it is not your fault; you can always blame the journals for being slow, though this should never happen with PLoS articles):
Reason One: I will not go crazy every week. I am assuming that every scientific publisher has people on the staff whose task is to monitor media coverage and each one of these people is cussing and cursing YOU, the Media, every day. Try to make friends with people who provide you with source material on a regular basis.
Reason Two: Media coverage is one of the many elements of article-level metrics. Furthermore, links from the media affect the number of views and downloads of the article, and those are also elements of article-level metrics. The number of views/downloads then, in the future, affects the number of citations the work gets, which is also an element of article-level metrics. Thus omitting the link skews the ability of readers and observers to evaluate the papers properly.
The current ecosystem of science communication has a scientific paper at its core, additions to the paper (e.g., notes, comments and ratings, as well as Supplemental materials, videos posted on Scivee.tv, etc) as a shell, and incoming and outgoing links - trackbacks, cited papers, citing papers, links to other papers in the same Collection, links to other papers with the same keywords, and yes, incoming links from the media - as connections building a network: the entire inter-connected ecosystem of scientific knowledge.
By not linking to scientific papers, traditional media is keeping itself outside of the entire ecosystem of empirical knowledge. By doing this, the traditional media is fast making itself irrelevant.
Reason Three: if an article in the media discusses a scientific study, that scientific paper is the source material for the article. If the link is missing, this is an automatic red flag for the readers. What is the journalist hiding? Why is the article making it difficult for readers to fact-check the journalist? Something does not smell good if the link is not provided (or worse, when it is impossible to figure out even who the authors are and in which journal they published - yes, that is more common than you think).
The instant and automatic response of the readers is mistrust. Every time you fail to link to the paper, you further erode whatever trust and reputation you still may have with the audience. You soon cease to be a legitimate source of information. Sure, most readers will not go hunting for the paper to read it in order to fact-check you. But two or three will, and they will let everyone else know if your article is trustworthy or not, either in the comments under the article on your own site, or on their blogs which will be quickly picked up by Google (remember: Google loves blogs).
So please, media types, hurry up and catch up with the world. The 21st century is already a decade in - you really need to do some very fast learning. Right now. Or you'll go extinct in a nanosecond. And despite my reputation, I never said that I'd consider that result to be a Good Thing. We are in this together, you just need to do your part. To begin with, start linking.
Excellent post, Bora --
I think this can be extended to include links to any researchers who are quoted in an article. Let the readers see their qualifications easily.
Access is an issue that influences linking. The majority of my readers have trouble accessing articles, because the majority of articles are in restricted-access journals -- even I can't get some of them. I can understand why a media outlet would hesitate to include links that very few of their readers will be able to use.
Of course, that doesn't apply to PLoS journals, which is a tremendous vote in their favor. It's also a reason why the current NIH open access requirement doesn't go far enough -- most people who would like to read an article won't return or remember six months later.
Agreed, it's a pain having to look up the article because they don't mention the name of the paper, usually just the journal and author. I hadn't even bothered to consider that this practice was a holdover from the days of print; it just seemed obstinate and inconsiderate.
Agreed. Good post, B. I know you mentioned this but don't underestimate this as a reason why MSM doesn't link. When the articles are written, the paper isn't online. The DOI solution you (and I) propose is the best workaround but it does piss off readers. There need to be some larger systemic fixes before article linking becomes standard practice.
Also links need to be standard in MSM op/eds too. As you say, it's a question of trust and legitimacy. I was struck recently, when reading an op/ed on journalism in Nature, that the complete lack of supporting links made the piece less valid in my eyes.
Totally agree about linking to original sources. One of the worst offenders is, sad to say, the BBC (of which I am usually a strong defender). Too many of their science pieces are anonymous too. I made this point at almost the same time as you (in the process of making a complaint, at http://www.dcscience.net/?p=2813).
Good point. Although I think a) some papers are written in a manner that is not easily accessible to the casual reader, so they just cover somebody else's coverage without checking the original, and b) a lot of writers, especially freelance ones, might not have fully paid-up access to a large body of scientific journals and may again rely on secondary sources.
From then on it becomes like Chinese whispers. I suspect few reporters actually read even easily available reports from end to end, simply because many are quite long and involved.
Cite the original work? Duh.
Hi Bora,
Excellent points. So actually I have a really ignorant question for you... On the science podcast site, we always link to the studies that we talk about, or at least to the abstracts for closed-access papers. But does the format of the link matter to its searchability? We just make the words "The study" clickable, but don't show the DOI or the long ugly link on our page. Does it make any difference?
Elsa
@John Hawks: Of course, the more links the better. We in the blogosphere have always intuitively understood that. Having plenty of links gives an immediate perception of reliability. Clicking on links and seeing that the author has actually interpreted them correctly boosts the trust from perception to reality.
But since the media is so averse to linking, I am suggesting in this post to start small - just add that one key link: the paper itself. Hopefully that one step will show them that linking out is a good thing and increases their reputation, and then they will start linking to other stuff as well.
Access is an issue. But as I noted in the post, it only takes one or two people (those who have access) to read the source material to report to the others if the journalist has done a good job or not. Trustworthy members of the community can evaluate the trustworthiness of the journalist for the others.
@razib: it is both. Because the age of print was their Golden Age (they were the gatekeepers), they are now obstinate about changing their ways, as each step towards transparency puts them in danger of having their errors exposed (it is human to err, but journalists do not buy that - they built their aura of absolute omniscience before the Web could check them). They are not having any of that!
@Ed Yong: The DOI solution is a technical solution which, in theory, should work. But it places the onus on journal publishers and their IT teams, so journalists can wash their hands, say "not our fault" and appear like saints (and they like to appear like saints). This is also a chicken-and-egg question. Since DOI resolving often does not work, the media has an (easy and false) excuse not to link. Since the media never links, the techies who should be fixing DOI resolution are in no rush to do it - there is no incentive.
@Laura: it is important to distinguish between beat reporters, who are given assignments on tight deadlines to cover an occasional science story, and the specialized science journalists, who are often freelancers and often excellent. I'd say, from my knowledge of the people in the business, that freelancers do a better job than newsroom folks who are constantly under the pressures of time and multiple assignments. Freelancers are usually on the press lists of journals, so they have access to most of the stuff. If not, they are REALLY good at bugging authors and/or journals and/or online buddies to get a copy. Freelance science journalists, unlike their always-in-a-rush newsroom colleagues, actually tend to read the literature carefully.
@Pascale: yes, it should be Duh. Not in their world, unfortunately. Read the link under the text "essential aspect of ethics of online communication."
@elsa: I don't think it matters much. If the link, in whichever form, brings the reader to the paper (OA or TA - then hopefully at least some will be able to access it), I think the exact form does not matter. I am not sure how it is for other journals, but every possible link to a PLoS ONE article always contains the string 'journal.pone'. I suspect the same is true for most journals.
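To illustrate (the article IDs below are made-up placeholders, and this is just a sketch of the substring check, not how any real system does it):

```python
def mentions_pone_article(text):
    """True if the text contains any form of a link to a PLoS ONE article."""
    return "journal.pone" in text

# Both the bare DOI link and the article-page URL contain the same string,
# so either form of link is caught by the same search.
# (The article ID 0001234 is a made-up placeholder.)
assert mentions_pone_article("http://dx.doi.org/10.1371/journal.pone.0001234")
assert mentions_pone_article("http://www.plosone.org/article/info:doi/10.1371/journal.pone.0001234")
```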
Remember, Bora, that a lot of the time the problem is less the journalist than the journal. I file stories with links and blogrolls -- but just try to get a publication to run those links and rolls, even with the online version of the story. Frustrating and crazy-making. Also inexplicable, yet there it is. Save us, Bora.
Yes, I noted in the post that good journalists keep trying to supply links, but the editors (or whoever is the boss who decides these things in the media) refuse. Keep asking, until they start relenting. Or send them this post ;-)
This is an excellent point. Both parties should just pull their fingers out and get on with making it better.
I agree they should get on with the technical solution. To elaborate on what I wrote in my article (there's a link on my name for the source!):
What I was thinking was that if people are using relational databases (and I really, really cannot imagine them not), it shouldn't be hard to code it so that each article has a release date and time associated with it. If a request for the article comes in prior to this date/time, the server responds by throwing up a standard 'not available' page (or a 'no such article' page if they want to hide it). If the request falls even a second after the release date, the reader gets to see it. Seems pretty obvious to me, at least the server aspect of it.
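Something like this minimal sketch, with hypothetical names, assuming each article record carries a release timestamp:

```python
# Sketch of the embargo check described above (hypothetical names throughout).
from datetime import datetime, timezone

def serve_article(article, now=None):
    """Return the article body if its release time has passed,
    otherwise a standard 'no such article' / 'not yet available' page."""
    now = now or datetime.now(timezone.utc)
    if now < article["release_at"]:
        return "404: no such article"   # or 'not yet available', if there is no need to hide it
    return article["body"]

# A request one second before the release time is refused;
# one second after, the reader gets the paper.
article = {"release_at": datetime(2010, 3, 10, 18, 0, tzinfo=timezone.utc),
           "body": "Full text of the paper ..."}
print(serve_article(article, datetime(2010, 3, 10, 17, 59, 59, tzinfo=timezone.utc)))  # 404
print(serve_article(article, datetime(2010, 3, 10, 18, 0, 1, tzinfo=timezone.utc)))    # full text
```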
Reading the quoted portion again, what an ugly use of a double negative (and incorrect at that; 'is' should be 'isn't'). No! No!! One thing I can't stand about writing blog posts is that I haven't time to put the articles aside for a bit before giving them a once-over, with the upshot that my blog writing is far below my own standards. Ugh. Double ugh. Excuse the rant; it's an issue that's increasingly frustrating me.
Yeah! This is really annoying. My local newspaper ran the cannibal bonobos story yesterday. Of course, I wanted to know more about it. But not only did they not link (they never do), they also didn't give the title of the original article, nor the name of the magazine/journal, nor the names of the researchers, nor the name of the university. All we were told was that German researchers saw it.
Yeah, at least we know that.
Of course, going to John Hawks' blog, I found it within seconds. But still, it's disappointing. And it's not the first time. I'm making the stats up, but I'd say that 90% of their science "news" is taken directly from professional press agencies, without editing, which, in turn, use the press release and make it even shorter and less informative.
And after that, I'm bound to have arguments with people around me because most of the time, I track the original and, of course, have to disagree with the way it's been presented. And, of course, everybody blames me for thinking I know better than they do.
My life is ruined because of science journalists (or their bosses)!!
Of course, this doesn't apply to people like Ed Yong, who does a wonderful job. Problem is, I don't think the people who read newspaper articles are the same as those who read his blog or the like. I'm subscribed to the ScienceNow page on Facebook and, god, are most of the comments boring or plain stupid.
Newspapers like to be consistent. If they add lists of cited references to some science articles, then readers will demand references for all articles in the paper and how could a newspaper justify refusing those reasonable requests? Then newspapers will become equivalent to scholarly journals and be priced accordingly. Every article will have a $49 price and a "Purchase Now" link to your shopping cart. It'll be worth it because you'll know they are not hiding anything.
"Every article will have a $49 price and a "Purchase Now" link to your shopping cart." Not if they are all Open Access. As they should be. As they will be sooner or later. That is another element in the chicken-and-egg game: more there are papers that are OA, more easily it will be to persuade MSM to link to them; more the MSM implements links, more the journals will feel the pressure to make their papers OA.
Here's a good example of this.
This media report from Thursday has attracted over 12,000 comments so far:- http://news.yahoo.com/s/nm/20100304/sc_nm/us_dinosaurs_asteroid
but there is no link to the AAAS Manuscript itself:- http://www.sciencemag.org/cgi/search?src=hw&site_area=sci&fulltext=schu… (requires free registration to access the PDF in question).
Why oh why is there no direct link?
My last comment did not appear here. Please however see this thread on FriendFeed:- http://ff.im/h3wvV
This is discussing a recent media report with over 12,500 comments thus far, but there is no link to the AAAS manuscript in question. The manuscript is almost OA - you only need to sign in with a free registration to access it.
--
I also wanted to say that last week I was speaking to BBC Scotland's chief health correspondent, whom I have known for several years. When I mentioned PMC (PubMed Central), I was told that they hadn't even heard of PubMed until I mentioned that resource to them a few years ago.
In this instance, the message is getting through, but very slowly....
I agree wholeheartedly. Since study details are rarely disclosed in journalism, linking to source material is a simple, clear way to build credibility and allow those readers who are interested to delve deeper into the research.
Research can be a very secretive process, though; it's the openness of a document and its resources that really drives a wider audience to pursue a subject. The inclusion of links to associated subjects can take any subject to a whole new level.
One aspect that can be confusing, though, is moving from research link to research link and then not documenting the path. In fact, I know of some medical researchers who start with the initial resource material and follow the trail from one link to another; when they have finished, they could save a lot of time if there were a chronology of resources compiled as bullet points. It helps later if you need to expand the subject.