Peer review for blogs?

Amardeep Singh suggests that bloggers might benefit from some form of peer review:

The idea came to me as I've begun preparing a tenure file at my current university, acutely aware that my blog writing cannot be considered "peer-reviewed" publication by any current standard. Even the rewards of occasional Boing-Boing-ish popularity (my post on "Early Bengali Science Fiction" from a while ago, for instance) do not help, since that is really popularity rather than review. But why not institute a review of some sort?

But how to go about it? Getting scholars to review academic work is often like pulling teeth: They all know it needs to be done, they know it's important, but they're going to put it off as long as possible. Singh thinks he has an answer:

My idea is to have a system of academic blog reviewing, where people self-select individual blog posts they've written for review by others, perhaps using a combination of Technorati tags and emailed links. The reviewers could consist of fellow bloggers (credentials no bar) as well as non-blogging academics in a given discipline, who would publish their reviews on a central site. The reviewers could choose to be "onymous" or pseudonymous (as long as it is a consistent pseudonym, and contact information is available to site admins), and be asked to write a significant evaluation of the post in question (say, 250 words). Other reviewers and readers of the reviews could also evaluate the reviewers' comments, as a way of maintaining standards for reviewers. Troll-like, unfair reviews would be deleted, and their authors denied reviewing privileges.

He envisions this site as sort of an upscale Digg, where substantive criticism is preferred to "cool" or "not cool" rankings. I can imagine something like this working, but I still wonder if it might feel a bit too much like open peer review: unglamorous and unrewarded. But what if participation in the peer review community were as valued as "publishing" there? While some bloggers may prefer to write and submit substantive posts, others may develop a knack for evaluating those posts, and if there were also a way to evaluate the evaluators, these reviewers might also be able to submit their work as part of a tenure file.
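
Purely as an illustration, here is a rough sketch (in Python) of the kind of records such a site might keep: a self-nominated post, a reviewer writing a roughly 250-word evaluation, and meta-ratings of those reviews feeding a reviewer-reputation score. All of the class names, fields, and the cutoff value are invented for this sketch; nothing here describes an actual implementation of Singh's proposal.

    from dataclasses import dataclass, field
    from statistics import mean

    @dataclass
    class Review:
        post_url: str      # the self-nominated blog post under review
        reviewer: str      # real name or a consistent pseudonym
        text: str          # the ~250-word evaluation
        meta_ratings: list[float] = field(default_factory=list)  # readers rating the review itself

    @dataclass
    class Reviewer:
        name: str
        reviews: list[Review] = field(default_factory=list)

        def reputation(self) -> float:
            """Average meta-rating across this reviewer's reviews:
            the 'evaluate the evaluators' part of the proposal."""
            ratings = [r for rev in self.reviews for r in rev.meta_ratings]
            return mean(ratings) if ratings else 0.0

        def may_review(self, cutoff: float = 2.0) -> bool:
            """Reviewers whose reviews are consistently rated as unfair
            or troll-like lose reviewing privileges (cutoff is invented)."""
            return not self.reviews or self.reputation() >= cutoff

The may_review cutoff is just a stand-in for the human judgment that would actually revoke reviewing privileges from unfair or troll-like reviewers.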

As I say in my open peer review post, all this is going to require a change in the way academic rewards are meted out, and changing that system is always slow going. But perhaps as more people like Singh make their way into the upper echelons of academe, we'll see something like his vision implemented.

One other note: a commenter mentioned the ScienceBlogging anthology as an example of peer review of blog posts in action. Maybe other disciplines should consider something similar.

(via Peer to Peer)


I'd love to support such a project, and I would like to have it connected to a more conventional paper-rating site, like Google Scholar. One way to generate some incentive is to award kudo-points for good reviews. However, such a system has to rely on existing hierarchies, and might face a delicate challenge in trying to favor inter-disciplinary attempts.

I think it's a great idea to have some peer review, but I think it would be quite difficult to "review the reviews," that is, to separate the quality criticism from the bad, especially when the final editor of the post would still be the same (Dave). I feel like Cognitive Daily already has an edge over sites like Digg, because most of the comments received are from more dedicated and knowledgeable readers, who usually offer an adequate critique of the post's quality. Other readers can see these comments, so I think we already have a decent peer-review system in place.

Peer review? Simple fact-checking would be a coup. The stuff some bloggers write without bothering to do any checking at all, including some here on ScienceBlogs. And they let comments that are just plain factually wrong go by... For the reasonably well-educated, blog content is at best buyer beware. If bloggers think they have no responsibility for factual accuracy, then they should label all content as such: it is completely possible that nothing in this entry has any factual basis whatsoever and everything here may be entirely wrong, because I couldn't be bothered to check. And by the way, the comments are equally suspect, if not more so.
Assume error unless proven otherwise.

What compounds this carelessness is the conceit of some bloggers that they are journalists or that the blogosphere is tantamount to the traditional press. True, traditional journalists make errors. But at least they have a standard of accuracy and they try to fact-check. Their incidence of error is orders of magnitude lower than the rate seen even on blogs written by the well-educated writers here on SB.

Seed's editors ought to be concerned about having their reputation colored by some of the stuff that appears on the blogs they sponsor here.

Flame away.

Ellen,

Your points are very well taken. You're absolutely right that many blogs aren't fact-checked. It's well known that blogs are often unreliable sources of information. But I'd submit that's a good argument for peer review of posts -- then at least we'd have some sense of when a particular blog post is well thought out.

Fact-checking and monitoring of comments is more difficult. Comments are a two-way street -- often commenters identify problems in my posts that I missed. And commenters also correct each other. Further, presumably readers are aware that commenters aren't necessarily experts, even when they claim to be.

If I intervened every time I felt a comment was off-base, I'd spend more time policing the comments than I do writing posts. I can't speak for the other blogs, but I do try to moderate comments for inappropriate language, and I try to respond to very inaccurate comments when other commenters don't step up.

You're right that Seed is gambling by sponsoring a site that isn't quite up to the level of "real" journalism. I think they're betting that their readers are smart enough to know the difference between a well-thought-out post by an expert in her field and an offhand comment by an overly ambitious blogger.

Cognitive Daily attempts to maintain standards that are close to the level of mainstream media (and we think our standards exceed those of most media outlets when we're reporting on peer-reviewed research), but please let us know if you believe this blog in particular isn't living up to those ideals.

Thanks for the link. It's interesting to find that my post has been picked up more by science bloggers such as yourself than by people in literary studies!

One thing that really interests me here is the "Scienceblogging Anthology," which is a limited project. Having a specific, finite goal seems like a great way to get around the problem of lack of motivation for reviewers in the "open peer review" process.

Ellen: Blogs need not be marked 'unsafe' for the same reason the traditional press need not be marked 'unsafe', nor should peer-reviewed publications be marked 'safe'. In fact, you nailed it in your response: it's a case of "buyer beware."

As Dave suggests, blogs are simply another form of communication; the page is not static. It allows for affirmation, rebuttal, or opinion from readers. And, as in the best "proper" scientific forums, even the input of a respected scientist is shaped by opinion or, sadly more often, one-upmanship.

Anybody doing serious work should be following up on all links back to peer-reviewed publications, if they're even reading science-oriented blogs at all. Scientists are trained to be sceptical, right? Then the same applies online. Critical thinking is required to get the most out of any paper. Likewise for blogs, TV, newspapers, radio, word-of-mouth, etc., which are all media attempting to "bitesize" years of hard labour.

Joe Average might be reading to learn something of interest. It is generally quite clear which blogs are good quality and which are not, in the same way that tabloids don't appear as reputable as the broadsheets. The assumption that one is more accurate than the other is often incorrect, but it is, you'll agree, a decent heuristic to play by.

The idea is interesting. It should be as similar to commenting as possible: for example, you are logged in to a service similar to cocomment.com; let us call it coreview.com for now.
coreview.com modifies the form in which you post a comment to a blog entry (in the same way cocomment does) by adding the few fields needed for the review: "how relevant is this" (scale 1-10), "how well written is this" (scale 1-10), "which scientific topics is this about" (up to 5 tags). The comment/review you write is then automatically aggregated by coreview.com, where the usual aggregations can be shown: "best posts in the last 24 hours," "best posts per topic," and so on.
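
To make that concrete, here is a minimal sketch of what such a review record and the "best posts" aggregation might look like. The service, field names, and scoring rule are all assumptions made for illustration; this is not an actual coreview.com (or cocomment.com) API.

    from dataclasses import dataclass
    from datetime import datetime, timedelta
    from typing import Optional

    @dataclass
    class ReviewComment:
        post_url: str
        relevance: int        # "how relevant is this", scale 1-10
        writing: int          # "how well written is this", scale 1-10
        topics: list[str]     # up to 5 discipline tags
        posted_at: datetime

    def best_posts(reviews: list[ReviewComment], topic: Optional[str] = None,
                   window: timedelta = timedelta(hours=24)) -> list[tuple[str, float]]:
        """Rank posts by their mean (relevance + writing) score over a recent
        window, optionally restricted to a single topic tag."""
        cutoff = datetime.now() - window
        scores: dict[str, list[int]] = {}
        for r in reviews:
            if r.posted_at < cutoff:
                continue
            if topic is not None and topic not in r.topics:
                continue
            scores.setdefault(r.post_url, []).append(r.relevance + r.writing)
        ranked = [(url, sum(vals) / len(vals)) for url, vals in scores.items()]
        return sorted(ranked, key=lambda pair: pair[1], reverse=True)

Summing the two 1-10 scales is an arbitrary choice; any weighting, or a separate ranking per scale, would work just as well.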
One of the big problems I see: all the reviews are going to be very, very positive. There is no point in my spending time writing a bad review of someone, especially if that person is an "authority," and especially if I do it under my commonly known pseudonym.
Anyway, I'm very interested in the idea...

"all the reviews are going to be very very very positive. there is no point for me to spend time writing a bad review of someone, especially if that person is an "authority" and especially if I do it with my commonly known pseudonym.
Anyway I'm very interested in the idea...

I think the point is that non-authorities could nominate their own posts and have an opportunity to become seen as "authorities" -- the reviews would allow others to separate the whackos from unrecognized talent.