Cultural confusion: white papers vs. peer review

One of the greatest shocks when I started working in industry was the realization that the peer-reviewed paper, the most valuable form of currency in the academic world, was valued so little.

In academia, there is a well-established reward system for getting your work published in peer-reviewed journals. Whether or not you get hired, get money to do research, or get to keep your job (i.e. get tenure) all depend on your ability to get papers written and accepted by your peers. (Community colleges are an exception; there it's your teaching ability that matters, not your publications.) After years of academic indoctrination and soaking up academic values, it was pretty depressing, in some ways, to learn that having peer-reviewed publications no longer mattered.

In the commercial software world, peer review has little or no value. Many people don't know what it is, and if they do know, they don't care. The business people seem to think that scientific papers are written by ghost-writers, and the software engineers think they're written by people from marketing.

You mean, a white paper and a peer-reviewed scientific publication aren't the same thing?

No! No! No!

Just to set the record straight, white papers are marketing publications that serve to explain the technology used in a product. Peer-reviewed publications are scientific articles that must be read and accepted by other scientists. Peer review is not a perfect system, but it does have meaning, at least to other scientists.

I also learned that there are some very good reasons why many companies don't bother with the peer-review system. Mostly, there aren't many incentives for software companies to publish peer-reviewed papers. These papers take an incredible amount of time to write and revise. For a business, the price of that time is often too high and the return on the investment is too low. A white paper, which doesn't require peer review, can be written and distributed at a much lower cost and in much less time.

So, I'm quite happy that now, thanks to BPR3, if we blog about peer-reviewed research, we have some new icons that we can use to identify what's what.

If you are a reader, look for this icon to find out if a post is about peer-reviewed research.

Blogging on Peer-Reviewed Research

If you are a blogger, you can use this icon to let your readers know that you're writing about research work that was strong enough to withstand review.

The guidelines and icons are here.


Another "problem" with accepting peer-review in industry is the business secrecy. From my experience, it just seems incomprehensible for industry people to have anyone outside the company reading through internal unpublished research material to say if it has any value and future prospects. It's like a marketing suicide. The problem is that R&D in industry is completely "closed" and registered through patents (or abstract white papers), while academic research is completely "open" and registered through peer-reviewed papers. The reason is that their goals are different (economic profit vs pure knowledge).

dave X ... I couldn't find anything in the guidelines to prevent this, but when possible, I would recommend downloading the icon and serving it up yourself. That way if they change the location of the icon, it won't impact your posts.

I like this icon. Need to look at the guidelines, etc more carefully, but it reminds me of using the CC icon in some ways.

As a software product manager, I had the following philosophy about publications and about getting the science/technology we were working on into the public domain.

1. Case studies. Ideally these would be done by your customers and, in a perfect world, would actually be published work converted into a short summary, usually showcasing how the software was used (as opposed to the paper, where the software might be just one part of the study).

2. White papers. Require more work, but are also, at least for me, a way to highlight your philosophy and approach to either software design or scientific problems.

3. Peer-reviewed publications. I got to publish two papers while I was in product marketing. One was a review article, and the other was a peer-reviewed article that was pure science; the fact that it used our own software was somewhat of an afterthought. You need to do this to keep your internal scientists engaged, and yes, it's an opportunity for marketing as well, but only after the fact.

4. Conferences. IMO, this is the best way to do scientific marketing. You can defend yourself, gain some credibility, etc. There are trade secrets and additional material that you don't want to share with others; that's fine. But if you're doing science with them and achieving results, you'd better share those results, especially as a scientific software vendor.

One has to be realistic. Your primary responsibility is software development, but to attract good talent, especially scientific talent, you can't close the door on publication. That's why companies that encourage scientific publications as part of their internal scientists' goals tend to hold on to those scientists. Those people also need to understand that publishing papers is not their primary responsibility; if that's all they want to do, then a company is probably not the best place for them.

Thanks, Deepak and Harris.

I'm not really complaining about company publication practices. I just think it's important for people to know that there is a difference between the kinds of publications and the criteria that are used for review.

Dave, Deepak,

The BPR3 site should be a bit quicker now.

Dave Munger, one of the people who initiated BPR3, asks that people use the code and not just the image, for two reasons.

First, we want to track and aggregate the posts that use it.

Second, we want it used for science articles only and want to prevent it from being used to promote pseudoscience: creationism, intelligent design, astrology, etc.

The problem is that R&D in industry is completely "closed" and registered through patents (or abstract white papers), while academic research is completely "open" and registered through peer-reviewed papers.

Note that the original claim referred specifically to the commercial software industry. It may well be true in that context (I have no idea) but there are large swathes of "industry" that publish extensively, and certainly know the difference between a white paper and a reviewed article.

Incidentally, one thing I've learned about from science blogs (especially from commenters) is this odd reverence some people have for peer review. On the scientific totem pole, talks at decent conferences are more prestigious than papers in the large majority of peer-reviewed journals -- one certainly wouldn't trade a Keystone talk for some stupid JBC paper.

Also incidentally, the whole point of patents is precisely that they're "open".

J: A large fraction of ScienceBloggers are in academia. Whether people agree with this system or not, the peer-review system is pretty important in the academic world. Without peer-reviewed publications, you cannot get a job, or tenure, or funding to do your work.

I agree about the differences between the kinds of companies. Biotech and pharmaceutical companies do publish more than software companies. I guess that's part of marketing. :-)

I guess Sandra already helped to clarify some points, namely: "Biotech and pharmaceutical companies do publish more than software companies." I work in a large pharmaceutical company on the discovery research end, and our management states clearly that they expect two peer-reviewed publications from each PhD scientist in this organization (this goal factors into our yearly performance review!). Based on the last figures I saw, most departments achieved that goal last year. I also know that some biotech companies (I was at one before going into pharma) have significant financial incentives for scientists to publish their research. There is, and will always be, a conflict between a for-profit entity's interest in protecting its turf (patents) and its interest in sharing its findings with the rest of the (scientific) world.

Whether people agree with this system or not, the peer-review system is pretty important in the academic world. Without peer-reviewed publications, you cannot get a job, or tenure, or funding to do your work.

Absolutely. But what I meant was that many people seem to think that credibility depends overwhelmingly (in reality, not just in some abstract theory) on peer review. When Professor X presents unpublished results at a conference, and then one reads the paper a few months later, one doesn't normally think "Gee, this is much more believable now that three mysterious peers have approved it!" On the contrary, persuading an audience of hundreds or thousands of peers is probably a much tougher test than is getting buy-in from three people you had input into choosing.