Would limiting the number of publications per career revamp scientific publishing?

That's the thrust of an interesting editorial in Nature Medicine: what would you do if you could publish only 20 papers throughout your career? And how would it affect research productivity, scientific publishing, tenure review, and a host of other issues? More after the jump...

The editorial suggests this model for publishing:

These are the basic rules: whenever you get your first academic job (that is, the first lab of your own), you get 20 tickets. Every time you publish a paper, you hand over one of them. Once you run out of tickets, your publishing days are over. As simple as that.

If we adopted this model, many articles reporting incremental advances would no longer be written, and many specialized journals would disappear. And with far fewer papers to read, each one reporting a much more complete piece of research, search committees or funding bodies could directly evaluate the work of a given scientist, instead of (as is often the case) leaning on surrogate indicators such as a journal's impact factor or number of citations.

They mention, of course, issues with this--would "seminal" papers be exempt, for example? What about review articles? Obviously this isn't something that could be implemented anytime in the near future, but it nicely folds together discussions of access to scientific findings and issues swirling around "publish or perish" (leading, as they mention, to publication of the "minimal publishable unit" of data for a paper). While the idea is interesting, they don't mention a big con I see: losing quick access to new research findings in one's field. While the minimal publishable unit may result in smaller papers with lesser individual impact, it also allows others to learn about ongoing results more quickly, rather than waiting 4 or 5 years for the publication of a huge, all-inclusive paper. In this day and age, would it even be possible to wait for this type of thing? If some maximum publication number were adopted, would that lead scientists to release their data as they analyzed it, rather than save all of it up for one big publication?

There's a lot of agreement that the current system has lots of problems--but it seems that something like this is just as open (if not more so!) to abuse.


First author only? Or any co-authorship?
If the former, the solution is obvious, and is largely what's done already: pre-tenure-track first authors on most papers from collaboration groups.
If the latter, then you collapse almost all significant collaborations, and people switch to single-author papers, or papers with very few co-authors.

Now if we got 100 or even 200 tickets, then it might be realistic.

Maybe if you publish a paper that really knocks everyone's socks off, they give you a ticket back. That would further promote the productivity of really great scientists.

This system would change a lot. The number of journals would have to drop, and the number of papers in review would be much smaller (so a smaller reviewing burden for researchers). I think it would increase both the risk and the reward of scooping a subject.

Scooping would be a big problem. Imagine working for 4-5 years on a piece of research, holding off on publishing smaller chunks in preparation for the big one, only for someone to get there first. Surely this idea would only magnify the already devastating consequences of getting scooped.

Isn't there a risk that this could prompt scientists to be more guarded about their work and work against the type of chatty collaboration that fuels much of modern science?

Silliest idea I've ever heard of, as far as research is concerned, and obviously not informed by relevant data. Dean Simonton is the go-to guy on the study of genius, scientific and artistic creativity and productivity, and related areas. His *Creativity in Science* is a good, short reference / summary.

The most eminent researchers are also the most prolific: sheer number of publications is the best single predictor of the greatness of a person's work. However, their "hit rate" is, on average, the same as everyone else's. So, it's mostly by pumping out a lot of stuff that they manage to hit so many good ideas. Limiting publication number would straitjacket these eminent researchers--and, as elsewhere, it's mostly those at the top who produce most of the best work.

People would just move "publications" to conference proceedings, preprint servers, lab websites, university-published journals, and anything else that doesn't count toward the limit. The net effect wouldn't be to limit the number of publications; it would just fracture the how and where of publishing.

I wonder what the average number of papers published over a PI's career is. I'm not sure how 20 compares.

By Herb West on 08 Oct 2007

Realistically, this proposal would not work. Granted, there are some people who publish a LOT of rather worthless things. Grouping 10 of those papers into one might make for a useful publication. But other people are simply more productive, and they can publish 10 really good papers in the same time period. Some subjects lend themselves to more frequent publication than others; ten papers in one field are not the same as ten papers in another field. Maybe if there were more pressure on peer-reviewed journals simply not to accept incomplete research results, fewer papers would be published. But the real problem is in measuring research productivity primarily through number of publications. That has always seemed silly to me. Another measure of productivity needs to be developed for promotion and tenure consideration.

They're taking a simplistic look at publication. Publication isn't only a metric of how good a scientist is. It's also a way to distribute information.

I tend to think I have more than 20 papers of useful information in me.

Every complex problem has at least one simple solution, and it is wrong.

If the quality of the submitted articles is weak, then design a system that rewards higher quality work.

If the workload is killing the people in the publishing industry, find a way to harness technology to make the information more accessible to those who need it.

If people are having trouble keeping up (knowing about new research, reading the things that are most relevant to them, etc.), then develop a discipline devoted to reading, categorizing, and standardizing knowledge management, so that those who need to know can easily find what they need to know.

Marc Andreessen (tech entrepreneur; co-founder of Ning; of Netscape browser fame) wrote on his blog.pmarca.com blog about high productivity and high-quality productivity here:
http://blog.pmarca.com/2007/08/index.html
The research he was reviewing indicated that the most useful ideas came from those who produced the most, that the ratio of useful to mundane seems to be constant, and that limiting the brightest and best in their output would have an adverse effect on the total body of knowledge.

Publications for scientific information ought not to be driven by emotional reactions and prejudices. They should operate on the same "gather the data and test the hypothesis" lifestyle the authors live.

Just my $.02.

Ironically, this same data-overload issue is addressed elsewhere in the blogosphere today, from a different perspective.

The image I took away from the article linked below is that data/knowledge is expanding at such a rate that soon we will effectively have no data, because there will be so much of it that it can't be organized and made useful, given the sheer volume.
*****
Addressing this cogently, Rick Luce, the University Librarian at Emory Univ., and formerly at the Los Alamos National Laboratory, said in a recent National Science Foundation report on Infrastructures for Cyberscholarship [pdf]:

Typically, when hypertext browsing is used to follow links manually for subject headings, thesauri, textual concepts and categories, the user can only traverse a small portion of a large knowledge space. To manage and utilize the potentially rich and complex nodes and connections in a large knowledge system such as the distributed web, system-aided reasoning methods would be useful to suggest relevant knowledge intelligently to the user.
As our systems grow more sophisticated, we will see applications that support not just links between authors and papers but relationships between users, data and information repositories, and communities. What is required is a mechanism to support these relationships that leads to information exchange, adaptation and recombination - which, in itself, will constitute a new type of data repository. A new generation of information retrieval tools and applications are being designed that will support self-organizing knowledge on distributed networks driven by human interaction.

http://radar.oreilly.com/archives/2007/10/sensing_wars.html

The full article cogently raises questions about the integrity of the human knowledge base if powerful interests are allowed to solve the problem of knowledge management in private instead of in full sunlight. (It was prompted by the recent Israeli attacks on Syria; read the entire blog post--the connections become clear about 3/4 of the way through, as what starts out as a military concern shifts into a concern about what we know, how we can be sure of it, and how we can access it.)

My god, is that a terrible idea. Imagine the years of wasted effort and self-censored data that might have turned out to be important in the future but wasn't deemed worthy of spending a ticket.

If you were a crap scientist, or were leaving science, could you trade in your tickets in some sort of carbon-market-like arrangement?

I can only imagine that this would stifle creative and hard-working people. Limiting the proliferation of journals might be a better way to do it. But then again, I don't understand the LPU (least publishable unit) thinking that the salami slicers are engaged in. I'd rather put all of the data in one paper and shoot for a really good journal.

What would I do? Reserve five tickets for myself, then auction off the remaining fifteen at a very high price to tenure-track people desperate to pad their resumes. After all, if just one more real publication can make your career, it's got to be worth quite a bit--how would six months' salary strike you?

And the five tickets? Enough to establish my credibility as a scientist, then go into an industry career, well padded by the nest egg from the fifteen tickets I auctioned off.

What would happen to graduate students? Most departments require at least one publication as first author, and the PI is on that paper too. Limit PIs to just 20 papers, and either there will have to be a great reduction in the number of graduate students, or a significant increase in the number of faculty, or the requirements will have to change. The last option is really not a good idea, since graduate students should learn how to write scientific articles; the best way is to actually write one that matters and go through the process, and ideally you should do this multiple times in grad school to really get the lay of the land. The reduction in the number of grad students would have to be so drastic that it would do more harm than good. The increase in faculty would require more funding, and why would there be more funding for less output? As you note, incremental publication does put data into the scientific community to digest and discuss, which in theory can lead to others working on the material, coming up with new ideas, etc., which can lead to those big advances.

Ponderingfool: you forget the easiest fix: don't require the PI's name on every paper. After all, in many countries the head of the lab is not automagically added just because they're the one who runs the project; only the people who actually contributed directly to the specific paper are listed (and giving it a quick glance just before sending it off doesn't count).

Fish got to swim.
Birds got to fly.
Scientists got to publish.

Seriously, as much work as it is and as much crap gets published, we wouldn't go into science if we didn't love to discover things and then talk about what we did. Right now publications are the primary means for disseminating our discoveries, but if publications were limited, we'd just find other ways. It's what we do.

Having to develop a mental filter for bad work is the price we pay for the ability to publish our own (possibly bad) work, and it's a price I, for one, am more than willing to pay.

Like carbon trading and similar schemes, the right to trade these 20 tickets would serve to increase beneficial outcomes. The prolific could purchase additional tickets from those who are not. Of course, one could only be prolific if the benefit of obtaining the additional ticket exceeded the cost. In other words, the paper you wish to write had better be worth it.

It's a brilliant way to separate the wheat from the chaff. Your reputation would then drive the price you pay for tickets. If your peers were tired of reviewing insubstantial papers, they would charge you ever-increasing amounts for additional tickets in the hope that you would stop.

Those who tire of their field of study, miss tenure, or are otherwise unproductive could cash in their remaining tickets and live on a beach for a couple of years before going on to do something else--thus decluttering the field and, of course, raising the wages of the remaining, scarcer researchers.

If we adopted this model, many articles reporting incremental advances would no longer be written,

Incremental advances are incredibly important in science. Science doesn't generally work by leaps and bounds, nor should it be forced to. Ten papers that each add a 10% improvement over the previous technology will, when combined, compound into a huge difference. The faster each gets published, the faster they can be used and the faster the science will progress.
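To put a number on that compounding (my own back-of-the-envelope arithmetic, not the commenter's): ten successive 10% improvements multiply rather than add, so

$$(1.10)^{10} \approx 2.59,$$

roughly a 2.6x overall improvement, rather than the 2x that simple addition would suggest.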

This is one of the daftest ideas I have ever heard of, for many of the reasons already expressed by various posters on the site, but also for another one: who can predict when a scientist is going to die, or otherwise become unable to work? If one used all one's tickets in one's 20s, 30s, and 40s but then, shock horror, managed to make it to the venerable age of 50 with no more tickets left, what's one to do? Never publish again? But if one adopted a cautious strategy and kept some tickets back for a late-life surge, but was then run over by a bus, what a waste of tickets that would be! Or if one were initially cautious, but contracted a usually fatal illness with a slow course (e.g., a slowly progressing cancer) and felt the need to "use 'em or lose 'em", what would then happen if one recovered (say, because a new drug was developed)? What a bittersweet fate!

This is a crazy idea. There ARE problems with the peer-reviewed literature and the publishing practices that prevail today, but this "solution" is very far from useful.

By John Moore on 29 Oct 2007