Citation stacking

From the dept of general-fun-but-with-a-serious-message: Retraction Watch on a somewhat unusual case: "Journal retracts two papers after being caught manipulating citations":

Mauricio Rocha-e-Silva ... and several other editors published articles containing hundreds of references to papers in each others’ journals — in order, he says, to elevate the journals’ impact factors. Because each article avoided citing papers published by its own journal, the agreement flew under the radar of analyses that spot extremes in self-citation — until 19 June, when the pattern was discovered. Thomson Reuters... had designed a program to spot concentrated bursts of citations from one journal to another, a practice that it has dubbed ‘citation stacking’. Four Brazilian journals were among 14 to have their impact factors suspended for a year for such stacking. And in July, Rocha-e-Silva was fired from his position as editor of one of them, the journal Clinics, based in São Paulo.

There's the other side, of course:

Rocha-e-Silva says the agreement grew out of frustration with his country’s fixation on impact factor. In Brazil, an agency in the education ministry, called CAPES, evaluates graduate programmes in part by the impact factors of the journals in which students publish research. As emerging Brazilian journals are in the lowest ranks, few graduates want to publish in them. This vicious cycle, in his view, prevents local journals improving.

OTOH... what, nowadays, is the point of having local journals, other than to enhance the CVs of local scientists and institutions?
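Thomson Reuters hasn't published how its "citation stacking" detector works, but the article's description — concentrated bursts of citations from one journal to another, with self-citations excluded — suggests something like flagging journal pairs where one journal supplies a disproportionate share of another's incoming citations. A toy sketch of that idea (the threshold and minimum-count cutoffs are illustrative assumptions, not the real criteria):

```python
from collections import Counter

def stacking_pairs(citations, threshold=0.25):
    """Flag ordered (citing, cited) journal pairs where the citing journal
    supplies a disproportionate share of the cited journal's incoming
    citations. `citations` is a list of (citing_journal, cited_journal)
    tuples. Self-citations are ignored, which is exactly why the Brazilian
    arrangement slipped past self-citation checks. The 0.25 threshold and
    the minimum of 3 citations are made-up illustrative cutoffs."""
    pair_counts = Counter((a, b) for a, b in citations if a != b)
    incoming = Counter(b for a, b in citations if a != b)
    flagged = []
    for (citing, cited), n in pair_counts.items():
        share = n / incoming[cited]
        if share >= threshold and n >= 3:  # ignore tiny samples
            flagged.append((citing, cited, round(share, 2)))
    return flagged

# J1 supplies 8 of J2's 10 external citations; J2's self-citations don't count.
cites = [("J1", "J2")] * 8 + [("J3", "J2")] * 2 + [("J2", "J2")] * 5
print(stacking_pairs(cites))  # → [('J1', 'J2', 0.8)]
```

A ring of cooperating journals would show up as several such pairs at once, each individually invisible to a self-citation check.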


In Japan, it's a sort of national pride thing. No-one here cares about scientific impact in the standard western sense; it's much more about bolstering Japan's self-image. Of course, it's also an opportunity for CV-filling too.

[I suspect that's the only reason the Brazilians have their journals, too. In the Japanese case I presume the national ones are in Japanese - thus guaranteeing that no-one else will ever read them? -W]

By James Annan (not verified) on 02 Sep 2013 #permalink

My first reaction was "sheesh! yet another way to cheat!"

On further thought, there's something inherently broken about any system that depends on a "MORE" algorithm, as in "more impact-factor than someone else." At minimum it's a misapplication of Darwinian selection, and at worst it becomes generalized in the culture and produces exponentials, which both Einstein the pacifist and Teller the Cold Warrior agreed are "dangerous math" (as the climate crisis demonstrates in a different context).

What's needed are fixed and objective standards for performance, such as "quantity X of new research output over quantity Y of time," with allowances for projects that take more or less time than an objective average. Another (probably better) way to do this would be "quantity X of time per year" devoted to research, with bonus time credit for publication in top journals, and time penalty for rejection from publication, with the exception of rejections due to null findings (to compensate for journal bias against reporting null results: these need to be reported if for no other reason than to save time-wastage at duplicated effort).

In general the academic world suffers from the "artisan labor syndrome" whereby "love of one's work" is expected to trade off for lower pay and/or longer hours and/or other onerous conditions. If the top-level administrations of universities expect to be paid on a corporate level, then consistency requires they treat their employees in a manner consistent with corporate best-practices: objective requirements, clear work rules, no fuzzy subjective stuff in the mix.

That won't eliminate all forms of cheating, but it will remove the "I'm being screwed so to hell with the rules" incentive for some of it.

["love of one's work" is expected to trade off for lower pay... - well isn't this fair enough? Pay is (a) to keep you alive and (b) to keep you from leaving for a better-paid job that you'd like equally well. Higher pay can compensate for poor working conditions. But the flipside is that low pay can be compensated for by good working conditions - nice comfy office, access to internet, few deadlines - and a job you love. Quite a few people who work for BAS stay there because of the trips South -W]

J journals come in a variety of flavours - E, J or even bilingual - but it hardly matters. The impact factors are miserable (insert std disclaimer about the value of citation metrics - but if you only get one citation, you probably didn't have much impact!)

I just politely declined an invitation to co-author in a *new* J journal...really, why do they bother?

By James Annan (not verified) on 03 Sep 2013 #permalink

For the same reason that junk mail and spam work. The cost of an email is zilch, if the yield is 390 ppm profit.

By Eli Rabett (not verified) on 06 Sep 2013 #permalink

Japanese journals (also Korean/Chinese/etc) in Eli's fields are important in the following sense: they are not terse. One often finds important details. Of course, those are the translated ones, but Eli can read a graph, which helps.

Before the fall of the Soviet Union, one often found the same thing in Russian journals, which made them worth looking at if you had a particular point that you were looking for. The Rabett wonders if Steve McI was smart enough to go hunting for Yamal in those places.

By Eli Rabett (not verified) on 06 Sep 2013 #permalink