Jane - the Journal/Author Name Estimator

Jane is the cool new tool that everyone is talking about - see the commentary on The Tree of Life, on Nature Network and on Of Two Minds.

In short, the Journal/Author Name Estimator is a website where you can type in some text and see which scientific journals publish content closest to the text you input, as well as authors who have published on similar topics. If you click on "Show extra options" you can narrow your search by a few criteria, e.g., you can restrict the search to Open Access journals.
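For the curious, here is a minimal sketch of how this kind of text-to-journal ranking could work in principle - comparing the input text against a per-journal corpus by TF-IDF cosine similarity. This is not JANE's actual implementation (which I haven't looked at); the journal names, text snippets, and function names are invented for illustration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Each journal is represented by the pooled text of its recent papers.
# These snippets are invented placeholders, not real corpora.
journal_corpora = {
    "PLoS ONE": "circadian rhythm clock gene expression zebrafish behavior",
    "J Biol Rhythms": "melatonin suprachiasmatic nucleus entrainment light pulse",
    "Harvard Business Review": "management strategy leadership innovation markets",
}

def rank_journals(query_text, corpora):
    names = list(corpora)
    vectorizer = TfidfVectorizer(stop_words="english")
    # Fit on the corpora plus the query so all rows share one vocabulary.
    matrix = vectorizer.fit_transform(list(corpora.values()) + [query_text])
    # Cosine similarity between the query (last row) and each journal row.
    scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
    return sorted(zip(names, scores), key=lambda pair: -pair[1])

abstract = "We measured circadian clock gene expression in zebrafish larvae."
for journal, score in rank_journals(abstract, journal_corpora):
    print(f"{journal}: {score:.2f}")
```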

The idea is to discover journals to which you can submit your work. Most people already know the standard journals in their field, but this is a way to discover new ones, see which are Green or Gold OA, or find a home for a manuscript that has already been rejected by all of your usual venues ;-)

Another use is for editors of broad-topic journals: finding relevant referees for incoming manuscripts.

So, first I did the obvious test of the site - I copied and pasted the abstract of one of my papers. It gave the correct journal at 100% confidence, and all four co-authors at an equal split of 25% each. So it works on that count. The other people further down the list are also relevant researchers who would be appropriate reviewers for such a manuscript.
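That clean 25%-per-co-author split suggests a simple scheme: divide each matching paper's similarity score among its authors, then normalize. That is a guess on my part, not JANE's documented method, but here is a toy sketch under that assumption (papers and scores invented):

```python
# Toy sketch of one way per-author "confidence" could be derived from the
# same similarity search: credit each matching paper's score to all of its
# authors and normalize. An assumption, not JANE's documented method.
from collections import defaultdict

top_papers = [
    # (similarity score, authors) for the k best-matching papers
    (0.92, ["Author A", "Author B", "Author C", "Author D"]),
    (0.40, ["Author B", "Author E"]),
    (0.35, ["Author C"]),
]

def author_confidence(papers):
    totals = defaultdict(float)
    for score, authors in papers:
        for author in authors:
            totals[author] += score
    grand_total = sum(totals.values())
    ranked = sorted(totals.items(), key=lambda item: -item[1])
    return [(author, total / grand_total) for author, total in ranked]

for author, conf in author_confidence(top_papers):
    print(f"{author}: {conf:.0%}")
```

Under this scheme, a single perfectly matching paper with four authors would give each of them exactly 25%.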

Then I copied and pasted a small chunk from my unpublished dissertation and got a list of potential reviewers that was pretty much perfect - people I'd suggest if asked.

Then, I typed in a bunch of terms that I know occur frequently in PLoS ONE in different papers, all mixed up - and PLoS ONE came up high on the list (and first among OA journals). So far so good.

Apparently, if you paste non-scientific text, you always get Harvard Business Review - they'll take anything silly. Figures.

So, what would you use it for? Is it useful?


An obvious application for me is to type a lot of words relevant to your research (general area, particular method, considered applications...) and explore the state of the art. The emphasis is on the "lot of words" (try using more than 10 words related to your topic in a generic search engine, or even in the search engine of a particular journal...). It might be especially interesting when exploring new directions (has something similar already been done?) or when reviewing a paper in a field you are not entirely familiar with, particularly to assess the novelty of the approach.

On a side note, I find it interesting (in a positive way) that when I copied and pasted the abstract of my only published paper, the names of my co-authors, who have more publications (on similar issues) than I do, appeared with more confidence than mine, even though I am the primary author of said paper.

I tried inputting the title and abstract of this paper itself into the tool. Result: the journal this paper was published in was not in the top 15.

Nowhere near a rigorous test, but I think it may indicate the following: this tool, even when accurate, does not help with finding the right journal if the paper itself is "original" - that is, if it cannot reasonably be matched to other work just by comparing the words in the title/abstract.

With the exception of the above, I think this tool is good for finding reviewers, checking whether there is a case of plagiarism and, perhaps most significantly, verifying whether a paper is in fact original - ironically, taking advantage of the tool's own weakness. If none of the journals or matching papers are anything like it, you know you have gotten your hands on at least an original paper. Whether it is good/significant is another issue, but at least it is original.

Of course, this is all assuming the tool is accurate to begin with.

I tried the same experiment as Coturnix did, with a senior/corresponding-author publication of mine from last year. JANE came up with the correct journal, but the confidence was relatively low, and the confidence for the authors (including myself) was also low. Perhaps the latter can be explained by the fact that the paper grew out of a research direction very different from my other publications, and I had never published in that journal before. Ironically, the second-best journal was one to which I'd submitted an earlier version of the manuscript, and which had summarily rejected it without review. Oh well, I was (briefly) angry about the rejection, decided to do additional experiments, and ended up with a pretty nice paper in the end.