Tired of submitting to the same old journals? JournalGuide wants to help.

I just got back from the big Science Online conference in Raleigh, NC, where I ran into someone working on the JournalGuide project. What is it, you ask? It’s a website where you enter your manuscript’s title and abstract or keywords and get back a list of journals that publish similar content, along with information on cost, impact factor, open access status, etc.

Who will this service help, you may ask? After all, most of us researchers are already pretty familiar with the journals in our field. I think there may still be a market for it, though.

Some of us work in small fields with relatively few journals. I publish in one field that has a single go-to journal for solid-but-not-paradigm-shifting manuscripts. If your article happens to get rejected there, the options get unattractive pretty quickly; in terms of readership, the audience for your findings drops precipitously. And since there is only that one journal everyone wants to publish in, a lot of us end up submitting there over and over. It gets boring! For these reasons, we’ve lately been trying to come up with some new journal options, and I think JournalGuide could help in a situation like this.

A feature I’m more interested in, though, and one that isn’t functional yet, is the journal rating system. You can create an account and anonymously rate your submission experience (the site is supposed to keep track of postings to weed out trolls). At some point, JournalGuide is supposed to aggregate the results and make them available. I’m not sure when this will happen (the site says late 2013, and I’m writing this in March 2014), but I’m looking forward to it!

Right now, most of us depend on word of mouth to figure out which submission processes are so onerous that it’s best to just avoid a journal altogether. You know what I’m talking about. The two-week review process that somehow turns into four months. The editor who seems to regularly lose track of submissions. The journal that wants nothing to do with negative results, whatever its stated policies. Then again, there are also the journals that end up shocking you with just how smooth their submission process actually is. It would be nice to have a more systematic way of collecting and sharing information about how efficient and fair a journal is. I’m curious whether this rating system will catch on, since its success really depends on how many users buy in.

Anybody else used this site or planning on using it?