A few months ago, Matt and Darren saw a picture someone had done of an Apatosaurus with huge neck-flaps. Since then, they’ve tried to find it again, but without success. Then, happily, I stumbled across it in this All Yesterdays review, so here it is:

[Thumbnail: Emiliano Troco’s Apatosaurus with neck flaps]

Unfortunately, I can’t tell you much about it. I know it’s the work of Emiliano Troco, but I’ve not been able to find his web-site, nor a description of the piece, nor a version in a decent resolution. So all we have to go on at the moment is this thumbnail. If you know more, please leave a comment!

Is it credible? Who’s to say? The one thing we know for certain about Apatosaurus is that it had truly crazy cervical vertebrae, unlike those of any other animal. In our recent arXiv paper, we wrote:

It is difficult to see the benefit in Apatosaurus excelsus of cervical ribs held so far below the centrum – an arrangement that seems to make little sense from any mechanical perspective, and may have to be written off as an inexplicable consequence of sexual selection or species recognition.

It certainly seems to have been doing something weird with its neck. It’s not obvious why big flaps like these would require honking great cervical ribs to hang down from, but maybe it was swinging them around or something?

[We’ve featured bizarrely ornamented sauropods here before, notably Brian Engh’s pouch-throated Sauroposeidon.]

There’s been a lot of concern in some corners of the world about the Finch Report’s preference for Gold open access, and the RCUK policy’s similar leaning. Much of the complaining has focussed on the cost of Gold OA publishing: Article Processing Charges (APCs) are very off-putting to researchers with limited budgets. I thought it would be useful to provide a page that I (and you) can link to when facing such concerns.

This is long and (frankly) a bit boring. But I think it’s important and needs saying.

1. How much does the Finch Report suggest APCs cost?

Worries about high publishing costs are exacerbated by the widely reported estimate of £2000 for a typical APC, attributed to the Finch Report. In fact, that is not quite what the report (page 61) says:

Subsequent reports also suggest that the costs for open access journals average between £1.5k and £2k, which is broadly in line with the average level of APCs paid by the Wellcome Trust in 2010, at just under £1.5k.

Still, the midpoint of Finch’s “£1.5k-£2k” range is £1750, which is a hefty amount. Where does it come from? A footnote elucidates:

Houghton J et al, op cit; Heading for the Open Road: costs and benefits of transitions in scholarly communications, RIN, PRC, Wellcome Trust, JISC, RLUK, 2011. See also Solomon, D, and Björk, B-Christer. A study of Open Access Journals using article processing charges. Journal of the American Society for Information Science and Technology, which suggests an average level of APCs for open access journal (including those published at very low cost in developing countries) of just over $900. It is difficult to judge – opinions differ – whether costs for open access journals are on average likely to rise as higher status journals join the open access ranks; or to fall as new entrants come into the market.

[An aside: these figures would probably be better known, and the Finch Report would be discussed in a more informed way, if the report were available on the Web in a form where individual sections could be linked, rather than only as a PDF.]

The first two cited sources look good and authoritative, being from JISC and a combination of well-respected research organisations. Nevertheless, the high figure that they cite is misleading, and unnecessarily alarming, for several reasons.

2. Why the Finch estimate is misleading

2.1. It ignores free-to-the-author journals.

The Solomon and Björk analysis that the Finch Report rather brushes over is the only one of the three to have attempted any rigorous numerical analysis, and it found as follows (citing an earlier study, subsequently written up):

Almost 23,000 authors who had published an article in an OA journal were asked about how much they had paid. Half of the authors had not paid any fee at all, and only 10% had paid fees exceeding 1,000 Euros [= £812, less than half of the midpoint of Finch’s range].

And the proportion of journals that charge no APC (as opposed to authors who paid no fee) is even higher — nearly three quarters:

As of August 2011 there were 1,825 journals listed in the Directory of Open Access Journals (DOAJ) that, at least by self-report, charge APCs. These represent just over 26% of all DOAJ journals.

So there are a lot of zero-cost options. And these are by no means all low-quality journals: they include, for example, Acta Palaeontologica Polonica and Palaeontologia Electronica in our own field of palaeontology, the Journal of Machine Learning Research in computer science and Theory and Applications of Categories in maths.

2.2. It ignores the low average price found by the Solomon and Björk analysis.

The Solomon and Björk paper is full of useful information and well worth detailed consideration. They make it clear in their methodology section that their sample was limited only to those journals that charge a non-zero APC, and their analysis concluded:

[We studied] 1,370 journals that published 100,697 articles in 2010. The average APC was 906 US Dollars (USD) calculated over journals and 904 USD calculated over articles.

(The closeness of the average across journals and across articles is important: it shows that the average-by-journals is not being artificially depressed by a large number of very low-volume journals that have low APCs.)

2.3. It focusses on authors who are spending Other People’s Money.

Recall that Finch’s “£1.5k-£2k” estimate is justified in part by the observation that the APC paid by the Wellcome Trust in 2010 was just under £1.5k. But it’s well established that people spending Other People’s Money get less good value than when they spend their own: that’s why travellers who fly business class when their employer is paying go coach when they’re paying for themselves. (This is an example of the principal-agent problem.)

It’s great that the Wellcome Trust, and some other funders, pay Gold OA fees. For researchers in this situation, APCs should not be a problem; but for the rest of us (and, yes, that includes me — I’ve never had a grant in my life) there are plenty of excellent lower-cost options.

And as noted above, lower cost, or even no cost, does not need to mean lower quality.

2.4. It ignores the world’s leading open-access journal.

PLOS ONE publishes more articles than any other journal in the world, has very high production values, and for those who care about such things has a higher impact-factor than almost any specialist palaeontology journal. Its APC is $1350, which is currently about £839 — less than half of the midpoint of Finch’s “£1.5k-£2k” range.

Even PLOS’s flagship journal, PLOS Biology, which is ranked top in the JCR’s biology section, charges $2900 (about £1802), which is well within the Finch range.

Meanwhile, over in the humanities (where much of the negative reaction to Finch and RCUK is to be found), the leading open-access megajournal is much cheaper even than PLOS ONE: SAGE Open currently offers an introductory APC of $195 (discounted from the regular price of $695).

2.5. It ignores waivers.

The most important, and most consistently overlooked, fact among those who complain that they don’t have any funds for Gold-OA publishing is that many Gold-OA journals offer waivers.

For example, PLOS co-founder Michael Eisen affirms (pers. comm.) that it’s explicitly part of the PLOS philosophy that no-one should be prevented from publishing in a PLOS journal by financial issues. And that philosophy is implemented in the PLOS policy of offering waivers to anyone who asks for one. (For example, my old University of Portsmouth colleagues Mark Witton and Darren Naish certainly had no funds from UoP to support publication of their azhdarchid palaeobiology paper in PLOS ONE; they asked for a waiver and got it, no questions asked.)

Other major open-access publishers have similar policies.

2.6. It doesn’t recognise how the publishing landscape is changing.

It’s not really a criticism of the Finch Report — at least, not a fair one — that its coverage of eLife and PeerJ is limited to a single passing mention on page 58. Neither of these initiatives had come into existence when the report was drafted. Nevertheless, they have quickly become hugely important in shaping the world of publishing — it’s not a stretch to say that they have already joined BMC and PLOS in defining the shape of the open access world.

For the first few years of operation, eLife is waiving all APCs. It remains to be seen what will happen after that, but I think there are signs that their goal may be to retain the no-APC model indefinitely. PeerJ does charge, but is ridiculously cheap: a one-off payment of $99 pays for a publication every year for life; or $299 for any number of publications at any time. Those numbers are going to skew the average APC way, way down even from their current low levels.

2.7. I suspect it concentrates on hybrid-OA journals.

There are all sorts of reasons to mistrust hybrid journals, including the difficulty of finding the open articles among the paywalled ones; the very high APCs that they charge are only one of them.

Why do people use hybrid journals when they are more expensive than fully OA journals and offer so much less (e.g. limits on length, number of figures, and no colour)? I suspect hybrid OA is the lazy option for researchers who have to conform to an OA mandate but don’t want to invest any time or effort in thinking about open-access options. It’s easy to imagine such researchers just shoving their work into the traditional paywalled journal, and letting the Wellcome grant pick up the tab. After all, it’s Other People’s Money.

If grant-money for funding APCs becomes more scarce as it’s required to stretch further, then researchers who’ve been taking this sort of box-checking approach to fulfilling OA mandates are going to be forced to think more about what they’re doing. And that’s a good thing.

3. What is the true average cost?

If we put all this together, and assume that researchers working from RCUK funds will make some kind of effort to find good-value open-access journals for their work instead of blindly throwing it at traditional subscription journals and expecting RCUK to pick up the fee, here’s where we land up:

  • About half of authors currently pay no fee at all.
  • Among those that do pay a fee, the average is $906.
  • So the overall average fee is about $453.
  • That’s about £283, which is less than one sixth of what Finch suggests (see the quick sanity-check below).
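
For the sceptical, here is that arithmetic spelled out as a tiny Python sketch. The $906 average and the “about half pay nothing” figure are the Solomon and Björk numbers quoted above; the USD-to-GBP rate of about 0.625 is my own rough assumption for a 2012-ish exchange rate, so treat the sterling figures as approximate.

    # Back-of-envelope check of the list above. The $906 average and the
    # "about half pay nothing" figure come from Solomon and Björk; the
    # 0.625 USD-to-GBP rate is my own rough assumption for 2012.
    FRACTION_PAYING = 0.5          # about half of authors pay a fee at all
    AVG_FEE_WHEN_PAID = 906        # USD, among fee-charging journals
    USD_TO_GBP = 0.625             # approximate exchange rate (assumption)
    FINCH_MIDPOINT_GBP = 1750      # midpoint of Finch's £1.5k-£2k range

    avg_fee_usd = FRACTION_PAYING * AVG_FEE_WHEN_PAID      # ~453 USD
    avg_fee_gbp = avg_fee_usd * USD_TO_GBP                 # ~283 GBP
    print(f"Overall average APC: ${avg_fee_usd:.0f} (about £{avg_fee_gbp:.0f})")
    print(f"As a fraction of the Finch midpoint: {avg_fee_gbp / FINCH_MIDPOINT_GBP:.2f}")  # ~0.16, under one sixth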

4. What are we comparing with?

It’s one thing to find a more realistic cost for an average open-access article. But we also need to realise that we’re not comparing with zero. Authors have always paid publication fees in certain circumstances — subscription journals have levied page charges, extra costs for going past a certain length, for colour figures, etc. For example, Elsevier’s American Journal of Pathology charges authors “$550 per color figure, $50 per black & white or grayscale figure, and $50 per composed table, per printed page”. So a single colour figure in that journal costs more than the whole of a typical OA article.

But that’s not the real cost to compare with.

The real cost is what the world at large pays for each paywalled article. As we discussed here in some detail, the aggregate subscription paid to access an average paywalled article is about $5333. That’s as much as it costs to publish nearly twelve average open-access articles — and for that money, you get much less: people outside of universities can’t read the article even after the $5333 has been paid.

5. Directing our anger properly

Now think about this: the Big Four academic publishers have profit-margins between 32.4% and 42%. Let’s pick a typical profit margin of 37% — a little below the middle of that range. Assuming this is pretty representative across all subscription publishers — and it will be, since the Big Four control so much of the volume of subscription publishing — that means that 37% of the $5333 of an average paywalled article’s subscription money is pure profit. So $1973 is leaving academia every time a paper is “published” behind a paywall.

So every time a university sends a paper behind a paywall, the $1973 that it burns could have funded four average-priced Gold-OA APCs. Heck, even if you want to discount all the small publishers and put everything in PLOS — never taking a waiver — it would pay for one and a half PLOS ONE articles.
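
To make the comparison concrete, here is the same back-of-envelope arithmetic as a short Python sketch. The $5333 figure, the assumed 37% margin and the APC figures are simply the ones discussed above, so read the outputs as rough order-of-magnitude checks rather than precise accounting.

    # Rough check of the paywall comparison above. All inputs are the
    # approximate figures quoted in the text, not precise accounting.
    SUBSCRIPTION_COST_PER_ARTICLE = 5333   # USD paid in aggregate per paywalled article
    PROFIT_MARGIN = 0.37                   # assumed typical big-publisher margin
    AVG_GOLD_OA_APC = 453                  # USD, overall average from section 3
    PLOS_ONE_APC = 1350                    # USD

    profit = SUBSCRIPTION_COST_PER_ARTICLE * PROFIT_MARGIN
    print(f"Profit leaving academia per paywalled article: ${profit:.0f}")            # ~1973
    print(f"Average Gold-OA APCs that would cover: {profit / AVG_GOLD_OA_APC:.1f}")   # ~4.4
    print(f"PLOS ONE APCs that would cover: {profit / PLOS_ONE_APC:.1f}")             # ~1.5
    print(f"OA articles publishable for one paywalled article's total subscription spend: "
          f"{SUBSCRIPTION_COST_PER_ARTICLE / AVG_GOLD_OA_APC:.1f}")                   # ~11.8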

So let me leave you with this. In recent weeks, I’ve seen a fair bit of anger directed at the Finch Report and the RCUK policy. Some researchers have been up in arms at the prospect of having to “pay to say”. I want to suggest that this anger is misdirected. Rather than being angry with a policy that says you need to find $453 when you publish, direct your anger at publishers who remove $1973 from academia every time you give them a paper.

Folks, we have to have the vision to look beyond what is happening right now in our departments. Gold OA does, for sure, mean a small amount of short-term pain. It also means a massive long-term win for us all.

I just saw this tweet from palaeohistologist Sarah Werning, and it summed up what science is all about so well that I wanted to give it wider and more permanent coverage:

This is exactly right. Kudos to Sarah for saying it so beautifully.

(Sarah’s work can most recently be seen in Nesbitt et al.’s (2012) paper on a newly recognised ancient dinosaur or near-dinosaur relative, and especially in the high-resolution supplementary images that she deposited at MorphoBank.)


[backup image]

The problem

It’s often been noted that under the author-pays model of publication (Gold open access), journals have a financial incentive to publish as many articles as possible so as to collect as many article processing charges as possible. In the early days of PLOS ONE, Nature famously described it as “relying on bulk, cheap publishing of lower quality papers”.

As the subsequent runaway success of PLOS ONE has shown, that fear was misplaced: real journals will always value their reputation above quick-hit APC income, and that’s reflected by the fact that PLOS ONE papers are cited more often, on average, than those in quality traditional palaeo journals such as JVP and Palaeontology.

But the general concern remains a real one: for every PLOS, there is a Bentham Open. It’s true that anyone who wants to publish an academic paper, however shoddy, will certainly be able to find a “publisher” of some kind that will take it — for a fee. This problem of “predatory publishers” was highlighted in a Nature column three months ago; and the ethical standards of some of the publishers in question were neatly illustrated when they contributed comments on that column, posing as well-known open-access advocates.

A solution

The author of that Nature column, Jeffrey Beall, maintains Beall’s List of Predatory, Open-Access Publishers, a useful annotated list of publishers that he has judged to fall into this vanity-publishing category. The idea is that if you’re considering submitting a manuscript to a journal that you don’t already know and trust, you can consult the list to see whether its publisher is reputable.

[An aside: I find a simpler solution is not to send my work to journals that I don’t already know and trust, and I don’t really understand why anyone would do that.]

Towards a better solution

Beall’s list has done sterling work over the last few years, but as the number of open access publishers keeps growing, it’s starting to creak. It’s not really possible for one person to keep track of the entire field. More important, it’s not really desirable for any one person to exercise so much power over the reputation of publishers. For example, the comment trail on the list shows that Hindawi was quite unjustly included for some time, and even now remains on the “watchlist” despite having a good reputation elsewhere.

We live in a connected and open world, where crowdsourcing has built the world’s greatest encyclopaedia, funded research projects and assembled the best database of resources for solving programming problems. We ought to be able to do better together than any one person can do alone — giving us better coverage, and freeing the resource from the potential of bias, whether intended or unintended.

I’ve had this idea floating around for a while, but I was nudged into action today by a Twitter discussion with Richard Poynder and Cameron Neylon. [I wish there was a good way to link to a tree-structured branching discussion on Twitter. If you want to try to reconstruct it, you could start here and try following some links.]

What might a better solution look like?

Richard was cautious about how this might work, as he should be. He suggested a wiki at first, but I think we’d need something more structured, because wikis suffer from last-edit-wins syndrome. I imagine some kind of voting system — perhaps resembling how stories are voted up and down (and commented on) in Reddit, or maybe more like the way questions are handled in Stack Exchange.

Either way, it would be better if we could use and adapt an existing service rather than building new software from the ground up (even though that’s always my natural tendency). Maybe better still would be to use an existing hosted service: for example, we might be able to get a surprisingly long way just by creating a subreddit, posting an entry for each publisher, then commenting and voting as in any other subreddit.
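
Just to make the idea a little more concrete, here is a minimal sketch (in Python, with invented names — none of this corresponds to any real service or API) of the sort of crowd-voted publisher entry I have in mind, with a Reddit-style up/down score and a comment trail. A real system would obviously need accounts, moderation and a far better reputation algorithm.

    # A hypothetical, minimal model of a crowd-voted publisher entry.
    # Names and scoring are illustrative only.
    from dataclasses import dataclass, field

    @dataclass
    class PublisherEntry:
        name: str
        url: str
        upvotes: int = 0
        downvotes: int = 0
        comments: list = field(default_factory=list)

        def vote(self, up: bool):
            # Each community member votes a publisher up or down, Reddit-style.
            if up:
                self.upvotes += 1
            else:
                self.downvotes += 1

        def score(self) -> int:
            # Naive net score; entries far below zero would end up on the "blacklist".
            return self.upvotes - self.downvotes

    # Example usage
    entry = PublisherEntry("Example Predatory Press", "http://example.com")
    entry.vote(up=False)
    entry.comments.append("Sent me spam soliciting a paper with a two-week turnaround.")
    print(entry.name, entry.score())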

Cameron had another concern: that it’s hard to build and maintain a blacklist, because the number of predatory publishers is potentially unlimited. There are other reasons to prefer a whitelist — it’s nice to be positive! — and Cameron suggested that the membership of the Open Access Scholarly Publishers Association (OASPA) might make a good starting point.

My feeling is that, while a good solution could certainly say positive things about good publishers as well as negative things about bad publishers, we do need it to produce (among other things) a blacklist, if only to be an alternative to Beall’s one. Since that’s the only game in town, it has altogether too much power at the moment. Richard Poynder distrusts voting systems, but the current state of the art when it comes to predatory publisher lists is that we have a one-man-one-vote system, and Jeffrey Beall is the one man who has the one vote.

We have to be able to do better than that.

Thoughts? Ideas? Suggestions? Offers? Disagreements?