John Bohannon’s peer-review sting against Science

October 3, 2013

An extraordinary study has come to light today, showing just how shoddy peer-review standards are at some journals.

Evidently fascinated by Science’s eagerness to publish the fatally flawed Arsenic Life paper, John Bohannon conceived the idea of constructing a study so incredibly flawed that it didn’t even include a control. His plan was to see whether he could get it past the notoriously lax Science peer review, provided it appealed strongly enough to that journal’s desire for “impact” (defined as the ability to generate headlines) and pandered to its preconceptions (that its own publication model is the best one).

So Bohannon carried out the most flawed study he could imagine: submitting fake papers to open-access journals selected in part from Jeffrey Beall’s list of predatory publishers without sending any of his fake papers to subscription journals, noting that many of the journals accepted the papers, and drawing the flagrantly unsupported conclusion that open-access publishing is flawed.

Incredibly, Science not only published this study, but made it the lead story of today’s issue.

It’s hard to know where Science can go from here. Having fallen for Bohannon’s sting, its credibility is shot to pieces. We can only assume that the AAAS will now be added to Beall’s list of predatory publishers.

Rolling updates

Here are some other responses to the Science story:

62 Responses to “John Bohannon’s peer-review sting against Science”

  1. Unfortunately, many colleagues are “buying” the story against poor-quality OA journals. The study only proves the poor quality control of the Science journal itself. A formal letter to Science signed by a significant number of academics should request retraction based on a flawed experimental design that invalidates the study’s conclusions.

  2. Mike Taylor Says:

    That is an excellent idea. I’ll ask around.

  3. Nathan Myers Says:

    Am I confabulating by guessing that Bohannon himself apparently did not realize that his paper was fatally flawed?

    Such brilliant subtlety, one has to admire it.

  4. The study in fact shows that Gold OA journals published in English need better peer-review practice. Proof: “To generate a comprehensive list of journals for my investigation, I filtered the DOAJ, eliminating those not published in English and those without standard open-access fees.”

  5. Mike Taylor Says:

    Honestly, it’s the most flagrant case of fishing for a pre-decided result I’ve ever come across. He didn’t look at subscription journals. He deliberately went hunting for OA journals that have been pre-classified as predatory. He eliminated from his sample all the OA journals (more than half of them, remember) that don’t charge APCs. It’s like sampling the population of a jail and concluding that criminality is rampant in society. How the hell did Science let themselves fall for this?

  6. Ethan White Says:

    I agree with @Pandelis that there should definitely be one or more formal responses to this story.

  7. Peter Suber has a G+ posting which lists all the flaws. However, even if this is a piece of propaganda, perhaps it’s also an opportunity to showcase the modern forms of peer review used by some OA publishers, and to press for open, technical, and perpetual pre- and post-publication peer review.

  8. Nathan Myers Says:

    An appropriate companion experiment would involve sending this paper to free open-access journals.

  9. This sounds very much like a re-enactment of the Sokal hoax. Clearly there is a lot to learn from it, and hopefully an opportunity to improve peer-review processes; but otherwise there is nothing really new about a certain ‘broken’ character of scholarly communication.

  10. Mike Taylor Says:

    Not really much like what Sokal did. He constructed a paper that was, on its own terms, a work of art, and sent it to one specific journal which he actively intended to demonstrate was devoid of intellectual substance. What Bohannon did was a fishing expedition via randomly generated spam.

  11. Christian Says:

    Who is going to crawl all the uni websites and check how many people have published in predatory journals? For every such article, a point deducted in uni rankings.

  12. See, one of his points was evaluating a focus group against a control. But he didn’t do that himself. And the significance of the sample of journals used versus the total is not high: not when you had nearly 10k journals to start with, chose fewer than 200 to submit to, and ended with a final pool nearly half that.

  13. Tor Bertin Says:

    Was hoping to see some commentary on this here. I actually laughed out loud at the irony of the paper lacking any kind of control.

  14. brembs Says:

    And that after Science rejected our paper with actual data in it:

  15. […] Bohannon (and Science) for undertaking such a study in the first place, accusing Bohannon of “drawing the flagrantly unsupported concluding [sic] that open-access publishing is flawed,” using the opportunity to come up with an equally unsubstantiated conclusion, that […]

  16. […] of the Science article: Science magazine rejects data, publishes anecdote, by Björn Brembs John Bohannon’s peer review sting against Science, by Mike […]

  17. […] Study flawed, pay-back by Science after the arsenic life story (cf Mike Taylor) […]

  18. […] SV-POW! there’s a list of other reactions to Bohannon’s […]

  19. This is a great example of smash-and-grab, sensationalist journalism that leaves collateral damage everywhere. I am offended by his method of creating fake authors and institutions hailing from the African continent by jumbling up initials and using African place names. Imagine if he’d chosen China or Japan. He does play one good role in highlighting the increasing number of predatory publishers, but the approach taken is like asking a scalper for a tax receipt.

  20. […] Mike Taylor – John Bohannon’s peer-review sting against Science […]

  21. Tim Says:

    The “study” of peer review linked to on this page, and published by Science, is very clearly labeled as “News” and not as a research article or report. I think that using this as an example to call into question the scientific peer-review process at Science is tenuous at best. Would a news article / opinion / commentary regularly be sent out for peer review?

  22. Mike Taylor Says:

    One that makes claims this bold, and this politically incendiary? Yes, by a responsible journal.

  23. […] than a dozen critiques have been posted to various news sites and blogs suggesting a bias by Science, which charges for subscriptions, against the open-access […]

  24. Many of the points raised about bias & lack of control here are valid, but it wouldn’t kill OA advocates to acknowledge that the plague of crap/spam/predatory pay-to-publish journals is a reasonably serious problem. Bohannon’s article just dramatized it. Furthermore it is a problem that negatively impacts the credibility and future of OA! If I were in the leadership of the OA movement, I would come down, hard, against the rampant abuse of the OA journal idea. At the very least there REALLY ought to be some sort of accreditation body for OA journals, procedures for getting on the good list, etc., so that researchers, administration bean counters, and granting agencies can tell which journals are real, and which are just some guy in India who will publish anything for $1000 of grant money.

    In fact, this idea is so incredibly obvious I can’t understand why it doesn’t exist already. Wouldn’t this be a BIG advantage for promoting OA?

  25. Mike Taylor Says:

    Nicholas, what you’re describing already exists. OASPA is the Open Access Scholarly Publishers Association, and has pretty solid criteria for membership. There’s also DOAJ, the Directory of Open Access Journals, which started out as a more or less unselective list of all OA journals, but has been tightening its criteria over the last year, and has issued several calls for consultation on exactly what those criteria should be. Then there’s Beall’s List of predatory publishers — ones to avoid.

    So anyone who wants to place an article in an OA journal but (for some reason) doesn’t already know of a good and appropriate one would perhaps start with DOAJ to find candidates, filter those candidates by their publishers’ membership of OASPA, and toss out those mentioned by Beall, to arrive at a reasonable list.

    But Bohannon did the exact opposite. Not only did he not discard the predatory journals on Beall’s list, he actively added them to his sample. Not only that, he removed from the list all the journals that don’t charge an APC (and which therefore have no direct financial incentive to incline towards publishing articles). In short, he pretty much sought out the skankiest set of OA journals he could find.

    So: don’t do that.

  26. Thanks for OASPA — but it obviously needs more publicity! I have not made a study of the issue, but probably like a lot of people I have read a fair bit of commentary on OA — e.g., the Eisen blogs, Beall’s blog, etc., and this is the first I’ve heard of it.

  27. […] of fire – here’s a selection from Peter Suber (measured), Michael Eisen (a little snarky), and Mike Taylor (immensely snarky but also collecting […]

  28. a301khan Says:

    Now we have real, proven data on the quality of SOME OA journals (neglecting the fact that they were not compared with similar subscription-based journals, and the other weaknesses of this study). Even though this experiment is ‘not perfect’, I am so happy to see that it throws light on the quality of the ‘screening and peer-review service’ of OA journals. I strongly believe that scholarly publishers should work like ‘strict gatekeepers’ by arranging honest and sincere peer-review service. This is the main difference between a ‘scholarly publisher’ and a ‘generic publisher’ (who publishes anything). Other work like typesetting, proofing, printing, web hosting, marketing, etc. is important but not unique to a scholarly publisher. (My personal opinion is that we should not waste time debating good OA vs. bad OA, good CA vs. bad CA, etc. It is time to work. We must jump to analysing and using this huge, precious dataset more effectively.)

    I know and strongly believe that Beall, being an academician-librarian, also gives the highest importance to this particular criterion above anything else. I congratulate Beall that his theory has been experimentally proven by the sting operation of Science.

    I know this sting operation is going to generate a huge debate: one group will try to find the positive points, and the other group will try to prove it a bogus experiment. An endless, useless fight and a waste of time. It will be more important to find some way to use this huge dataset more effectively.

    Now I have some suggestions and questions:
    1. How are we going to use the huge dataset generated by this year-long experiment?
    2. I request DOAJ and OASPA to do some constructive work using this dataset.
    3. Can we develop some useful criteria for screening OA publishers from the lessons of this experiment?
    4. Is there any way of rewarding the publishers who properly and effectively passed this experiment (rejecting the article after proper peer review)? I noticed some journals rejected it due to ‘scope mismatch’. That is certainly a criterion for rejection, but it does not answer what would have happened if the scope had matched.
    5. I saw the criticism of Beall that ‘he is …trigger-happy’. It is now time for Beall to prove that he not only knows how to punish the bad OA publishers, but also knows how to reward somebody who intends to improve from a previous situation. Is there any possibility that this data can be used for the ‘appeal’ section of Beall’s famous blog? Sometimes a judge can free somebody on circumstantial proof, even if he/she does not formally appeal. (Think about posthumous awards/judgments.) I always believe that ‘reward and inspiration of good’ is more powerful than ‘punishment for doing bad’. But I also believe that both should exist.

    If anybody says that “The results show that Beall is good at spotting publishers with poor quality control”, that tells only one part of the story. It highlights only who failed in this experiment. It does not tell or highlight those publishers who passed this experiment but still occupy a seat on Beall’s famous list. I really hate this trend. My cultural belief and traditional learning tell me that “if we see only one lamp in an ocean of darkness, then we must highlight it, as it is the only hope. We must protect and showcase that lamp to defeat the darkness”. I don’t know whether my traditional belief is wrong or right, but I will protect this faith till my death.

    I really want to see Beall officially publish a white-list of ‘transformed predatory OA publishers’, in which he clearly states the reasons for each one’s removal from the ‘bad list’. That way, other lower-quality predatory OA publishers will learn from that discussion how to improve (if they really want to do so) and how to get out of Beall’s ‘bad list’. This step will essentially complete the circle Beall started.

    Ideally, I STRONGLY BELIEVE that Beall will be the happiest person on earth if, one fine morning, Beall’s list of ‘bad OA publishers’ contains ‘zero names’, all of them having been transferred to a Beall’s list of “good OA publishers” after transformation with the help of an effective peer-review process initiated by Beall.

    Akbar Khan,

  29. […] 11. Mike Taylor (SV-POW): John Bohannon’s peer-review sting against Science […]

  30. […] | Der Spiegel | Chronicle HE | Ernesto Priego | Gunther Eisenbach | Fabiana Kubke | Zen Faulkes | Mike Taylor | Libération | Rue89 | Le Monde | Nu Wetenschap […]

  31. […] from this post by Mike Taylor one can easily see that these observations are only two among the many received […]

  32. […] Excellent overview of the reactions can be found on SV-POW […]

  33. […] Bohannon wrote of his findings in Science, which has received a lot of flak suggesting a bias by the journal against online […]

  34. […] journals with an overtly-flawed paper about the anti-cancer properties of lichen, has reopened the peer-review debate and generated a large response from OA proponents & […]

  35. […] In a piece published this month in the magazine Science (“Who’s Afraid of Peer Review?“), a journalist (who is also a PhD in biology) opened fire on what he perceived as a problem of open-access journals: “little or no scrutiny” of the content of articles submitted for publication. The piece presents the results of the journalist-biologist’s experiment, indicating that more than half of the submissions of a fake, clearly problematic article were accepted. Science being a subscription magazine, responses from the Open Access community soon appeared, pointing out methodological problems in the experiment and accusing the text (and the whole idea of the experiment) of being biased. […]

  36. a301khan Says:

    For me it is more important to find the few sources of light in the ocean of darkness. People are busier finding the weaknesses of the study, saying how it should have been conducted, etc. Some people consider it a ‘designer study to produce a designed baby’, etc. And I AGREE with all of them. Yes, all of that is true. But in this huge quarrel and cacophony, are we not neglecting some orphan babies born from this study (yes, they were born accidentally, not designed or expected to be born, just as most great inventions happen accidentally)?

    I have made some childish analysis with the raw-data of the report of John Bohannon.

    Bohannon used very few words to praise or highlight the journals/publishers who successfully passed the test. He only mentioned PLOS ONE and Hindawi, who are already accepted by academicians for their high reputation. At the least, I expected that Bohannon would include a table to highlight the journals/publishers who passed the test. I spent a little time analyzing the data. Surprisingly, I found some errors Bohannon made in correctly indicating the category of publishers (DOAJ / Beall). I have indicated some errors, though I could not complete the cross-checking of all 304 publishers/journals. Bohannon used DOAJ/Beall as his main categories for selecting the journals, but errors in properly showing this category data may indicate that he spent more time collecting the raw data than analyzing or curating it.

    I found that more members of Beall’s list are present in Bohannon’s study. But Bohannon did not report this fact.

    Table 1: List of 20 journals/publishers who rejected the paper after substantial review (may be considered white-listed journals/publishers)
    Table 2: List of 8 journals/publishers who rejected the paper after superficial review (may be considered white-listed-borderline journals/publishers)
    Table 3: List of 16 journals/publishers who accepted the paper after substantial review (may be considered blacklisted-borderline journals/publishers)
    Table 4: List of journals/publishers who accepted the paper after superficial or no review (may be considered confirmed blacklisted journals/publishers)
    Table 5: List of journals/publishers who rejected the paper but with no review details recorded (labeling of these journals/publishers is avoided)

    Link to my post:

    Akbar Khan

  37. […] 2013/10/03: SVPOW: John Bohannon’s peer-review sting against Science […]

  38. […] have been lots of criticisms aimed at John Bohannon's paper and also at Science, for publishing a flawed piece of […]

  39. Simon Miller Says:

    I’m asking myself: who is behind all this? Publishers who want to erode the open-access movement? Or an OA activist who wants to instigate a discussion on the quality of OA material?
    Provoking question: how many commercial publishers would have accepted this article in their “peer-reviewed” journals? Did anyone ever try it the other way round?

  40. Simon,

    a number of commercial publishers did accept this paper! To quote from Bohannon’s article:

    “Journals published by Elsevier, Wolters Kluwer, and Sage all accepted my bogus paper.”

  41. A Vikram Says:

    In the article, John has written that “*But even when editors and bank accounts are in the developing world, the company that ultimately reaps the profits may be based in the United States or Europe*” (page 65, second paragraph). I was curious to know what proof supports this claim, and whether he collected information before drawing that conclusion.

  42. a301khan Says:

    Recently an interview with John Bohannon, “Post Open Access Sting”, was published here. Some useful information and actions were recorded there:
    1. The decision of OASPA to terminate the membership of Dove Press and Hikari as a result of the sting investigation.
    2. I also support that, as a result of Bohannon’s investigation, DOAJ has removed 114 OA journals from its list.

    Weeding is always necessary. OASPA and DOAJ are taking action to correct their lists. Nobody is perfect, and revision is always necessary once a kind of ‘peer-review report’ is available. But does anybody know whether J. Beall has taken any action to correct his famous list? The sting operation, and all the related discussion on the internet, is inclined to highlight who failed in this experiment. It does not tell or highlight those publishers who passed this experiment but still occupy a seat on Beall’s famous list. Phil Davis also reported that Beall is falsely accusing nearly one in five as being a “potential, possible, or probable predatory scholarly open access publisher”. (Reference:

    Now, in response to my comment, one commenter marked me as very unfair, because “Just because some of the journals on Beall’s list rejected the Bohannon article doesn’t preclude them from being ‘potential, possible, or probable predatory scholarly open access publishers’”. This comment was approved by the blog moderators, but my response to it was deleted. May I request Mike to allow my comment?

    My deleted comment: My apology, please. I thought that if “acceptance” of the fraudulent paper is sufficient for terminating the membership of 2 publishers from OASPA and cancelling 114 OA journals from DOAJ, then “rejection with robust peer review” should also be sufficient for the opposite. I forgot that being a GOLD OA publisher means you should be ready for punishment for a single mistake. YES, THEY MUST BE PUNISHED. I also forgot that if a small GOLD OA publisher shows evidence of ‘robust peer review’, it must be accidental. It must have to be accidental. I forgot that these small GOLD OA publishers have no right to show good intentions to improve from previous errors, no right to show evidence of ‘robust peer review’. Once they are stamped as predatory, they must remain predatory for their lifetime. (I also forgot that if these small predatory gold OA publishers improve and try to become good publishers, then the anti-open-access brigade will lose its most powerful weapon to date.)

    (sorry for a long post)

    Akbar Khan

  43. Mike Taylor Says:

    Thanks for that, Akbar. As a matter of policy, we allow all comments except spam and direct personal attacks, so you don’t need to worry about having your comments mysteriously disappear here. (One of the reasons I don’t bother commenting much at The Scholarly Kitchen is their rather arbitrary and capricious moderation. It’s not just that I never know whether my comments will appear; it’s that when I comment there at all, I have one eye on getting past their filter rather than both eyes on what I actually want to say, and that’s no way to develop an honest argument.)

  44. Akbar Khan Says:

    Thank you so much, Mike. Open access and the open (unrestricted) flow of thoughts are really important. A blog is essentially a discussion forum. I have seen that some bloggers (mainly the anti-open-access brigade) have a tendency to delete comments which do not endorse their ideas.
    There may be a commenting policy, but the moderator should have the courtesy to mail the commenter (maybe a one-line explanation, or two or three words) if his or her comment is deleted.
    (I am sorry for my poor English.)

    Akbar Khan

  45. a301khan Says:

    Hi Mike,
    I just saw my mail. After you allowed my deleted comment to be posted on your blog, I got a mail from David Crotty of scholarlykitchen (time: Sat, Nov 16, 2013 at 12:20 AM) with a request to make my comments less sarcastic. But honestly speaking, I don’t think it is sarcastic. It may be a very DIRECT description of some facts, but it is not sarcastic. It is a DIRECT description of some tragic facts about a few (a very small number of) small GOLD OA publishers, who are trying hard to become good publishers. But nobody is there to help, even though they are showing some solid proof of improvement, as recorded by Bohannon. Is it not tragic?

    Akbar Khan,

  46. Mike Taylor Says:

    That seems an odd kind of control for blog owners to want to wield. But, what the hey, it’s their blog: they can (and do) run it how they like. As I said, it’s a major reason that I comment so little there.

  47. […] For the most succinct criticism I’ve ever read of the “sting” operation, as well as an up-to-date list of other responses to Bohannon, see: […]

  48. […] *****SHOEHORN GOES HERE***** And Person of 2013 goes to Edward Sno- Haha. Just kidding. I’d rather give to Doge Those adorable pups are more deserving. But unfair jabs aside. Person of 2013 for me anyway – John Bohannon […]

  49. Patrik Bavoil Says:

    I would reword your concluding statement to: “It’s hard to know where Bohannon can go from here. Having fallen for Science’s sting, his credibility is shot to pieces. We can only assume that the AAAS will now be added to Beall’s list of predatory publishers.”

    Science, Nature and the like have been predatory publishers for eons: these journals are the tabloids of science and they have done more to discredit the peer-review system than all OA journals put together. (I decline to review for these journals these days as I know from the start that the chances of my review being ignored are high). The bulk of serious scientific publishing is fortunately not done by tabloids or journals that do not use peer-review. It is done mostly by the silent majority of conscientious publishers (yes there are some!), often associated with societies that return their share of the profits to science in one way or another. Peer-review is imperfect by definition, but it is still the best we have, and I have not seen a viable alternative as of yet.

  50. […] me. While there has recently been much buzz about scams by new OA journals, especially with the Science Sting by Bohannon, the biggest scam is the one by subscription journals. Many erroneously assume that only open […]

  51. Nadya Says:

    Why doesn’t anyone do the same with commercial journals? It is not so difficult to make another set of fake papers.

  52. […] year OA copped a bad name because of the ‘sting’ by Bohannon, where some of the (exclusively OA) journals that were sent a scientifically unviable article accepted it […]

  53. […] revisit journalist John Bohannon’s undercover investigative report from 2013 discussed further in Michael Eisen’s blog which caused quite a stir and opened the […]

  54. […] that the (price-setting) power of the publishers extends to author fees. A further, contested, argument is that the “author pays” model carries the danger of declining […]

  55. […] by a member of the editorial staff of the journal Science (John Bohannon) and the counters by Mike Eisen and a bigger followup by Randy Schenkman. A nice review of similar stings in the past (submitting […]

  56. […] Bohannon’s approach, which have been ably summarized elsewhere. In particular, since Bohannon didn’t include a “control group” of traditional subscription journals, there’s no evidence that open access peer review […]

  57. […] perception with authors is often that all Open Access Publishing is funded like this. Recent controversies about fraudulent publishers charging APCs without offering quality publishing have done the image of […]

  58. […] However, while an excellent resource, please be aware that Beall’s list was not 100% airtight, casting doubt on the validity of the current Stop Predatory Journals list. In 2013, a fake scientific paper with “hopelessly flawed” experiments and meaningless results was submitted to 121 publishers identified as predatory by Beall—82% of the publishers that completed the review process accepted the paper (see here). While considered a validation of Beall’s list by many, this finding also showed that the list contained a number of false positives. As argued by Phil Davis, it showed that Beall was “falsely accusing nearly one in five” publishers of being predatory when they clearly had effective quality control systems in place. Please note that there was also some criticism of this ‘sting’ investigation, e.g., the lack of a control group (discussed here). […]
