What should we do now Beall’s List has gone?

January 26, 2017

It’s now been widely discussed that Jeffrey Beall’s list of predatory and questionable open-access publishers — Beall’s List for short — has abruptly gone away. No-one really knows why, but there are rumblings that he has been hit with a legal threat that he doesn’t want to defend.

To get this out of the way: it’s always a bad thing when legal threats make information quietly disappear; to that extent, at least, Beall has my sympathy.

That said — overall, I think making Beall’s List was probably not a good thing to do in the first place, being an essentially negative approach, as opposed to DOAJ’s more constructive whitelisting approach. But under Beall’s sole stewardship it was a disaster, due to his well-known ideological opposition to all open access. So I think it’s a net win that the list is gone.

But, more than that, I would prefer that it not be replaced.

Researchers need to learn the very very basic research skills required to tell a real journal from a fake one. Giving them a blacklist or a whitelist only conceals the real issue, which is that you need those skills if you’re going to be a researcher.

Finally, and I’m sorry if this is harsh, I have very little sympathy with anyone who is caught by a predatory journal. Why would you be so stupid? How can you expect to have a future as a researcher if your critical thinking skills are that lame? Think Check Submit is all the guidance that anyone needs; and frankly much more than people really need.

Here is the only thing you need to know, in order to avoid predatory journals, whether open-access or subscription-based: if you are not already familiar with a journal — because it’s published research you respect, or colleagues who you respect have published in it or are on the editorial board — then do not submit your work to that journal.

It really is that simple.

So what should we do now Beall’s List has gone? Nothing. Don’t replace it. Just teach researchers how to do research. (And supervisors who are not doing that already are not doing their jobs.)

 

31 Responses to “What should we do now Beall’s List has gone?”

  1. protohedgehog Says:

    Agree with all of this. Instead of white-listing or black-listing things, just use common sense.

    Besides, research has shown that the issue of predatory publishing is generally overblown – see the section on Deceptive Publishing here http://f1000research.com/articles/5-632/v3

  2. Stuart Says:

    Hear hear! Very well put. Couldn’t agree more.

  3. Sympathy Says:

    “Just use common sense and critical thinking”
    Sure. But how much time do I need to invest researching journals before accepting requests to review papers, for example?
    It would be nice to spend my critical thinking time primarily on research. A lot of people only review for journals they know, which solves the problem.
    Unlike you folks, not everybody can afford to publish everything in plos one. So when we send a paper out for the eleventh time and the editor rejects it without review, it would be nice to know where we can find a home for papers without spending so much time. Summarized journal policies, procedures, and deficiencies would be tremendously useful for so many reasons. Maybe we want to retain copyright, and go for open access, but Plosone and PeerJ gave us a reviewer that wants $300,000 in additional experiments. That never happens to you? It must not exist! No sympathy!

  4. Mike Taylor Says:

Reviewing requests are an interesting aspect of this, and one that never seems to get discussed. Thanks for bringing it up.

    I seem to recall having been asked a few times by obviously fake journals to review for them, and turning them down — but it was long enough ago that I can’t remember now why they were obviously fake. I think I’d start by taking a look at a couple of their published papers, then see what the journal has to say about itself. For a lot of them, even the name is enough to tip you off that no serious brainwork has gone into it.

    It would be nice to spend my critical thinking time primarily on research.

    I do agree, but the reality is that we already have to invest a lot of our critical-thinking time on non-research activities related to publishing: writing, editing, figure preparation, referencing, formatting, submitting, revising, rebutting, resubmitting, proof-reading, etc. What you’re describing here is just one more element in this cognitive tax.

    Unlike you folks, not everybody can afford to publish everything in plos one.

    Haha, I am not sure where you got your information about me, but I am 100% self-funded, and have never paid an APC out of my own pocket. My only out-of-pocket expense in publishing has been $99 for my PeerJ membership, which gives me the right to publish indefinitely with them.

    So when we send a paper out for the eleventh time and the editor rejects it without review, it would be nice to know where we can find a home for papers without spending so much time.

    If you’re sending the same paper out eleven times, it can only be because you’re “working down the ladder”. That is a dumb game. The only winning move is not to play.

    Summarized journal policies, procedures, and deficiencies would be tremendously useful for so many reasons.

    I strongly agree! It would be a great leap forward if someone were to make a journal database that is not a whitelist or blacklist, but simply machine-readable facts about every journal — so you could, for example, filter on journals that allow you to retain copyright, or that allow colour figures, or that don’t have a page limit of 10 pages or less. Building this would be expensive (though crowdsourcing the effort would be possible) — but the benefits would be enormous. I wish an enterprising funder would make it happen — someone like the Wellcome Trust or the Gates Foundation.
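To make the idea concrete, here is a minimal sketch of what querying such a database might look like. The journal names, field names, and values below are invented for illustration only; a real service would need agreed metadata fields and a proper API.

```python
# Hypothetical journal-facts database: not a whitelist or blacklist,
# just machine-readable facts you can filter on. All data is invented.

journals = [
    {"name": "Journal A", "retains_copyright": True,  "colour_figures": True,  "page_limit": None},
    {"name": "Journal B", "retains_copyright": False, "colour_figures": True,  "page_limit": 8},
    {"name": "Journal C", "retains_copyright": True,  "colour_figures": False, "page_limit": 25},
]

def matching(journals, **criteria):
    """Return the journals whose recorded facts satisfy every predicate."""
    return [
        j for j in journals
        if all(pred(j.get(field)) for field, pred in criteria.items())
    ]

# Example query: journals that let authors retain copyright and that
# do not impose a page limit of 10 pages or fewer.
hits = matching(
    journals,
    retains_copyright=lambda v: v is True,
    page_limit=lambda v: v is None or v > 10,
)
print([j["name"] for j in hits])  # ['Journal A', 'Journal C']
```

The point of the sketch is that once the facts are machine-readable, filters like these are trivial to build on top, whether by a funder-run service or a crowdsourced one.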

    … but Plosone and PeerJ gave us a reviewer that wants $300,000 in additional experiments. That never happens to you? It must not exist! No sympathy!

    As it happens, I am right now in the middle of a major slump in my own research, due to two sets of very critical reviews I received for papers that I’d submitted to PeerJ. They don’t demand that I spend money on experiments, but they do require a lot of additional work. (Note: for one of the papers involved, they are right to: it’s a classic example of a tough-but-fair review.) I think that goes with the territory, doesn’t it? If we sign up to pre-publication peer-review, we have to be prepared to suffer the consequences of pre-publication peer-review — whatever the journal.

  5. Pete Says:

    Apply critical thinking to all publishing avenues, and white / black lists too.

  6. No sympathy Says:

    1. Beall must be held accountable, and offer the public an explanation, and an apology for his silence. Not giving the global academic community, many of whom he ravaged, any notice that he was going to pull the plug is simply cowardice.

2. He should be criminally prosecuted. All the information on his blog, including comments, is public information that he and many others used to promote their often anti-OA vitriol, so a criminal investigation should take place to ensure the safety of that information. His hard drive should be seized by legal authorities.

    3. The community needs a white list AND a black list simultaneously. The DOAJ list is deeply flawed with many problematic entries, despite the 2014 purge.

    4. Whoever creates the lists must be industry neutral. No mainstream publishers, no interference from Editage, Publons, DOAJ, or any major publisher or rep from these industries. It should be an academia-run list: by the people, for the people. Enough special interests in publishing, especially OA which is the mega cookie jar of the next 5 years in publishing.

5. Beall caused a lot of damage and many used his flawed lists for official purposes. So, some papers deserve to be retracted, possibly even Beall’s.

  7. Mike Taylor Says:

    Well, No sympathy, I think I disagree with every one of your points. To take them individually:

1. It’s pretty certain that the disappearance of Beall’s list was due to legal threats, and quite likely that a bargain was struck where the case was dropped in exchange for Beall’s silence. It’s likely that if he spoke about it, he would be legally liable. Under those circumstances, he should not be compelled to offer an apology or an explanation. Honestly, the overnight disappearance of the site tells its own story.

    2. Nobody should ever be criminally prosecuted for stating facts (e.g. that a journal has claimed an impact factor when it doesn’t really have one) or indeed for stating opinions (even ill-founded ones against Open Access). This is pretty basic civics, and of course in the US is protected by the First Amendment.

3. As I have argued, we need simple critical thinking skills far more than we need either kind of list — and where lists stand as a substitute for critical thinking, they may even be a net negative. Meanwhile, if you have criticisms of the DOAJ, you should say what they are, not just hint darkly.

    4. First, I don’t agree that we need lists. But if we do: DOAJ is industry-neutral.

    5. Papers citing Beall’s list or using it as data have been perfectly clear about its subjectivity and other imperfections, and have invited readers to interpret their results with appropriate degrees of scepticism. Such papers certainly do not need to be retracted.


  8. Distinguishing predatory/questionable journals and publishers from sound ones may be relatively easy for researchers in the developed world, and should form part of all training in research and research communication. This isn’t the case, though, for researchers in the developing world, many of whom live in areas where there are conflicts and hardships of many kinds, as well as a lack of knowledge and good mentorship. INASP (http://www.inasp.info/en/), an international development charity working with a global network of partners in Africa, Asia and Latin America to improve access, production and use of research information and knowledge, reports that predatory/questionable journals are a serious problem for these countries; see e.g.
    http://blog.inasp.info/open-access-plays-vital-role-developing-country-research-communication/.
    The rise of predatory journals has also created problems for journals based in developing countries as they are being viewed more suspiciously. Legitimate journals there may not meet some of the criteria that would be expected of journals in the developed world. INASP is doing great work to address all these issues.

  9. protohedgehog Says:

    Thanks for this information, Irene, that’s really useful context here. Is there anything we can all do to assist the efforts of INASP and legitimate publishers in the developing world here..?


  10. Hi Jon,
    Generally, creating awareness of the issues and problems faced by researchers in developing countries, and taking them into account when talking about solutions to research communication/publication problems. For anyone wanting to help more specifically, I’m sure INASP would be delighted to hear from them, eg for the online courses or mentoring scheme – Andy Nobes would be a good person to contact http://www.inasp.info/en/staff/ .
    In my early days as a journal editor – pre-email/online – reviewing and author submission weren’t global. We couldn’t afford courier delivery, so sending manuscripts for review by post to distant parts of the world and getting reviews back took a long time; reviewers there therefore weren’t often used. Authors in those areas often couldn’t afford to prepare and send multiple copies of a manuscript with original photos. With the advent of the internet, reviewing and submitting became global; it would be terrible to move back to a more globally fractured research community with new hurdles affecting the disadvantaged.

  11. Fair Miles Says:

    It is indeed a strange system this we are in…

    I think I agree with everything you wrote. In fact it seems like pretty common sense for someone “in the business” for some time, though it is clearly not for many, many others (particularly in these accelerated times in which cultural change lags behind “sudden” changes due to globalisation and disruptive technologies). I wouldn’t have used “stupid” for that, but I guess it’s just a matter of tone (or, maybe, of a different academic or worldview perspective).

    However, these basic principles that you recommend lead us to a conservative state where big journals (now run by big companies) are (the only ones) to be trusted because they were big and trusted at t-1, so they get bigger and more trusted at t+1. They may eventually be tempted to use that power to favor other businesses than open communication for the benefit of the whole of science. I mean, maybe…

    Open access was/is not favored by such conservative politics. New, better journals or publishers are forced to get into the same “game of prestige” (and cash flow) to belong in the system, or they will get into the vortex at the end of the academic ladder. Scientific societies must decide to languish or to survive feeding on commercial bait. Successful scientists tend to select and promote similar people, perpetuating the rules of success. Are we really constrained by the Universe to honor Pareto and log-normals in every single application?

    And, still, I would recommend as you did. Strange, as I said…

  12. Mike Taylor Says:

    Fair Miles makes an important point:

    These basic principles that you recommend lead us to a conservative state where big journals (now run by big companies) are (the only ones) to be trusted because they were big and trusted at t-1, so they get bigger and more trusted at t+1.

    That is a real danger, and one that we do need to guard against as we seek to transform scholarly publishing into something fit for the 21st Century. (Seventeen years late, and counting.)

    In practice, though, I think this tends to work out. I think about the case of PeerJ, for example. I was keen to publish in it as soon as it was possible, even though it was new and had no reputation of its own, because being an open-access wonk, I knew who the founders were: Pete Binfield, previously EinC at PLOS ONE, and Jason Hoyt, previously technologist at Mendeley, back before it was acquired by the Evil Empire. And it doesn’t take many people to start taking such a venue seriously before it’s mainstream — as PeerJ has now very much become for vertebrate palaeontology.

    In practice, I think new low-cost open-access journals face exactly the same barriers to community acceptance as new high-cost subscription journals do — really, it’s all about how the community as a whole perceives them. Researchers new to a field need to get a feel for how the community they’re joining perceives journals. But wasn’t that always the case?

  13. AndrewD Says:

    The list is available on the Wayback Machine here:

    https://archive.is/6EByy

    I also have a copy on my hard drive.

  14. Bianca Kramer Says:

    Just wanted to add that in addition to DOAJ, there’s also QOAM, which attempts to make creating a journal database a crowdsourced effort. And at Publons, for example, you can now filter journals on open peer review policies. No comprehensive solutions, of course, and I agree with Mike that the only truly comprehensive solution is critical evaluation skills.

  15. Bianca Kramer Says:

    Also, thanks Irene for your perspectives! May I ask: do you think that something like ThinkCheckSubmit is in itself useful in the context of developing countries as well (recognizing the need for more training), or is it in its approach and/or content too Western-centered, and if so, in what regard? (I can see some possible issues but really want to learn here)

  16. Fair Miles Says:

    Bianca asked (Irene?):

    May I ask: do you think that something like ThinkCheckSubmit is in itself useful in the context of developing countries as well (recognizing the need for more training), or is it in its approach and/or content too Western-centered, and if so, in what regard?

    Not Irene, but I guess it is a useful tool while building that critical approach, though maybe somewhat redundant (you first have to be trained/tutored to get to it, and its value decreases when you already have some experience). In that regard, it is not very different from the white lists that Mike criticizes (I am not that harsh with them).

    The main problem (from my point of view in the category “Global South” & “Ecology”) is that it is another piece of the same Catch-22 I mentioned previously. Following its criteria will reasonably lead you to the big names, which are (mostly) subscription journals that many institutions/countries cannot or must not pay for, or open-access journals whose APCs are out of scale for your institution/grant/currency (and, critically, whose high IF is *highly* favored by researchers and authorities in the local/global institutions providing grants and jobs). So, boldly: if you agree with the system, you are out of the system.

    Green and “platinum/diamond” (no-APC) open access, or even cheap gold (which by its nature cannot pay for all the bells and whistles), all have the potential to break that conservative paradox and help develop the developing. Established powers and businesses will not agree, naturally… In this context, broadcasting the proliferation of predatory publishers and scams is very useful to instill fear, which is a well-known tool to keep things as they were “in the golden days” (make academic publishing great again!).

    Having said (all) that, I personally welcome tools like ThinkCheckSubmit, white lists such as DOAJ, or open databases, because they are informative and accessible, so they help in educating students and users. They will inform the reasonable way to do things in the system and grow your critical thinking (what Mike, myself, and many others would recommend to their colleagues and students wanting to stay in or get into academia). But that same critical thinking should tell you that the reasonable way to do things in the system will not change the system, nor the position you are supposed to occupy in it.

  17. Mike Taylor Says:

    “In that regard, it is not very different to the white lists that Mike criticizes (I am not that harsh with them).”

    To clarify: I am not against whitelists — I am just opposed to whitelists as a substitute for thinking (just as I am opposed to anything as a substitute for thinking).

  18. Mike Taylor Says:

    “… very useful to instill fear, which is a well-known tool to keep things as they were “in the golden days” (make academic publishing great again!)”

    Har!


  19. Most of the journals that were listed in the predatory journal list were obvious, and most scientists with some insight and credibility will avoid them. However, some of the journals are not so obvious. E.g. Oncotarget. It looks like a great journal: its IF is good and its editorial board is impressive. However, the journal’s editorial and reviewing practice has been questionable, the self-citation proportion is extremely high (24%) and publication charges are high. Beall’s list was a good starting point for an overview of problematic journals.

    The DOAJ “white list” is another way to check scientific journals. Almost 9,500 open access journals are listed here, each after an application. It is worth mentioning that Oncotarget does not figure on this list.

  20. Andy Nobes Says:

    Firstly, thank you to Irene for the mention and the good precis of some of the major issues for developing countries. As one of the founder members, INASP is completely behind the Think.Check.Submit initiative. Choosing a suitable target journal is obviously a vital step in research communication and dissemination – and a point which we make regularly on our online courses is that this is something to be considered at the beginning of the research project, not as an afterthought. So it totally makes sense to ask researchers to critically examine journals/publishers and decide for themselves whether they are the most suitable outlet for their work.

    Some thoughts in a personal capacity…

    I agree with the sentiment of this blog post, but I think it’s a wee bit harsh and unfair on some researchers in the developing world. It’s very easy for elite publishing professionals like us to dismiss predatory journals as an amateurish scam (which they are), but I think sometimes we also need to bear in mind the other 99% (yes, I’m using an Occupy analogy) of people involved in the academic world who don’t spend most of their day thinking about the publishing industry and publishing ethics. In response to the charge that all researchers who publish in predatory journals are being ‘stupid’, I would refer people to Phill Jones’s excellent blog post on information inequality which has some counter-perspectives on this: https://www.digital-science.com/blog/perspectives/predatory-publishing-isnt-the-problem-its-a-symptom-of-information-inequality/.

    Some comments have been made that researchers should already possess the critical awareness to tell good journals from bad. However, I think there is a key difference between having a critical knowledge of the scientific method, and having the skills needed to judge the quality of academic journals – so skills like digital literacy, and a decent understanding of the (constantly changing) landscape of scholarly publishing. Not forgetting of course that many researchers in the developing world still lack access* to some/many academic journals in the first place, so their knowledge of the literature, (and therefore an understanding of their ideal target journals) can often be incomplete.

    One of the reasons INASP launched the AuthorAID project back in 2007 (plug: more news shortly on our 10-year anniversary) was because there are critical gaps in knowledge and skills for research communication: for example, not all institutions in the developing world have the capacity to include research writing skills as part of their curriculum.

    Over the last few years I’ve read quite a few CVs of mid-career and experienced developing country researchers who have been applying to join one of our initiatives as a facilitator or a mentor. We have some superb mentors from Latin America, Africa and Asia (we will be interviewing more of these guys on the AuthorAID website this year), but it’s quite surprising how often, when looking through the application of a well-qualified researcher, that predatory journals unexpectedly appear on their CV. Sometimes this might be two, three or even four-plus journals (mostly the usual suspects) that appear on Beall’s List – and this is from researchers who want to support and advise younger researchers, which is somewhat worrying.

    Beall’s List was highly flawed – it captured the main players in the predatory journal industry quite well but couldn’t keep up with some of the new arrivals on the scene, and I think it was too harsh on some genuine, but low-quality regional publishers which were borderline or deserved the benefit of the doubt, and yet were never re-assessed. Some entries on the list were documented and well-justified, while others had no explanation or background, and the reason for their inclusion was not obvious or transparent.

    As one of my Latin American colleagues told me last week, Beall’s List was useful to lots of researchers as a first reference-point, but it wasn’t the only thing they checked, and people were becoming more and more aware of the flaws of the project (and Beall’s agenda). The blog was often useful for me, because he pointed out the obvious flaws of individual cases, so I didn’t need to do any additional delving and had an explanatory link to provide to researchers. Similarly, I suspect that it was also useful for researchers to give them a grasp of some of the examples of poor practices and red flags they should look out for.

    As Irene has pointed out above, we are also concerned about the effect on good-quality local journals that provide an important outlet for a lot of developing country research – they are sometimes viewed as less trustworthy as a result of the predatory journal boom, especially as they tend to be Open Access, and some have even started charging small APCs.

    It’s often interesting to see the websites that small local journals set up themselves to market their journal to the rest of the world. INASP has helped support the creation of national portals for journals in countries like Bangladesh, Sri Lanka and Nepal. These host journals using the Open Journal Systems (OJS) software, but some journals also create their own additional website for promotion.

    I know one good-quality journal which was one of the first in its country to get the Green Tick on DOAJ. I’ve met the editor – he’s a keen Open Access and CC-BY advocate. However, the first iteration of their website and new journal cover was a real shock, making all the classic mistakes you would expect on a predatory journal website: flashy graphics, too many poorly-resized pictures, and the homepage (and journal cover) plastered with logos of every conceivable indexing service they had an association with (including Crossref and Google, for example). I knew this was a good journal, but the website was simply not credible, so we strongly advised them to clean up the site to avoid being mistaken for a predatory journal. This felt wrong (and somewhat neo-colonial) – ‘professional’ website design as we know it is expensive, and what’s wrong with creating a website which appeals to your target audience, in the style they are familiar with? (And to be fair, a splash of colour and flashing lights are used often in daily life in said country, especially when marketing a product.) I think we need to bear in mind that users from the Global South can sometimes have quite different experiences and expectations of ‘credibility’ on the internet, both as creators and users of content (and of course as consumers looking for a service).

    Going back to researchers, we hope and assume that tutors and supervisors can provide some basic training to their students on how to choose a target journal, but one of the common themes I’ve heard from researchers is that their supervisors and tutors tend to be very much in the ‘old school’ or ‘traditional’ mould of academic, with very limited understanding of digital publishing. Online and Open Access publishing is often badly misunderstood – even at the highest levels. Note, for example, the decision in 2015 of the Medical Council of India to refuse to recognise publications in ‘e-journals’ (without a print version) when counting publications for promotion: http://www.bmj.com/content/352/bmj.i344. There is of course a larger discussion around the role of institutional promotion committees creating more stringent criteria for research output and/or creating their own whitelists of journals within departments, but I won’t go into that here.

    Anyway, as we might have expected, the last couple of weeks has seen an increase in questions on the AuthorAID discussion list regarding predatory journals and conferences – these generally start with “is x a predatory journal?”, but the most recent post is from a researcher who accidentally submitted to a predatory journal on the basis that his tutor had published there (and he wants to know if he can withdraw the paper). If you want to get a feel for the kind of issues that are being discussed, feel free to register at https://dgroups.org/groups/authoraiddiscussion. You can just read posts, or you can join in with advice if you feel it would be appropriate.

    So there are a few things to consider from a developing country perspective. I guess what I am saying is that researchers, at all levels, still need training and support on good publishing practices so that they can apply their critical faculties based on up-to-date information. This includes resources like Think.Check.Submit, but I’m sure there is room for more educational initiatives. For example, there could be further guidance on how to critically examine a website, spotting bad publishing practices, or just more up-to-date information on the world of academic publishing, but in plain language (and not just in English, of course). There is also plenty of evidence that the first go-to source for academic information for developing country researchers is Google, which of course is a prime marketplace for predatory journals – a real leveller (equality isn’t always a good thing). Building researchers’ information literacy skills is probably something that librarians could help more with (see for example the resources from previous INASP workshops: http://www.inasp.info/en/training-resources/courses/118/ and http://www.inasp.info/en/training-resources/courses/127/ ).

    Re: Bianca’s reply – it’s a good point about Think.Check.Submit. potentially being western-oriented. I haven’t seen any specific feedback on this from the researchers we work with. The feeling I have got is that T.C.S is well regarded, and one of several resources that are being used. The last couple of researchers I have spoken to have mentioned that the video is very helpful.

    Finally, we regularly hear is that academic mentors are in short supply in the developing world. If you feel like you could provide early career researchers with help and advice on navigating the publishing process, and how to find a good journal, why not sign up as an AuthorAID mentor? http://www.authoraid.info/en/mentoring/ :)

    * Of course, this access problem is an issue that INASP, R4L and publishers continue to work together to improve.

  21. Mike Taylor Says:

    Thanks, Andy, for this extremely detailed comment! I have to respond to this, though:

    It’s very easy for elite publishing professionals like us to dismiss predatory journals as an amateurish scam (which they are), but I think sometimes we also need to bear in mind the other 99% (yes, I’m using an Occupy analogy) of people involved in the academic world who don’t spend most of their day thinking about the publishing industry and publishing ethics.

    First, as a point of information, I should note that I am about as far from being an “elite publishing professional” as it’s possible to be. I’ve never so much as been on the editorial board of any journal; and even as a researcher I am at the absolute bottom of the totem pole, being an unfunded honorary postdoc with no aspiration towards a job in academia. In other words, I am very much part of that 99%, and probably in the bottom 10% of that.

    Second, and much more important, the thing about avoiding predatory journals is that it’s so, so easy to fail safe. You talk about “the skills needed to judge the quality of academic journals – so skills like digital literacy, and a decent understanding of the (constantly changing) landscape of scholarly publishing”. But you don’t need to understand the commercial implications of the Nature/Springer merger to recognise “This journal looks a bit dodgy, I’d better use one that my supervisor trusts instead”. The necessary skills can be taught in a single session — realistically, in ten minutes or indeed one minute if the student is paying attention. It’s not rocket science. It is exactly on a par with “if a Nigerian prince offers you $10M, don’t send him a thousand bucks to release the funds”.

  22. No sympathy Says:

    It’s astonishing to see the flawed Beall lists continue to be cloned and promoted, as if they were valuable:
    http://beallslist.weebly.com/

    They should be totally erased.

    It’s equally astonishing that Beall stays silent. When will he address the public?

  23. Mike Taylor Says:

    It is disappointing that Beall hasn’t said anything in public; but as I explained above, it’s hardly astonishing. It’s likely that his silence is part of a deal that he struck to avoid a potential lawsuit.


  24. Andy, that’s a terrific contribution that provides really valuable information and insight, especially as it’s based on your and INASP’s real-life experiences.

    Bianca, ThinkCheckSubmit is a useful concept, but I agree it’s rather Western (developed country)-centred and assumes knowledge of a number of terms and concepts, e.g. checking if journals ‘Are actively indexed in services that you use’. The checklist is rather minimal, and no indication is given of the weighting to give to the different items. It would also be valuable to provide pointers to information that would be helpful to researchers who aren’t that knowledgeable, such as those in developing countries, or graduate students who don’t have good mentors or who work in relative isolation. I can’t see anything like this, or suggested reading, on the site.

    Most fake or suspect journals won’t be sending out review requests, as they don’t carry out peer review, but based on Mike’s experience it seems that some are. I’ve felt for a while that a ThinkCheckReview checklist would be helpful, and not just for questionable journals but for all, so that researchers are aware of various things and can check what they’re signing up to – and are happy with this – before agreeing to review.

  25. Andy Nobes Says:

    Hi Mike,

    Yes, the phrase ‘elite publishing professionals’ was a bit cheeky. Perhaps forgetting the ‘professionals’ bit for now, I’ve taken a look at your publication history and I’m pretty sure you would be in the top 1% of researchers in terms of knowledge of publishing practices.

    What I am arguing is that a lot of ECRs (and their tutors, in a lot of cases) do not have the basic background knowledge to say “this journal looks a bit dodgy”. I don’t think critical thinking on its own is a transferable skill here, because you need some hard background knowledge of whatever you are trying to critically examine.

    Researchers sometimes have too much trust in the idea of an academic journal. Think of anybody with poor digital literacy skills – I’ll give the example of one of my older family members. He is somewhat cautious and cynical about most things, but it took a long time to teach him the difference between an important Windows warning message and a fake malware popup on his computer (which I can tell you is a difficult thing to teach, especially when these popups look very similar, or call themselves ‘Windows warning’). This led to him *almost* being scammed by a fake IT company. He didn’t have the knowledge or experience to say ‘that looks a bit dodgy’ – he mostly trusted what the computer was telling him.

    I didn’t even get into the issue of language in my comment above. It takes me about 5 seconds to spot BS and filler in a journal’s aims and scope, a conference ‘about’ page, or a CFP email. But that’s because I’m a native English speaker with experience of the style in which these things are usually written and the formal conventions that are often used. Imagine what it would be like for a researcher with basic English skills who is looking for a journal in which to publish their first research paper, and whose supervisor doesn’t know much about the internet.

    Going back to your comment (on Twitter) that it should take a tutor ten minutes to explain how to find a journal – do you think this could be achieved through a 10 minute video? Perhaps building on the current Think.Check.Submit. video (or a TCS+) and providing specific examples, good and bad (using mock ups to avoid being sued)?

  26. Mike Taylor Says:

    Well, Andy, you make good points, especially about non-English speakers searching for an English-language journal.

    But I am still left with this: it’s easy to fail safe. Of course we don’t expect every researcher to be able to determine with 99% accuracy whether a given journal is legitimate or not. But we don’t need to be able to do that. For a typical working researcher, all they need is the ability to detect a questionable odour and say “This journal might be fine, but I’m not taking the chance, I’ll submit elsewhere”. A dozen false negatives don’t harm a researcher; all they need to do is avoid the false positive of deciding a dodgy journal is legit.

    And again I come back to the supervisors. How did they ever get the job of supervising researchers if they don’t know the basics of their field, like what the recognised journals are?


  27. Thank you all for the great discussion here, particularly regarding the Global South. My colleague Andy Nobes has shared the issues and challenges that we at INASP see faced by many researchers in low and middle income countries. Lack of awareness of publishing practices, lack of experienced colleagues and mentors and lack of English language skills are all very real barriers that can make it much harder for these researchers to distinguish between reputable journals and those seeking to deceive.

    Thank you also for the discussion about Think. Check. Submit. The desire to avoid a Western/Northern bias was part of the reason that INASP is one of the founder members and this is an important role that we play. This discussion thread has raised some great suggestions that the TCS committee will discuss at our next meeting. Please do get in touch with me or anyone else on the TCS committee if you have more ideas. Think. Check. Submit. is always interested in constructive feedback as we seek to develop the campaign in ways that are useful to researchers all over the world.

    On a more personal note, I would like to echo Andy’s call for people to volunteer as AuthorAID mentors (www.authoraid.info/en/mentoring). Experienced mentors can make a big difference in guiding early-career researchers in good research writing and publication practices. It is also immensely satisfying and enjoyable. As an example of the kind of help we can provide, you might be interested to read this recent blog post that I wrote based on a response that I gave to one of my mentees who asked advice about where to publish: http://www.authoraid.info/en/news/details/1153.

  28. YQ. Says:

    I came here from another post, “Why did Beall’s List of potential predatory publishers go dark?”… My point: whether “whitelist” or “blacklist”, a useful tool that helps us find a proper journal quickly is a good thing. In fact, I added Beall’s list, DOAJ (https://doaj.org/), OA Publications’ Ranking (http://www.oalib.com/rank/showKeywordsOfJournal) and some OA publishers (such as OMICS, MDPI, Hindawi, SCIRP…) to my Favorites. However, when I need to find a journal in which to publish my research paper, I still prefer to type keywords and search via Google.

  29. Lazza Says:

    Well, the list was not only used by researchers but also by students. For instance, when I started working on my MSc thesis, the first paper I found looked interesting… and I was about to include it in my study material when I determined, thanks to the list, that the journal was not reputable.

    The funny thing though is that the bibliography of the first “paper” I found actually cited some papers published by much more respectable journals, so it was useful to find interesting material after all. :)

  30. Robert L.Vadas, Jr. Says:

    Certainly, I now look at Beall’s list whenever I examine a scientific journal for potential future publication. But I also go to the journal website itself to examine a few papers. Telltale signs of predatory status include misspellings in the abstract and the lack of an ‘Acknowledgments’ section that addresses peer reviewers.

    -Bob Vadas, Jr.

