Ruth’s Rankings 37 – Part 2: How Important are Journal Quality Metrics in the Era of Potential/Possible/Probable Predatory Publishers and Publications?

By Ruth A. Pagell*

(21 October 2018)

  • A chaired professor at a distinguished British university is listed as the international advisor for a journal even though he has tried for years to remove his name.
  • A new online directory lists a major database vendor as a sponsor even after a “cease and desist” order.
  • Many journals list impact factors on their websites even though the Impact Factor is “owned” by Journal Citation Reports.
  • 80% of the journals in one blacklist are “Open Access.”

Ruth’s Rankings 37 Part 1 updated earlier articles on journal metrics and introduced new sources of metrics.  This article discusses what I will refer to as questionable publications. It reports on how various compilers of journal lists evaluate titles for inclusion.  It contains Appendix 1, with sources of white (reliable), black (questionable), and gray (non-judgmental) lists of publications, and Appendix 2, with feedback directly from vendors. The article concludes with my opinion of the role of journal metrics and the roles that rankers, academic administrators and information professionals can play in eradicating this invasive species.

If you are affiliated with a university or have published a paper in an academic journal, you have probably received emails asking you to submit a paper to a journal you may never have heard of, promising quick turnaround to publication for a fee that is often requested along with the manuscript.  The journal may have a real ISSN, assigned by the ISSN International Centre; there are over two million ISSNs, with 70,000 new ISSNs issued per year.  Or a publisher may simply make up an ISSN.
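
Crudely fabricated ISSNs are often detectable: the eighth character of a real ISSN is a check digit computed from the first seven (defined in ISO 3297). A minimal Python sketch of that check follows; note that passing it shows only that the number is structurally valid, not that the ISSN International Centre ever assigned it.

    # Minimal sketch: validate an ISSN's check digit (ISO 3297).
    # Passing proves only structural validity, not that the ISSN
    # International Centre ever assigned the number.
    def is_valid_issn(issn: str) -> bool:
        chars = issn.replace("-", "").upper()
        if len(chars) != 8 or not chars[:7].isdigit():
            return False
        # Weight the first seven digits 8 down to 2; the check digit
        # is (11 - sum mod 11) mod 11, written as "X" when it is 10.
        total = sum(int(d) * w for d, w in zip(chars[:7], range(8, 1, -1)))
        check = (11 - total % 11) % 11
        return chars[7] == ("X" if check == 10 else str(check))

    print(is_valid_issn("0028-0836"))  # Nature: True
    print(is_valid_issn("1234-5678"))  # fails the checksum: False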

A cottage industry has evolved, extending from publishers and journals to conferences, “impact factors” and a directory promoting many of these publications. Some of these journals are just bad science.  Some of the researchers are being duped, while others are responding to the pressure to get articles published.

There is no one authoritative list of all the characteristics of quality journals.  Existing lists of the publications included in research datasets, for example from Clarivate Analytics or Elsevier, can be considered “white lists”.  These vendors have editorial teams, entrance criteria and annual monitoring of journal behavior (Testa, 2018; Content…, 2018).  Shared characteristics include peer review, publishing standards, regular publication and international content. Scopus also uses quantitative measures.

A title that is suppressed in JCR may still appear up to date in Web of Science (WOS). Scopus has several tools available, listed in Appendix 1.

Many potential authors, especially those who are not affiliated with the world’s top universities, do not have access to the content of these sources, but they do have access to CA’s JCR and Scopus journal lists. Professional associations that create databases distributed through subscription services also post free lists of the journals they index. For example, the U.S. National Library of Medicine has a list of all the journals in PubMed.
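
Such lists can also be queried programmatically. As one illustration, here is a minimal Python sketch that asks NCBI’s E-utilities whether an ISSN appears in the NLM Catalog; the “nlmcatalog” database name and “[ISSN]” field tag follow NCBI’s documentation, but treat the exact query syntax as an assumption, and note that the NLM Catalog covers more titles than those currently indexed for MEDLINE/PubMed.

    # Minimal sketch: look up a journal by ISSN in the NLM Catalog
    # through NCBI E-utilities (esearch). The NLM Catalog includes
    # more titles than those currently indexed for MEDLINE/PubMed.
    import json
    from urllib.parse import urlencode
    from urllib.request import urlopen

    def nlm_catalog_count(issn: str) -> int:
        """Return the number of NLM Catalog records matching an ISSN."""
        params = urlencode({"db": "nlmcatalog",
                            "term": f"{issn}[ISSN]",
                            "retmode": "json"})
        url = ("https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?"
               + params)
        with urlopen(url) as resp:
            return int(json.load(resp)["esearchresult"]["count"])

    print(nlm_catalog_count("0028-0836"))  # Nature: at least one record expected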

Google Scholar SHOULD NOT be used as a reliable source.  A journal that is unquestionably predatory has thousands of citations on Google Scholar.  This is the journal that will not remove the name of its international advisor and that claims to be in DOAJ and Cabells when it is not.  I did not go looking for this example.  It came to me in one of those email requests to submit an article.

QUESTIONABLE PUBLICATIONS

There is no one list of questionable publications or publishers. The original list of predatory publications was Beall’s list, the work of the scholarly communications librarian at the University of Colorado Denver. Jeffrey Beall’s 2012 article in Nature introduced the topic to the scholarly community.  The list was pulled from the web in 2017, but back-up copies remain.  For a detailed list of white, black and gray lists, see Appendix 1.  Questionable journals may profess to have peer review but then promise a few weeks’ turnaround time to publication.  They may ask for payment up front or not tell the author what the payment will be until the article is accepted.  If they have an editorial board, the editors may not know they have been included, or all the editors will come from a few institutions you may not have heard of.  There may be no editor’s name, or the same editor may serve many journals from the same publisher.  More details are in Appendix 1.

Many organizations, including DOAJ (Directory of Open Access Journals), ISSN and Springer Nature, recommend the do-it-yourself help site Think. Check. Submit. for determining whether or not to use a journal unfamiliar to you.

Open access and questionable publications

No discussion of questionable journals can avoid a reference to the Open Access model of journal publishing. Olijhoek, the editor-in-chief of DOAJ, and Tennant, founder of the Open Science MOOC, are concerned that the focus on predatory publishing defames open access (Olijhoek & Tennant, 2018).  I want to separate the two issues, since being open access does not imply that a journal is predatory. However, open access has, by definition, opened up the possibilities of disseminating “scholarly literature” beyond the boundaries of traditional databases. If 80% of Cabells’ blacklist consists of open access publications, it does not mean that 80% of open access journals are questionable. Open access is no more the cause of questionable publications than Facebook is the cause of fake news. The internet is the facilitator, providing minimal barriers to entry and minimal oversight.

In October 2018, DOAJ listed 12,198 journals.  At the same time, Cabells’ blacklist, which is still being compiled, had over 7,600 open access publications.  Many journals are on neither list.
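
A journal’s claim of DOAJ inclusion, like the one in the email example above, can be verified in seconds on the DOAJ website or through DOAJ’s free public search API. A minimal Python sketch follows; the endpoint path and the “total” field reflect DOAJ’s API documentation at the time of writing and may change.

    # Minimal sketch: ask DOAJ's public search API whether any journal
    # record matches a given ISSN. A predatory journal falsely claiming
    # DOAJ inclusion would return False here. Endpoint path and response
    # shape follow DOAJ's documentation and may change.
    import json
    from urllib.parse import quote
    from urllib.request import urlopen

    def in_doaj(issn: str) -> bool:
        """Return True if DOAJ's search API finds a journal with this ISSN."""
        url = "https://doaj.org/api/search/journals/" + quote(f"issn:{issn}")
        with urlopen(url) as resp:
            return json.load(resp).get("total", 0) > 0

    print(in_doaj("1932-6203"))  # PLOS ONE: expected True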

For 2017-2018, 11% of the articles in InCites are Open Access, as are 6% of the journals.  11% of all of the titles in the current CiteScore are open access. While there is no guarantee that a few predatory journals will not find their way into these databases, the risk is small. As more funders and the EU press for all-open-access models by 2020, the predators may be a step ahead.

We do not want to be censors of poor scholarship, but we do need mechanisms to identify and publicize both quality output and fraudulent practices. This is where metrics from our well-known metric leaders, such as CA, Elsevier, CWTS and SIR, come into play.  In addition to their lists of journals with citation-related metrics and inclusion criteria, CA and Elsevier have lists of journal titles that have been suppressed. We also have to consider the contribution of newcomer Dimensions, which adds a more modern approach to trying to measure value but with more latitude for questionable titles.

Citation Metrics and Questionable publications

Journal Citation Reports, the owner of the ONLY official Journal Impact Factor (JIF), uses its own inclusion criteria based on citation behavior.  The JIF team differs from the Web of Science team.  Note that Elsevier (CiteScore), CWTS Leiden (SNIP), Scimago (SJR) and Dimensions (FCR) use different designations for their journal impact scores. Depending on the subject matter, metrics such as percent of documents cited may be more important than the impact factor.

For authors without subscriptions to JCR, individual journals may list their JIF scores on the publisher’s website.  Examples are Wiley publications and Elsevier journals; Elsevier journals also display their SciVal metrics.

See Appendix 1 for white lists, black lists and gray lists, such as Ulrich’s, which relies on ISSNs, and WorldCat, which pulls records from library catalogs.

See Appendix 2 for quotes from representatives from CA, Elsevier, Dimensions, Ulrichs and Cabells.

CONCLUSION

As I spent more time researching this topic, I realized that this is a gray area in terms of what is predatory and whether it matters.  Is it overrated (Olijhoek & Tennant, 2018) or a problem that cannot be ignored (Anderson, 2018)?  Are the authors young and inexperienced (Xia, 2014), or are experienced researchers also appearing in these publications (Perlin, Imasato, & Borenstein, 2018)?

In our current intellectual – or anti-intellectual – environment, it is important for all players – publishers (Chandrakumar, ’t Jong & Klassen, 2018), editors, authors, administrators and librarians – to maintain scholarly standards.  Librarians contributed to the problem through their enthusiastic embrace of open access as a way to control journal costs and to limit the constraints of copyright. Librarians can now contribute to containing the problem. They can apply the same criteria they used for selecting subscription publications to open access models and educate their users on how to identify questionable publications.  For authors under pressure to publish, the low barriers to entry, quick turnaround and superficial, if any, peer review, make open access appealing.   Administrators need to realize that more articles do not mean better articles.  While many of our ranking and directory sources do not want to be arbiters of what is predatory or just bad science, they too have a responsibility to set standards.

“Publishers who make a profit from intentionally manipulating the system distort the scientific record” (N. Quaderi, Oct 2018).

Journal metrics, with all their known flaws, still play an important role in identifying quality publications.

REFERENCES

Alecci, S. (20 July 2018). New international investigation tackles ‘fake science’ and its poisonous effects.  International Consortium of Investigative Journalists Blog accessed at https://www.icij.org/blog/2018/07/new-international-investigation-tackles-fake-science-and-its-poisonous-effects/

Altbach, P.G. & de Wit, H. (7 Sept 2018).  Too much academic research is being published. University World News. (519) accessed at http://www.universityworldnews.com/article.php?story=20180905095203579

Anderson, R. (7 Aug 2018). Denialism on the Rocks: It just got a lot harder to pretend that predatory publishing doesn’t matter.  Scholarly Kitchen accessed at https://scholarlykitchen.sspnet.org/2018/08/07/denialism-rocks-just-got-lot-harder-pretend-predatory-publishing-doesnt-matter/

Beall, J. (12 Sep 2012).  Predatory publishers are corrupting open access. Nature 489 accessed at https://www.nature.com/news/predatory-publishers-are-corrupting-open-access-1.11385

Beatty, S.  (24 October 2016).  Is a title indexed in Scopus? A reminder to check before you publish. Scopus Blog https://blog.scopus.com/posts/is-a-title-indexed-in-scopus-a-reminder-to-check-before-you-publish

Beatty, S. (12 June 2017).  Re-evaluation: Scopus checks and balances: Maintaining quality content on Scopus.  Scopus Blog https://blog.scopus.com/topics/re-evaluation

Chandrakumar, A., ’t Jong, G.W. & Klassen, T.P. (Oct 2018).  The role of mainstream publishers in eliminating the threat of predatory publishing.  Canadian Journal of Diabetes. 42(5) 457-458 accessed at https://www.canadianjournalofdiabetes.com/article/S1499-2671(18)30694-4/fulltext

Content policy and selection (2018).  Accessed at https://www.elsevier.com/solutions/scopus/how-scopus-works/content/content-policy-and-selection

Crotty, D. (14 August 2018).  Revisiting: Six years of predatory publishing. Scholarly Kitchen accessed at https://scholarlykitchen.sspnet.org/2018/08/14/revisiting-six-years-predatory-publishing/

Davis, P. (8 Oct 2018).  Tipping the scales: Is impact factor suppression biased against small fields?  Scholarly Kitchen

Eriksson, S. and Helgesson, G. (2017).  “The false academy: predatory publishing in science and bioethics”.  Medicine, Health Care, and Philosophy 20(2) 163-170 accessed at https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5487745/

Frandsen, T.F. (2017).  “Are predatory journals undermining the credibility of science? A bibliometric analysis of citers”. Scientometrics. 113, 1513-1528 accessed through ResearchGate.

Kokol, P., Završnik, J., Žlahtič, B. and Blažun Vošner, H. (June 2018).  “Bibliometric characteristics of predatory journals”. Pediatric Research 83, 1093-1094, article citation and references accessed at https://www.nature.com/articles/pr201854

Olijhoek, T. and Tennant, J. (25 September 2018). The ‘problem’ of predatory publishing remains a relatively small one and should not be allowed to defame open access.  LSE Impact Blog

Pagell, R.A. (2014).  Insights into InCites. Online Searcher 38(6) 16-19 available through ResearchGate

Perlin, M.S., Imasato, T. & Borenstein, D. (2018).  Is predatory publishing a real threat? Evidence from a large database study.  Scientometrics, 116, 255-273; summarized in University World News, 21 Sept 2018, http://www.universityworldnews.com/article.php?story=20180918144241202

Quaderi, N. (4 Oct 2018).  Conversation (see more in Appendix 2).

Shen, C. & Björk, B-C. (2015).  ‘Predatory’ open access: a longitudinal study of article volumes and market characteristics.  BMC Medicine 13:230, DOI 10.1186/s12916-015-0469-2, or accessed at https://kopernio.com/viewer?doi=10.1186/s12916-015-0469-2&route=6

Steiginga, S. (17 April 2017) The Scopus solution to predatory journals, PowerPoint presented at Scopus International Seminar, Moscow retrieved at https://conf.neicon.ru/materials/28-Sem0417/170417_0930_Steiginga.pdf

Testa, J. (26 June 2018).  Journal selection process. Clarivate Analytics, https://clarivate.com/essays/journal-selection-process/

Xia, J. et al. (2014).  Who publishes in predatory journals?  Journal of the Association for Information Science and Technology, 66(7), 1406-1417, DOI 10.1002/asi.23265, accessed at https://scholarworks.iupui.edu/bitstream/handle/1805/9740/Xia_2015_who.pdf?sequence=1


 Ruth’s Rankings

  1. Introduction: Unwinding the Web of International Research Rankings
  2. A Brief History of Rankings and Higher Education Policy
  3. Bibliometrics: What We Count and How We Count
  4. The Big Two: Thomson Reuters and Scopus
  5. Comparing Times Higher Education (THE) and QS Rankings
  6. Scholarly Rankings from the Asian Perspective 
  7. Asian Institutions Grow in Nature
  8. Something for Everyone
  9. Expanding the Measurement of Science: From Citations to Web Visibility to Tweets
  10. Do-It-Yourself Rankings with InCites 
  11. U.S. News & World Report Goes Global
  12. U-Multirank: Is it for “U”?
  13. A Look Back Before We Move Forward
  14. SciVal – Elsevier’s research intelligence – Mastering your metrics
  15. Analyzing 2015-2016 Updated Rankings and Introducing New Metrics
  16. The much maligned Journal Impact Factor
  17. Wikipedia and Google Scholar as Sources for University Rankings – Influence and popularity and open bibliometrics
  18. Rankings from Down Under – Australia and New Zealand
  19. Rankings from Down Under Part 2: Drilling Down to Australian and New Zealand Subject Categories
  20. World Class Universities and the New Flagship University: Reaching for the Rankings or Remodeling for Relevance
  21. Flagship Universities in Asia: From Bibliometrics to Econometrics and Social Indicators
  22. Indian University Rankings – The Good the Bad and the Inconsistent
  23. Are Global Higher Education Rankings Flawed or Misunderstood?  A Personal Critique
  24. Malaysia Higher Education – “Soaring Upward” or Not?
  25. THE Young University Rankings 2017 – Generational rankings and tips for success
  26. March Madness – The rankings of U.S. universities and their sports
  27. Reputation, Rankings and Reality: Times Higher Education rolls out 2017 Reputation Rankings
  28. Japanese Universities:  Is the sun setting on Japanese higher education?
  29. From Bibliometrics to Geopolitics:  An Overview of Global Rankings and the Geopolitics of Higher Education edited by Ellen Hazelkorn
  30. Hong Kong and Singapore: Is Success Sustainable?
  31. Road Trip to Hong Kong and Singapore – Opening new routes for collaboration between librarians and their stakeholders
  32. The Business of Rankings – Show me the money
  33. Authors:  Part 1 – People and processes
  34. Authors: Part 2 – Who are you?
  35. Come together:  May updates lead to an investigation of Collaboration 
  36. Innovation, Automation, and Technology Part 1: From Scholarly Articles to Patents; Innovation, Automation, and Technology Part 2: Innovative Companies and Countries
  37. How Important are Journal Quality Metrics in the Era of Predatory Journals? Part 1: Journal Citation Metrics; Part 2: How Important are Journal Quality Metrics in the Era of Potential/ possible/ probable predatory publishers and publications?

*Ruth A. Pagell is emeritus faculty librarian at Emory University.  After working at Emory, she was the founding librarian of the Li Ka Shing Library at Singapore Management University and then taught as adjunct faculty in the Library and Information Science Program at the University of Hawaii.  She has written and spoken extensively on various aspects of librarianship, including contributing articles to ACCESS – https://orcid.org/0000-0003-3238-9674