By Ruth A. Pagell*
(20 May 2020) Journal quality has traditionally been measured by output, from counting pages and articles to calculating normalized citation impact factors. The Center for Open Science (COS) created TOP Factor to measure journal quality through journals' publishing guidelines instead.
There are four aspects of TOP Factor to consider:
- The underlying concepts of Transparency and Openness, including a new vocabulary, adapted from the 2015 TOP Guidelines
- The participants in the project
- The presentation of the information
- The integration of TOP Factor into Clarivate Analytics’ (CA) Master Journal List (see Appendix 45-1)
Output metrics are the traditional basis for journal evaluation. They measure journal impact through citations, often normalized by subject category and number of publications, and may include additional indicators.
Journals strive for inclusion in Journal Citation Reports to receive JCR's Journal Impact Factor (JIF).¹ Free journal evaluation tools using Scopus data include Elsevier's CiteScore, SNIP (Source Normalized Impact per Paper, from CWTS Journal Indicators), and SJR (SCImago Journal Rank). JCR has about 11,500 current titles; Scopus has about 23,000 active journals.
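For readers unfamiliar with the arithmetic behind the two-year JIF, here is a minimal sketch with made-up counts (the real calculation, of course, runs over Web of Science citation data):

```python
# Two-year Journal Impact Factor (JIF) arithmetic, with made-up numbers.
# JIF for year Y = citations received in Y by items published in Y-1 and Y-2,
# divided by the number of citable items published in Y-1 and Y-2.
citations_2019_to_2017_18 = 1200  # hypothetical citations in 2019 to 2017-2018 items
citable_items_2017_18 = 400       # hypothetical articles + reviews, 2017-2018

jif_2019 = citations_2019_to_2017_18 / citable_items_2017_18
print(f"JIF 2019 = {jif_2019:.1f}")  # JIF 2019 = 3.0
```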
Ruth’s Rankings 3 (2014), 16 (2016), and 37 (2018) provide background for this article.
In February 2020, COS rolled out TOP Factor (Transparency and Openness Promotion). Instead of measuring output, it applies qualitative input categories based on journals' published publishing guidelines. Journals receive RATINGS, not rankings.
The differences between RATINGS and RANKINGS:
- Journal rankings compare journals based on a set of indicators, calculating a specific score that determines the ranking order.
- Journal ratings assess journals against standards, and a score may be given, much like a grade. TOP uses 1, 2, or 3; U-Multirank rates universities from A to D.
- QS further clarifies the difference: rankings ask, "Who is the best?" while ratings ask, "Who is good at what?" (Linney). See Exhibit 45-1 (in pdf).
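The distinction is easy to demonstrate in code. The sketch below uses hypothetical journals and hypothetical scores: a ranking sorts the journals against each other, while a rating grades each one against fixed thresholds.

```python
# Hypothetical journals with hypothetical scores.
scores = {"Journal A": 87, "Journal B": 92, "Journal C": 61}

# RANKING: order journals against each other ("Who is the best?").
ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking)  # ['Journal B', 'Journal A', 'Journal C']

# RATING: grade each journal against a fixed standard ("Who is good at what?").
def rate(score):
    if score >= 90:
        return "A"
    if score >= 75:
        return "B"
    if score >= 60:
        return "C"
    return "D"

ratings = {journal: rate(score) for journal, score in scores.items()}
print(ratings)  # {'Journal A': 'B', 'Journal B': 'A', 'Journal C': 'C'}
```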
TIMELINE – From DORA to TOP Factor
TOP Factor is the outcome of initiatives started in 2012 to find alternatives to the Journal Impact Factor (JIF).
2012 Scholarly journal editors and publishers attending The American Society for Cell Biology (ASCB) conference expressed the need to find alternatives to JIF (Mudditt, 2020).
2013 The release of DORA, the San Francisco Declaration on Research Assessment, followed from that meeting.
2014 A meeting organized by the Berkeley Initiative for Transparency in the Social Sciences, Science Magazine, and COS led to the formation of the Transparency and Openness Promotions Guidelines Committee.
2015 With the first release of the TOP Guidelines in April, Science published an explanation of the need for new policies and introduced the Guidelines (Nosek et al., 2015).
2018 The most current version of the Guidelines was released (DeHaven, 2018).
2020 In February, the TOP Factor dataset was formally released with 250 titles; 358 journals are rated as of May, and more will be added throughout the year. In May, TOP Factor was integrated into CA's Master Journal List (see Appendix 45-1).
CONCEPTS: Transparency and Openness Promotion Guidelines
The Guidelines, called Standards in TOP Factor, introduce new jargon and a new perspective: the focus is not the journals' published articles but how their publishing guidelines promote transparency, openness of content, and data for replication (Mellor, Nosek, and Pfeiffer).
Standards are applied at one of three levels (a minimal scoring sketch follows this list):
- Level 1 recommends use of a standard;
- Level 2 requires adherence to the standard; and
- Level 3 requires and enforces adherence to the standard.
- “0” indicates not implemented.
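Here is that scoring in a minimal sketch for a hypothetical journal. I am assuming, for illustration, that a journal's overall TOP Factor is simply the sum of its per-standard levels:

```python
# Hypothetical journal policy levels for the ten TOP Factor standards
# (0 = not implemented, 1 = recommended, 2 = required, 3 = required and enforced).
levels = {
    "Data citation": 2,
    "Data transparency": 3,
    "Analysis code transparency": 1,
    "Materials transparency": 1,
    "Design and analysis guidelines": 2,
    "Study preregistration": 0,
    "Analysis plan preregistration": 0,
    "Replication": 1,
    "Registered reports": 3,
    "Open science badges": 1,
}

# Assumption: the journal's overall score is the sum of its per-standard levels.
top_factor = sum(levels.values())
print(top_factor)  # 14 for this hypothetical journal
```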
Journals choose which standards to follow and the level of compliance. See Exhibit 45-2 (in pdf) for the summary table, or view the summary table for the Guidelines online (in pdf). TOP Factor uses slightly different wording for the Standards and includes two additional categories, Registered reports and Open science badges, as shown below.
| TOP Standards (from TOPfactor.org) | TOP Guidelines |
| --- | --- |
| 1. Data citation | Citation |
| 2. Data transparency | Data transparency |
| 3. Analysis code transparency | Analytic methods (code) transparency |
| 4. Materials transparency | Research materials transparency |
| 5. Design and analysis guidelines | Design and analysis transparency |
| 6. Study preregistration | Preregistration of studies |
| 7. Analysis plan preregistration | Preregistration of analysis plans |
| 8. Replication | Replication |
| Registered reports | Not an original guideline; peer review before results are known |
| Open science badges | Not an original guideline; used by 67 journals to reward individual authors |
PARTICIPANTS: Publishers and Journals
Over 5,000 journals, representing more than 60 publishers, are Guideline signatories. Springer Nature Group imprints and Elsevier comprise 66% of the list.
The TOP Factor list (as of 16 May 2020) includes 358 journals and 37 publishers.
- Not all of the publishers on the TOP Factor list are signatories.
- For publishers on both lists, the number of journals is not the same.
About 45% of TOP Factor journals are related to psychology, with APA as the leading publisher. Table 45-1 (in pdf) lists the publishers, most of which are well-known scholarly publishing houses. TOP Factor plans to have 1,000 journals by the end of 2020; journals and publishers were added even as I researched this article.
Subjects and Ratings
Buried in a policy statement, COS mentions that its areas of focus are psychology, education, and preclinical biomedical sciences. Journals are categorized in one or more of 34 disciplines, ranging from one journal in Statistics to more than 140 in Psychology. Psychology journals are also categorized in five more specific disciplines.
COS recognizes that scores are most meaningful when used to compare journals in similar fields. Since TOP Factor has no taxonomy, I checked the subject structures used by prominent bibliometric data users and providers. See Exhibit 45-3 (in pdf).
I then mapped TOP Factor disciplines to existing categories; see Table 45-2 (in pdf). With the new link to CA's Master Journal List, having matching subjects becomes more important.
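As an illustration of that kind of crosswalk, here is a minimal sketch; the category pairs below are hypothetical examples, not my actual mapping:

```python
# Hypothetical crosswalk from TOP Factor disciplines to an existing
# bibliometric subject scheme; the pairs below are illustrative only.
discipline_map = {
    "Psychology": "Psychology (miscellaneous)",
    "Education": "Education",
    "Statistics": "Statistics and Probability",
}

def map_discipline(top_discipline: str) -> str:
    """Return the mapped category, falling back to the original label."""
    return discipline_map.get(top_discipline, top_discipline)

print(map_discipline("Statistics"))  # Statistics and Probability
```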
The OA journal Meta-Psychology has the highest rating; see Table 45-3 (in pdf) for the TOP 10 journals. Nine are in psychology.
I compared the TOP Factor journals included in "Experimental Psychology" with JIF and CiteScore. Most are included in JCR and CiteScore, but fewer than 40% were categorized there as Experimental Psychology. See Table 45-4 (in pdf) for the TOP 10 from each list. Nature Human Behaviour, Psychological Science, and Behavior Research Methods are top ten on all lists.
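A minimal sketch of that kind of comparison follows, assuming hypothetical file names and column labels for lists exported from TOP Factor, JCR, and Scopus:

```python
import pandas as pd

# Hypothetical exports, each with a 'journal' column plus one metric column.
top = pd.read_csv("top_factor_exp_psych.csv")  # columns: journal, top_factor
jcr = pd.read_csv("jcr_exp_psych.csv")         # columns: journal, jif
cs = pd.read_csv("citescore_exp_psych.csv")    # columns: journal, citescore

# Journals that make the top ten on all three lists.
common = (set(top.nlargest(10, "top_factor")["journal"])
          & set(jcr.nlargest(10, "jif")["journal"])
          & set(cs.nlargest(10, "citescore")["journal"]))
print(common)
```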
PRESENTATION: TOP Factor Online Access
TOP Factor's website is basic. The TOP Factor list is available for downloading, but nothing on topfactor.org indicates that this is an option. Since the list is not dated, I could not tell when it had been updated. I would also like subjects to be listed with the journals.
According to COS Executive Director Brian Nosek, "the primary purpose of releasing the TOP Factor is to provide journals with information about their peers so they can improve their own policies" (Mellor, Feb 2020). This might explain the lack of concern over the functionality of the dataset for a wider public.
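For readers who do want to work with the data, here is a minimal sketch, assuming you have already downloaded the list as a CSV (the file name is hypothetical):

```python
from datetime import date

import pandas as pd

# Hypothetical local copy of the downloaded TOP Factor list.
df = pd.read_csv("top-factor.csv")

# The list itself is undated, so record the snapshot date yourself.
print(f"Snapshot loaded {date.today()}: {len(df)} journals rated")
```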
CONCLUSION
A journal can meet all of the Standards, yet that does not reflect the quality of the articles it publishes. "TOP Factor measures the extent to which journal policies facilitate the ability to evaluate the quality of the research. It is important to remember that research can be completely transparent and terrible at the same time" (Mellor, New Measures).
I realized I had to look beyond the structure of the database to see the value of a complementary measure of journal quality: a journal's commitment to open scholarship.
I originally ended the article with the following: “A better evaluation system for journals could combine both a rating of transparent, open and reproducible results with existing journal bibliometrics”. Little did I know that such an option was in the works.
UPDATE: CA will expand its metrics in June by extracting Open Access citation counts. Look for more in the second-quarter update.
¹ Although JCR is a subscription product, the JIF is available in the Master Journal List.
Thanks to David Mellor and Alex DeHaven for answering my questions.
REFERENCES
Chawla, D. S. (7 April 2020). Journal peer review policies attract further scrutiny: Free tools give greater clarity to authors. Nature Index Blog accessed at https://www.natureindex.com/news-blog/journal-peer-review-policies-attract-further-scrutiny
DeHaven, A. (2018). Guidelines for Transparency and Openness Promotion (TOP) in journal policies and practices: “The TOP Guidelines”, v 1.01. Center for Open Science Wiki, posted 01 Apr 2018 at https://osf.io/9f6gx/wiki/Guidelines/. I checked with Alex DeHaven in May 2020 and these are the most current guidelines.
Linney, S. (27 Apr 2020). The differences between rankings and ratings. QS blog post accessed at https://www.qs.com/differences-between-rankings-ratings/.
Mellor, D. (10 Feb 2020). New measure rates quality of research journals' policies to promote transparency and reproducibility. Center for Open Science news, accessed at https://cos.io/about/news/new-measure-rates-quality-research-journals-policies-promote-transparency-and-reproducibility/
Mellor, D. (5 May 2020). TOP Factor to appear in Master Journal List. Accessed at https://www.cos.io/about/news/cos-and-the-web-of-science-collaborate-to-bring-TOP-factor-to-master-journal-list
Mellor, D. (29 Aug 2018). The landscape of open data policies, with link to November update. Center for Open Science blog, accessed at https://www.cos.io/blog/the-landscape-of-open-data-policies
Mellor, D., Nosek, B. and Pfeiffer, N. (21 Jan 2020). Conflict between Open Access and Open Science: APC are a key part of the problem; preprints are a key part of the solution. Center for Open Science blog, https://cos.io/blog/conflict-between-open-access-and-open-science-apcs-are-key-part-problem-preprints-are-key-part-solution/ – presents an argument for separating publishing, through preprints, from evaluation, with peer review.
Mudditt, A. (18 Feb 2020) Reforming research assessment: A tough nut to crack. Scholarly Kitchen https://scholarlykitchen.sspnet.org/2020/02/18/reforming-research-assessment-a-tough-nut-to-crack/
Nosek, B. A., et al. (2015). Promoting an open research culture: Author guidelines for journals could help to promote transparency, openness, and reproducibility. Science, 348(6242): 1422-1425, accessed at https://science.sciencemag.org/content/348/6242/1422. DOI: 10.1126/science.aab2374
Pagell, R A. (2014). Insights into Incites, Online Searcher, Nov/Dec2014; 38(6): 16-19.
Solomons, J. (26 March 2020). The TOP Factor: A new journal metric to assess transparency and reproducibility policies. The Publication Plan, accessed at https://thepublicationplan.com/2020/03/26/the-TOP-factor-a-new-journal-metric-to-assess-transparency-and-reproducibility-policies/
Woolston, C. (18 Feb 2020). TOP Factor rates journals on transparency, openness: New tool seeks to change editorial practices. Nature Index blog, accessed at https://www.natureindex.com/news-blog/TOP-factor-rates-journals-on-transparency-openness
Ruth’s Rankings
A list of Ruth’s Rankings and News Updates is here.
*Ruth A. Pagell is emeritus faculty librarian at Emory University. After working at Emory she was the founding librarian of the Li Ka Shing Library at Singapore Management University and then adjunct faculty [teaching] in the Library and Information Science Program at the University of Hawaii. She has written and spoken extensively on various aspects of librarianship, including contributing articles to ACCESS – https://orcid.org/0000-0003-3238-9674