By Ruth A. Pagell*
(18 September 2018) During decades of working with faculty at competitive universities, I was always concerned by “publish or perish”, especially when junior faculty were pressured to publish in “A” journals. Now, thanks to the proliferation of possibly predatory publishers, anyone can publish anything in a publication that calls itself a peer-reviewed journal. What role can journal metrics play in sustaining the quality of scholarly research?
We introduced journal citation metrics from JCR (Journal Citation Reports) in Ruth’s Rankings 4, with more details presented in Online Searcher (Pagell, 2014). The January 2016 Ruth’s Rankings 16, The much maligned Journal Impact Factor, provides a detailed account of journal metrics for the subscription-based JCR and Scopus and for three free sources: SNIP (Source Normalized Impact per Paper), SJR (SCImago Journal Rank) and Google Scholar. This article introduces two new sources of journal quality bibliometrics and revisits these metrics.
Since 2014, scholarly conversations about the faults of the journal impact factor have widened to include the threat from predatory publishers and journals. The latter, and the answer to our question, will be covered in Part Two of this article.
JOURNAL METRICS – FREE SOURCES
New free sources include Elsevier’s CiteScore and Digital Science’s Dimensions.
CiteScore
Elsevier introduced CiteScore at the end of 2016 as a free competitor to JCR; at launch it covered a limited number of publishers. Today’s CiteScore lists over 25,000 active publications.
Elsevier provides useful supporting documents for CiteScore. The Scopus content page and the Scopus Content Coverage Guide list the sources that are used in CiteScore and explain the different data types. Elsevier also released a new Research Metrics Guidebook in August 2018.
CiteScore metrics include:
- CiteScore is the average number of citations received in a calendar year by all items published in that journal in the preceding three years (a worked sketch follows below).
- CiteScore Percentile indicates the relative standing of a journal in its subject field.
- CiteScore Rank and Rank Out Of indicate the absolute standing of a journal in its field; for example, a journal might rank 2 out of 157 titles in its field, placing it in the 99th percentile.
- Documents is the denominator of the CiteScore calculation.
- Citation Count is the numerator of the CiteScore calculation.
- CiteScore Tracker forecasts a source’s performance for the upcoming year. CiteScore itself is released annually; Tracker values for individual sources are updated monthly and can be viewed by clicking on their titles.
CiteScore includes SNIP and SJR rankings for comparison.
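To make the CiteScore arithmetic concrete, here is a minimal sketch in Python; the citation and document counts are invented for illustration and do not describe any real journal.

```python
# CiteScore 2017 (illustrative): citations received in 2017 to items
# published in 2014-2016, divided by the number of items published in
# 2014-2016. All figures below are hypothetical.

citations_2017 = 6200       # citations in 2017 to 2014-2016 items (numerator)
documents_2014_2016 = 1550  # items published in 2014-2016 (denominator)

citescore = citations_2017 / documents_2014_2016
print(f"CiteScore: {citescore:.2f}")  # -> CiteScore: 4.00
```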
Dimensions
Digital Science released Dimensions in January 2018. Dimensions has many layers; we look at this new product only in relationship to journal metrics. Dimensions creates its own dataset, with over 97,000 publications as of September 2018. It continuously harvests documents at the article level from various sources, including Crossref, PubMed, Europe PubMed Central and bioRxiv, and it has publications not found in Elsevier or Web of Science products. Dimensions created its own impact measurements, uses a variety of international journal lists and adopted an Australian/New Zealand subject structure, as explained in its guide (Bode et al., 2018). Citation metrics applied at a publication and source level are:
· Relative Citation Ratio (RCR) indicates the relative citation performance of a journal compared to the citation rate in its area of research, where the area of research is defined by the articles/journals cited alongside it. The RCR is normalized so that the average is 1.0; a value above 1.0 shows an above-average citation rate.
· Field Citation Ratio (FCR) indicates the relative citation performance of a journal compared to similarly aged articles in its field, where the comparison set is defined by subject code, publishing year and age. A value above 1.0 indicates a higher-than-average citation rate. The FCR is calculated for articles published in 2000 and later. (A sketch of this kind of ratio follows below.)
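Both ratios follow the same normalization logic. The sketch below illustrates it with invented numbers; the baseline value and citation counts are assumptions for illustration, not Dimensions data, and the real RCR/FCR computations define their comparison sets more elaborately.

```python
# Field-normalized citation ratio (illustrative): the mean citations of a
# set of articles divided by the mean citations of comparable articles
# (same field, same publication year). All numbers are invented.

journal_citations = [12, 3, 25, 8, 0, 17]  # citations per article (hypothetical)
field_baseline_mean = 9.5                  # field/year average (hypothetical)

journal_mean = sum(journal_citations) / len(journal_citations)
ratio = journal_mean / field_baseline_mean

print(f"ratio: {ratio:.2f}")  # > 1.0 means performance above the field average
```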
Other journal metrics include the total number of publications and Altmetric Mentions and Attention Score, a weighted count of all the online attention a publication receives. More detail on the Attention Score is available here.
Publications, FCR and RCR are available for journals at the field level; the default sort for analysis is by number of publications. See Example 37.1 for the metrics of a journal with multiple subject codes compared to the overall metrics for the subject. Table 37.1 shows Dimensions’ top 10 sources by publications, RCR and FCR. The top ten lists for the three metrics yield 28 different sources; note the difference between the rankings by number of publications and those by the citation metrics.
Example 37.1: Journal citation metrics compared to its fields (PDF: https://librarylearningspace.com/wp-content/uploads/2018/09/Example-37.1-Journal-citation-metrics-compared-to-its-fields.pdf)
Table 37.1: Dimensions Top 10 Metrics (PDF: https://librarylearningspace.com/wp-content/uploads/2018/09/Table-37.1-Dimensions-Top-10-Metrics.pdf)
Access to the database is free; registration is required to download data or to link to an ORCID account. Additional metrics for patents, grants, clinical trials and policy documents are available for a fee.
SNIP and SJR
SNIP and SJR are free journal metrics from CWTS (producer of the Leiden Ranking) and SCImago (producer of the SCImago Institutions Rankings). Both use Scopus data, and their scores are displayed in CiteScore; some of the scores also appear in Dimensions. Google Scholar Metrics, covered below, uses the publications in Google Scholar.
SNIP (Source Normalized Impact per Paper) is calculated from 1999; the 2017 list has over 24,000 publications. The methodology includes four metrics: P, the number of publications for a source over the past three years, with a minimum of 50; IPP, similar to the Journal Impact Factor but using a three-year window; SNIP, which normalizes IPP by correcting for differences among citation fields (a rough sketch follows); and percent of self-citations. It covers articles, conference papers and reviews. Search by title, ISSN or publisher and limit by main subject area and sub-area; only 1,000 sources are displayed at a time, but the entire dataset of almost 350,000 records can be downloaded.
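The sketch below shows the normalization idea with invented numbers; the citation-potential value is an assumption for illustration, and the actual CWTS computation of a field’s citation potential is considerably more involved.

```python
# SNIP idea (illustrative): raw impact per publication (IPP) divided by the
# citation potential of the journal's field, so journals in low-citation
# fields are not penalized. All values are hypothetical.

citations_to_recent_items = 900  # citations in the target year (hypothetical)
recent_items = 300               # items from the prior three years (hypothetical)
ipp = citations_to_recent_items / recent_items  # impact per publication

field_citation_potential = 1.5   # hypothetical field citation potential
snip = ipp / field_citation_potential
print(f"IPP: {ipp:.2f}  SNIP: {snip:.2f}")  # -> IPP: 3.00  SNIP: 2.00
```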
SJR (SCImago Journal Rank) also has data from 1999 and covers 34,000 titles; search by title, ISSN or publisher. In addition to SJR, it has eight other sortable metrics: H Index (for four years); Total Documents for the ranking year; Total Documents for the previous three years; Total References (current year); Total Cites (3 years); Citable Documents (3 years); Cites per Document (2 years); and References per Document (current year). You can limit by subject area and subject category, by country of publication, and by WOS journals or open access journals. SJR’s top journal from the Asiatic region is Fungal Diversity. Playing with SJR’s visual displays, I learned that in 2017, 17.6% of the Asiatic region’s documents were open access, compared to 12.2% for North America.
Google Scholar Metrics ranks journals based on their citations in Google Scholar, using the h5-index (the h-index for articles published in the last five complete years) for the scoring, together with the h5-median. It lists the top 100 journals overall, the top 20 in each of eight fields and the top 20 in almost 300 subcategories. Metrics are gathered by a crawler, and Scholar focuses on articles rather than journals. Inclusion criteria for Google Scholar Metrics are listed at https://scholar.google.com/intl/en/scholar/inclusion.html.
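To illustrate, the h5-index is the largest number h such that h articles published in the last five years each have at least h citations. A minimal sketch, with invented citation counts:

```python
# h-index: the largest h such that h articles each have at least h citations.
# Applied to the last five complete years, this is Google Scholar's h5-index.

def h_index(citations):
    """Return the h-index for a list of per-article citation counts."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

five_year_citations = [48, 36, 30, 22, 15, 11, 9, 4, 2]  # hypothetical
print(h_index(five_year_citations))  # -> 7 (seven articles with >= 7 citations)
```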
JCR – SUBSCRIPTION SOURCE
JCR, first released in 1975, has the fewest journals but the largest number of metrics and user options; the 2018 version includes a new journal profile page with more visuals and data, for example the top five citable items per publication. Its Journal Impact Factor (JIF) was the first attempt to measure journal quality and remains at the forefront of quality metrics. JCR also incorporates the Eigenfactor, which is available freely on the web. A journal’s Eigenfactor score measures the journal’s total importance to the scientific community: journals are “influential” if they are cited often by other influential journals (a toy sketch of this idea follows). Users can also search by subject category and country of journal origin; other sources let you select a subject but do not rank subjects by output or impact. Click here for basic JCR information.
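The “cited by influential journals” idea is an eigenvector computation on the citation network, in the spirit of PageRank. Below is a toy sketch with an invented three-journal citation matrix; the real Eigenfactor algorithm additionally excludes journal self-citations, uses a five-year window and applies damping, so this shows only the core intuition.

```python
# Toy Eigenfactor-style influence scores via power iteration.
# cites[i][j] = citations from journal j to journal i (invented numbers).
cites = [[0, 4, 2],
         [3, 0, 6],
         [1, 2, 0]]

n = len(cites)
# Column-normalize so each citing journal distributes one unit of influence.
col_sums = [sum(cites[i][j] for i in range(n)) for j in range(n)]
P = [[cites[i][j] / col_sums[j] for j in range(n)] for i in range(n)]

scores = [1.0 / n] * n
for _ in range(100):  # power iteration converges to the leading eigenvector
    scores = [sum(P[i][j] * scores[j] for j in range(n)) for i in range(n)]

print([round(s, 3) for s in scores])  # influence share of each journal
```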
Some of this information, such as individual journal impact factors, collaborating countries and organizations, and open access publications, is available in Web of Science. WOS now also links to free full-text versions of articles in repositories and other sources such as ResearchGate. The complete 2018 journal list is available.
CONCLUSION
Table 37.2 (Excel version here) compares top sources across six metrics and four different datasets: Scopus data is used by CiteScore, SNIP and SJR; Clarivate Analytics data for JCR’s JIF; Google Scholar data for Google Metrics; and the Dimensions dataset for its FCR. Only CA: A Cancer Journal for Clinicians ranked first in three of the four data sources, and there are only three top ten sources common to the rankings that use Scopus data. Check Table 37.2 for more comparisons. The data show certain trends, such as the predominance of medical and biomedical sources and of Nature as a publisher; it is therefore important to limit by subject to view other fields.
Table 37.3 highlights the characteristics of each of the sources. It is important to remember that calculation methodology, inclusion criteria and dates vary; the resulting impact scores and rankings are relative to the other publications in each database.
This article focuses on bibliometrics. Scientometricians are also researching alternative measures, such as appearances in social media, views or downloads, and access in open sources such as Mendeley and ResearchGate.
Does being indexed in any one of these sources mean that a journal is not possibly predatory? Find out in the upcoming Article 37 Part Two.
RESOURCES
Beatty, S. (24 Oct 2016). Is a title indexed in Scopus? A reminder to check before you publish. Scopus Blog. Accessed at https://blog.scopus.com/posts/is-a-title-indexed-in-scopus-a-reminder-to-check-before-you-publish.
Bode, C. et al. (Jan 2018). Dimensions Report: A guide to the Dimensions Data Approach. Digital Science. Accessed at https://figshare.com/articles/A_Guide_to_the_Dimensions_Data_Approach/5783094.
Bouchierie, S. and McCullough (31 May 2018). CiteScore metrics updated with 2017 annual values. Elsevier Connect. Accessed at https://www.elsevier.com/connect/citescore-metrics-updated-with-2017-annual-values.
Hook, D., Porter, S.F. and Herzog, C. (23 August 2018). Dimensions: Building context for search and evaluation. Frontiers in Research Metrics and Analytics. Accessed at https://www.frontiersin.org/articles/10.3389/frma.2018.00023.
Pagell, R.A. (Nov/Dec 2014). Insights into InCites: Journal Citation Reports and Essential Science Indicators. Online Searcher, 38(6), 16-19.
Schonfeld, R.C. (15 Jan 2018). A new citation database launches today: Digital Science’s Dimensions. Scholarly Kitchen. Accessed at https://scholarlykitchen.sspnet.org/2018/01/15/new-citation-database-dimensions.
Thanks to Christian Herzog for giving me access to the full Dimensions and for his time explaining the database.
Ruth’s Rankings
A list of Ruth’s Rankings and News Updates is here.
*Ruth A. Pagell is emeritus faculty librarian at Emory University. After working at Emory, she was the founding librarian of the Li Ka Shing Library at Singapore Management University and then taught as adjunct faculty in the Library and Information Science Program at the University of Hawaii. She has written and spoken extensively on various aspects of librarianship, including contributing articles to ACCESS – https://orcid.org/0000-0003-3238-9674