By Ruth A. Pagell1
(13 October 2014) Ruth’s Rankings 3 introduced the metrics used in rankings. In this article, we look at the information companies that provide the underlying data.
As of October 2014, UlrichsWeb Global Serials Directory lists over 145,000 scholarly academic journals, of which about 50,000 are not in English. Scopus estimates that there are 200,000 scientific serial publications in existence worldwide. A policy maker, funder, provost, research unit or department head trying to evaluate research output cannot trawl through all of these publications for analysis and rankings. While individual researchers may love Google Scholar for the number of citations it generates, and while Google is trying to get into the scholarly analytics business, it lacks the authority and controls necessary for consistent, meaningful counts. Therefore, the global ranking bodies turn to two major information providers for their data: Thomson Reuters’ Intellectual Property & Science division and Elsevier’s Scopus.
This article describes them separately. It then compares them on the key factors that lead to differences in the rankings when using one or the other as the publication base. Both companies are part of multi-billion dollar multinational corporations with corporate histories dating back to the 19th century.
THOMSON REUTERS
Thomson Reuters is an information conglomerate formed in 2008 by the acquisition of the British news icon Reuters by the Canadian information company Thomson. We are interested in its Intellectual Property and Science business, which grew out of the original Institute for Scientific Information (ISI). The division’s core business is aggregating and analyzing scholarly publications.
Web of Science (WOS) is the source for T-R metrics. The WOS package includes different modules covering different inclusion dates; a subscriber chooses the indexes and the number of years to purchase. WOS’s analytical add-ons include Journal Citation Reports (JCR) and Essential Science Indicators (ESI). The InCites platform incorporates JCR and ESI (Pagell (2014), “Insights into InCites,” forthcoming in the Nov/Dec Online Searcher). If you subscribe to WOS for your internal evaluations and benchmarking, your results may not match the data used for rankings: the rankings and the analytics modules draw on different subsets of the WOS records.
Table 1: Example of WOS subscription choices
In addition to the Core Collection, WOS is adding separate national scientific indexes to its platform, with different levels of integration with the Core Collection. SciELO “covers 650 open access journals from Latin America, Spain, Portugal, the Caribbean and South Africa and is free to all subscribers.”
The Chinese Science Citation Database (CSCD) is a separate subscription; 12-month extended-preview trials are available to all customers outside China. The journals covered by CSCD are a mix of paid-subscription and open access titles. Citation counts from these two databases appear in the WOS (see Figure 1).
All WOS customers will have access to the KCI (Korea Citation Index) by the end of October 2014. This coverage is also a mix of subscription and open access journals.
According to Michael Takats, Director of Product Strategy, “the Russian Science Citation Index will be added to the platform in 2015. Pricing and distribution have not been finalized. We expect this will also be automatically added to all WOS accounts in certain regions. It also covers both types of journals – subscription and open access.” (email, 9 Oct 2014).
Figure 1: Example of an Integrated Citation Count
Each WOS record includes the aggregate citation count for that article across all databases that are part of the Web of Science platform, but only the count from the Core Collection is included in citation analysis.
SCOPUS
The Web of Science Citation Indexes were the only game in town for over 40 years. Elsevier’s Scopus, which entered the field in 2004, is a serious commercial competitor. Elsevier is a scholarly publishing company, part of Anglo-Dutch Reed Elsevier, headquartered in both the Netherlands and the UK. The two parent companies, Reed and Elsevier, are major 19th-century publishing houses that merged in 1993.
Scopus comes as an inclusive package covering four subject areas: life sciences, health sciences (including 100% of Medline coverage), physical sciences, and social sciences and humanities. Although Scopus launched in 2004, full coverage starts with 1996, and Scopus continues to backfill. It includes not only scientific papers but also books and conference proceedings. It has built-in author and organization analytics and a limited module for comparing up to ten journals. Figure 2 is an example of an affiliation analytic (downloaded September 2014).
Figure 2. Scopus Affiliation Analytic
JOURNAL SELECTION POLICIES
Differences in journal selection policies drive differences in the research component of the rankings.
Thomson Reuters Journal Selection Process
Thomson Reuters has an internal editorial board composed of subject specialists who are familiar with the existing content and policies. Basic evaluation standards include:
- Timeliness – publishing according to announced schedule
- International editorial conventions – including full bibliographic information
- Full text in English, or at least bibliographic information in English. A&HCI and SSCI have exceptions to the English policy. All journals must have citations in the Roman alphabet
- Peer review
- Editorial content – for new journals, does it add to the science knowledge base?
- International diversity – among the journal’s contributing authors, editors, and editorial advisory board members. However, a small number of regional journals are also included
- Citation analysis – every citation, from every article, whether or not the cited source is in Web of Science
WOS currently has over 12,000 journals. According to Jim Testa (2012), VP Emeritus Editorial Development & Publisher Relations, in a discussion of Thomson Reuters journal selection policy, “comprehensive does not necessarily mean all-inclusive.” WOS accepts about 10-12% of the approximately 2,000 journals evaluated annually.
Scopus Journal Selection Policy
Scopus uses an independent review board, the Content Selection and Advisory Board (CSAB), to recommend new journals for inclusion. The content overview website lists the chairs for the represented disciplines. Scopus’ minimum criteria for consideration of new journals include:
- Peer-reviewed content and a publicly available description of the peer review process
- Publication on a regular basis and an International Standard Serial Number (ISSN) registered with the ISSN International Centre
- Content that is relevant for and readable by an international audience, meaning: have references in Roman script and have English language abstracts and titles
- A publicly available statement of publication ethics and publication malpractice policy
Table 2. Scopus Selection Criteria
Scopus has 21,000 active titles, of which 20,000 are peer-reviewed journals; it also covers books and conference papers, each with separate selection policies. Currently the database includes around 6.5 million conference papers from some 78,000 conference events. Conference selection is based on the relevancy and quality of the conference in relation to the subject field. Over 10% of the database consists of conference proceedings, and in Computing and Information Science over 60% of the records are proceedings.
For more detailed information, consult the Scopus Content Coverage Guide.
Scopus accepts a higher percentage of the publications submitted for evaluation, which means it continues to add titles at a faster rate than WOS.
To increase the number of social science and arts & humanities titles, Scopus adds the WOS journal lists for those fields, as well as European journal indexes, to its own list. As a result, the Scopus database, while newer than the Web of Science, includes more titles per region and per subject. Table 3 compares coverage based on the number of titles.
Table 3. Top 10 Countries Based on Journal Titles, Plus Top Asian Countries. Compiled from Scopus and Thomson Reuters journal lists, searched September 2014.
JOURNAL CITATION ANALYSIS
Publications are the underlying metric for all rankings, but not all publications are created equal: raters want “high impact” journals. Journal Citation Reports (JCR) has been calculating journal impact as an add-on to WOS for over 40 years. External research organizations rework Scopus data to create their own journal impact measurements, which are then incorporated back into Scopus.
JCR measures “journal impact” based on citations per article. The citation count includes all items cited in JCR journals, even those not indexed in WOS. The process has been refined over time, and the number of journals and metrics has grown, especially in the last decade. More details about JCR and its new platform will be available in the November/December Online Searcher. The Scopus citation count is based on all items cited by Scopus journals in Scopus journals. Scopus has a built-in add-on that compares up to 10 journals using metrics such as SJR (SCImago Journal Rank) and SNIP (Source Normalized Impact per Paper), each reported by year. SCImago calculates SJR, applying a JCR-style citations-per-article methodology for its impact score; CWTS (Leiden University’s Centre for Science and Technology Studies) calculates SNIP. Both lists are available free online. We will examine these two ranking sources in depth in a 2015 column.
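For readers who want the mechanics, the headline JCR metric is the two-year journal impact factor. The article does not spell out the calculation, so the following is the standard definition rather than anything specific to either vendor:

\[
\mathrm{JIF}_{y} = \frac{C_{y}(y-1) + C_{y}(y-2)}{P_{y-1} + P_{y-2}}
\]

where \(C_{y}(t)\) is the number of citations received in year \(y\) by items the journal published in year \(t\), and \(P_{t}\) is the number of citable items (articles and reviews) the journal published in year \(t\). For example, a journal whose 200 citable items from 2012 and 2013 drew 500 citations during 2014 would have a 2014 impact factor of 500 / 200 = 2.5.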
I did a non-scientific comparison of journals in the Information Science/Library Science category in JCR and SJR. JCR has 83 titles and SJR has 205; only three titles appeared in the top ten of both lists, and 12 journals categorized within Information Science/Library Science in JCR appeared in SJR’s computer science category. As noted in Ruth’s Rankings 3, the impact of an individual journal depends on the other journals in the field, the metrics used and the number of years covered.
Table 4. Top 5 Library and Information Science Journals
If these titles are not part of your professional reading list, you are not alone. According to Nisonger and Davis in “The Perception of Library and Information Science Journals by LIS Education Deans and ARL Library Directors…” (College & Research Libraries, July 2005, 66:341-377), the correlation between deans’ rankings and JCR is moderate to weak, and between library directors’ rankings and JCR weak to non-existent.
CONCLUSION: Thomson Reuters continues to focus on what it considers the top-tier journals based on citation count. An analysis mentioned in the Testa paper (2012) reports that 50% of all citations generated by the 2008 JCR collection came from only 300 of its journals; those same 300 journals produced 30% of all articles published by the total collection. Testa (2010) writes about the globalization of the Web of Science. The strategy appears to be to preserve top-tier quality by adding national scientific citation databases to the platform without incorporating their non-English journals into WOS, so those records are not included in the ranking analytics.
Scopus has more journals from all parts of the world, using similar selection criteria but different metrics for “quality”.
Personally, if I were evaluating scholarly output for an established Western university or policy body, I would prefer quality to quantity. If I were evaluating scholarly output for a rising Asian university or policy body, I would prefer the metrics with a greater representation of my publishers and authors.
The next article will compare two of the most popular commercial rankings, the Times Higher Education (THE) World University Rankings and the QS World University Rankings.
*[Thomson Reuters sold its Intellectual Property and Science division, including its bibliometric products, to Clarivate Analytics in 2016. – ed.]
References
Pagell, R. A. 2014. “Bibliometrics and University Research Rankings Demystified for Librarians.” In Chen, C. and Larsen, R. (eds.), Library and Information Science: Trends and Research. Springer. (Open access.)
Ruth’s Rankings
A list of Ruth’s Rankings and News Updates is here.
1Ruth A. Pagell is currently teaching in the Library and Information Science Program at the University of Hawaii. Before joining UH, she was the founding librarian of the Li Ka Shing Library at Singapore Management University. She has written and spoken extensively on various aspects of librarianship, including contributing articles to ACCESS. https://orcid.org/0000-0003-3238-9674