By Ruth A. Pagell1
(7 May 2015) There is a famous English poem, Samuel Taylor Coleridge's The Rime of the Ancient Mariner, with an even more famous line: 'Water, water, every where, nor any drop to drink'. Similarly, we are drowning in rankings: there are rankings, rankings everywhere, nor any one that fits.
In this series we have looked at eight rankings providers, which cover from 400 to almost 24,000 institutions. All share some common metrics and have many of their own, but each one includes at least one metric based on data from Web of Science or Scopus (see Ruth's Rankings 4). All rankings have their limitations:
- Limited number of institutions
- Fixed time periods
- Annual updates
- A composite rank based on ranks without scores, or on scores without underlying data
- A composite rank based on pre-set weighted averages
- Individual scores by indicator but no composite rank
- Inclusion by number of articles from fixed sets of publications
- Fixed regional comparisons
- Fixed and limited subject categories
- No downloading capability
- No customization
- Emphasis on science
- English language limitations
What do you do when none of these rankings suits your needs? If money is not a concern, you can use InCites, a benchmarking and analysis module from Thomson Reuters, which we will look at in detail in this article. Scopus does not offer a similar subscription service for its data.
We will re-examine the rankings that we have discussed in earlier Ruth’s Rankings articles to see if they allow for data manipulation.
DATA MANIPULATION FROM GLOBAL RANKING SOURCES
Table 10.1, Customization Characteristics of Rankings, briefly compares the eight rankings sources plus Thomson Reuters InCites on number of institutions, basis for inclusion and customization characteristics. Most provide scores; Webometrics provides only ranks, which means you cannot see how far apart two ranked institutions really are. Nature provides output data. Leiden goes furthest, providing downloadable access to its complete dataset. At the other extreme, Webometrics states in its methodology that it will "only publish a unique Ranking of Universities…The combination of indicators is the result of a careful investigation and it is not open to individual choosing by users…" For more information about these rankings, check out their websites or the earlier Ruth's Rankings articles listed in Table 10.1.
Leiden Rankings
Anyone can access Leiden's entire dataset as an Excel file. The file includes all Leiden universities and all seven subject fields, with the underlying data for total publications, total citations, normalized citations and top-ten-percent publications, for both raw and fractionalized counts. The entire dataset is 35 columns by roughly 12,000 rows, since data are given for each institution, for each field, by total and fractionalized count. Leiden does not impose an overall rank or weightings. Using the master dataset, you can design your own rankings based on what is important to your institution.
Example 10.1 is a simplified illustration, answering my question: which are India's best universities? Using data for the sixteen Indian universities in the Leiden rankings, I selected three size-dependent indicators of quantity. I assigned 50% to total citation count, 30% to total publications and 20% to international collaborations to calculate my own ranking, with the Indian Institute of Science ranking number one. Another possibility would have been to use any field that Leiden provides, such as "Mathematics, computer science and engineering", where the Indian Institute of Technology (Kharagpur) ranks number one in output.
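To make the arithmetic concrete, here is a minimal sketch in Python of the weighted ranking described above, assuming you have downloaded the Leiden dataset as an Excel file. The file name, the column names ("University", "Country", "P" for publications, "TCS" for total citation score, "P_int_collab" for international collaborations) and the normalization step are my own illustrative assumptions, not Leiden's actual spreadsheet layout; check the real headers before running anything like this.

```python
import pandas as pd

# Load the full Leiden dataset (downloaded manually as an Excel file).
# "leiden_ranking.xlsx" and all column names below are hypothetical placeholders.
df = pd.read_excel("leiden_ranking.xlsx")

# Keep only the Indian universities (sixteen of them in this edition).
india = df[df["Country"] == "India"].copy()

# Normalize each size-dependent indicator to 0-1 so the weights are comparable.
for col in ["TCS", "P", "P_int_collab"]:
    india[col + "_norm"] = india[col] / india[col].max()

# The weights from the article: 50% total citations, 30% total publications,
# 20% international collaborations.
india["score"] = (0.5 * india["TCS_norm"]
                  + 0.3 * india["P_norm"]
                  + 0.2 * india["P_int_collab_norm"])

# With these weights the article's result puts the Indian Institute of Science first.
print(india.sort_values("score", ascending=False)[["University", "score"]].head(10))
```

Changing the three weights, or swapping in any of Leiden's seven subject fields, produces a different custom ranking from the same spreadsheet.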
What if I were only interested in Math (not computer science or engineering) or my own category, university research rankings? See Example 10.1A Indian Rankings for additional Indian data from InCites.
Many rankings use Web of Science publications as the underlying data for their metrics. If your organization is not covered in the free rankings, or you are interested in other subjects or weightings, you should consider InCites Benchmarking and Analytics to see how your institution compares to the more than 5,000 other institutions in the dataset. It is not an inexpensive product and it requires a subscription, but it overcomes many of the limitations listed above. See Table 10.2 for basic InCites characteristics and the attached InCites Indicators Handbook for detailed explanations of the scope and the metrics.
InCites expands the datasets used by the other rankings by including all document types and all indexes in the Web of Science Core Collection:
- Citation indexes from 1980 (the basic set): Science Citation Index Expanded (SCIE); Social Sciences Citation Index (SSCI); Arts & Humanities Citation Index (AHCI)
- Conference proceedings citation indexes from 1990: Science; Social Science & Humanities
- Book citation indexes from 2005: Science; Social Sciences & Humanities
I wrote a detailed article on “InCites Benchmarking and Analytics” for the January/February 2015 issue of Online Searcher, which is linked to this article. Because of all of the permutations, InCites itself can seem very complex when you begin using it. I will focus on its capabilities for customized rankings and not on its navigation and jargon.
InCites Benchmarking and Analytics is a goldmine for people who like data. It provides pre-formatted system reports and analytic tools. The pre-formatted reports, categorized as Research Performance and Collaborations, are at the institutional level, with visual displays accompanying the underlying data for those who are less numerically inclined. Each institutional profile includes a complete set of bibliometric indicators and supplemental institutional demographics, such as faculty and student ratios and reputation rankings, where available. Figure 10.1 is a replica of the pre-formatted pie chart of the research areas for Mahidol University, Thailand.
With the analytic tools, you may create your own tables and limiters, choosing among five different modules and then save your own visual representations on a dashboard:
- Organizations (over 5,000)
- Regions (230)
- Research areas (22 broad fields and 251 narrower areas, with a newly added capability to create your own specific research topics)
- People (over 28 million, which unfortunately has name disambiguation issues)
- Publications (over 150,000)
At the end of April 2015, InCites added a new capability: subscribers to both WOS and InCites can run a search in WOS and import the results into InCites. For example, create your own topic, narrow a location to a city or state, analyze books, or select your own basket of publications.
Tables 1 to 3 in the Online Searcher article show indicators by module, customization options and downloadable indicators. Tables 10.3 (indicators by module) and 10.4 (downloadable indicators) are updates of two of those tables. They show almost all of the available combinations: start in any module and limit by the others. Note that InCites provides both size-dependent metrics (total documents, citations, etc.) and size-independent metrics (percentages).
Important limiters are number of documents and number of citations. Collaborating institutions with only a few documents in a category can distort the rankings. For example, in my personal search on university research rankings, California State College (Dominguez Hills) appeared as number one in percent of documents cited; it had one document, which was cited once.
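The Dominguez Hills case is easy to reproduce in miniature: one document cited once yields 100% of documents cited. Here is a short sketch of applying a minimum-document limiter before ranking; only the Dominguez Hills figures come from the article, and the other institutions and numbers are invented for illustration.

```python
import pandas as pd

# Toy data: Dominguez Hills' one-document, one-citation record is from the
# article; "University A" and "University B" are invented comparators.
data = pd.DataFrame({
    "Institution": ["California State College (Dominguez Hills)",
                    "University A", "University B"],
    "docs":        [1, 220, 180],
    "docs_cited":  [1, 170, 150],
})
data["pct_cited"] = 100 * data["docs_cited"] / data["docs"]

# Without a limiter, the one-document institution tops the list at 100%.
print(data.sort_values("pct_cited", ascending=False))

# With a minimum-document limiter (50 documents, as in the Asian Studies
# example later in this article), small samples drop out and the
# percent-cited ranking becomes meaningful.
MIN_DOCS = 50
print(data[data["docs"] >= MIN_DOCS].sort_values("pct_cited", ascending=False))
```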
With so many combinations, let’s simplify InCites with examples. More information about each example is attached.
Example 10.1A: Indian universities
Example 10.2: Benchmarking Thai and Malaysian universities
Example 10.3: Engineering
Example 10.4: Creating your own dataset
Example 10.1A: Compares InCites and Leiden rankings for Indian universities using the same weightings, and shows who is number one in Mathematics.
Example 10.2: Identify the top two universities in Malaysia and Thailand; select other Asian universities for comparison; include non-bibliometric indicators such as reputation; use InCites to rank these universities; compare InCites ranking to NTU-Taiwan world rank. See attached Example 10.2 for detailed steps, a screen shot of “benchmarking” and underlying data.
Which story would you like to tell?
Universiti Malaya is 470th in NTU, with no subject rankings. Based on number of publications, it is 382nd in the world in InCites. Continuing to drill down to specific subject areas, we discover that it has 52 highly cited papers over the past ten years in the research category "Energy and Fuels". That places it number 15 in the world!
Among eight Asian countries, Thailand is eighth in WOS publications for the Essential Science Indicators category Social Sciences. Yet it is first in this Asian group in percent of documents cited, percent of documents in the top ten percent, and impact relative to the world.
There is no specific category for Arts and Humanities, so limit a WOS search to only the Arts & Humanities Citation Index from 2004 onward for Thailand, Malaysia and India and import the results into InCites B&A. The top subject category is "Asian Studies." Looking further at times-cited data for Asian Studies, the highest percent of documents cited for any country was 27%, applying limits of 50 documents and 10 citations over a ten-year period. Using the same limits for the research area Chemical Engineering, the highest percent of documents cited per country was 85%.
Example 10.3: Engineering
Asian universities dominate the top rankings for engineering by publication and citation output. Change the metrics to size-independent and quality measures, and the rankings change. Example 10.3 discusses the results when the metrics are varied and includes a bubble graph of the universities with the highest reputations in engineering.
Example 10.4: Using the ultimate micro-ranking tool for university research rankings
The earlier rankings all have parameters determined in advance by the ranking agency. With this tool, we can now search on any topic of our choosing.
I did a topic search in WOS for articles about university research rankings and imported the articles into InCites. I then used all of the InCites modules and indicators to analyze my results. Figure 10.4 (in Example 10.4) is a bar graph of the top research areas assigned to my articles.
CONCLUSION
No matter which weightings and metrics we use, we see a consistent set of institutions appearing in the top lists. See Tables 10.5 and 10.6 for comparisons of selected world and Asian rankings.
For those who can afford it, InCites is a valuable tool for policy makers, administrators and faculty at the majority of academic research institutions not covered by THE, QS, ARWU, NTU or Leiden, or for whom only scores are available in SIR. With the ability to import searches from WOS into InCites, individual researchers can now create customized rankings of their peers, and departments and research units can compare like institutions with similarly narrow focuses.
I went off to college, many years ago, thinking I wanted to be a math major. After one semester, I knew it was not for me. But I envy the scientometricians their ability to work with the combinations provided in InCites to provide rankings and analysis on just about everything academic!
The best way to learn how InCites works is to watch the YouTube tutorials. Two examples are listed below:
Organizations: https://www.youtube.com/watch?v=YMbqV5mdW7U&list=PLM1kuGdwRdGkEZ_bBSsQ0_18oNJh2GcrP&index=4
Identifying peer organizations: https://www.youtube.com/watch?v=orHx5NNF9_o&list=PLM1kuGdwRdGkEZ_bBSsQ0_18oNJh2GcrP&index=6
Pagell, Ruth A. (2015) InCites Benchmarking and Analytics. Online Searcher 39(1) pp. 16-21 (attached with copyright permission; this article cannot be redistributed).
Pagell, Ruth A. (2014) Insights Into InCites: Journal Citation Reports and Essential Science Indicators. Online Searcher 38(6) pp. 16-19 (attached with copyright permission; this article cannot be redistributed).
From Thomson Reuters:
InCites 2.1 Indicators Handbook (2014) (attached)
* [Thomson Reuters sold its Intellectual Property and Science division, including InCites, to Clarivate Analytics in 2016. - ed.]
Ruth’s Rankings
A list of Ruth’s Rankings and News Updates is here.
1Ruth A. Pagell is currently an adjunct faculty [teaching] in the Library and Information Science Program at the University of Hawaii. Before joining UH, she was the founding librarian of the Li Ka Shing Library at Singapore Management University. She has written and spoken extensively on various aspects of librarianship, including contributing articles to ACCESS. https://orcid.org/0000-0003-3238-9674