Ruth’s Rankings 15: Analyzing 2015-2016 Updated Rankings and Introducing New Metrics

By Ruth A. Pagell*

(26 October 2015) Mid-August to mid-October is a busy time for those of us trying to keep up with the updates in global rankings. Five of our primary sources for composite rankings posted new rankings during this time period. Waiting until mid-year gives the rankers time to incorporate 2014 data.

THE World University Rankings' use of data from Scopus rather than from Thomson Reuters is the major change in methodology, but it does not result in major changes in the rankings. Other minor shifts in rankings may be due to changes in methodology by the ranker or by the underlying data provider. For example, in 2014, Thomson Reuters revised its highly cited researchers list, reducing the number of researchers included.

In Part 1 of this article, I update the changes in the composite rankings ARWU, QS, THE, U.S. News and NTU-Taiwan. For more in-depth information about each ranking, refer to the original Access articles. In Part 2, I introduce Reuters Top 100 World’s Most Innovative Universities and Thomson Reuters Web of Science “Usage Count.”


For each of the five rankers, I provide similar information and tables. For those of you not interested in the individual rankers, I have created overview Tables 15.1 and 15.2, listing the top World and Asian universities for these five rankers, using their composite scores. Only Harvard, Stanford and Massachusetts Institute of Technology (MIT) are in the top ten for all world rankings. National University of Singapore (NUS), Peking University, University of Tokyo and Kyoto University are all in the top ten for Asia. Table 15.3 provides more detailed information on the type of data you can find for an individual university, including examples from THE and U.S. News.

The purpose of this article and those that preceded it is not to tell you what your organization should use for its analysis but to provide the tools to select the most applicable ranking sources. Also remember that we have examined other rankers, such as Nature, which updates monthly; the Leiden rankings, which provide the actual underlying scholarly data; Ranking Web of Universities, with over 20,000 institutions; and U-Multirank, in a league of its own.

ARWU: Academic Ranking of World Universities, 15 August 2015



BIBLIOMETRIC DATA Source: article-type publications from the Thomson Reuters Science Citation Index Expanded (SCIE) and Social Science Citation Index (SSCI) for the past year, i.e. 2014

FIRST Year Available: 2003

NUMBER of universities: 500 listed; 100 individually ranked; World #1: Harvard

NUMBER of Asia/Pacific universities: 114; in top 100: 4; Asia #1: Tokyo

FIRST ranking: 2003; in 2003, nine of the top ten Asian universities were Japanese; five are Japanese in 2015

See Table 15 ARWU: Top Ten World and Asian Universities: 2003, 2014, 2015

OUTPUT: World and national rank and score; displays six metric scores, but no re-sorting, downloading or filtering by geography or subject

NOTES: 30% of rankings based on Nobel Prize and award winners; 90% are size dependent; only 100 have individual ranks. Ruth’s Rankings 6.

QS:  World University Rankings®, 15 September 2015



Registration required for full access

BIBLIOMETRIC DATA Source: Scopus used for citations, normalized by faculty and subject.

NUMBER of Universities: >700; 400 with individual ranks; World #1: MIT

NUMBER of Asian Universities (ex Middle East) in top 100: 19; Asia #1: NUS

FIRST ranking: 2004 (jointly with THE), with 200 ranked; 2 each from China, Hong Kong and Singapore, 3 from Japan and 1 from India; in 2015, 2 each from China, Hong Kong, Japan, Singapore and South Korea

See Table 15 QS: Top Asian Universities comparing 2004 and 2015-2016

OUTPUT:  World rank and score; re-sort and re-rank on six indicators; filter by location and faculty (field)

QS provides a digital supplement with more in-depth analysis and scores in chart format (as PDF for top 400 with registration)

CHANGES: Citations are now normalized by faculty (subject area) to balance the high number of citations in the life sciences. Universities with strengths in the social sciences, arts and humanities, and engineering and technology may place higher.
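QS does not publish its exact formula, but the idea behind subject normalization can be sketched in a few lines. The figures and field averages below are invented for illustration, not QS's actual data: each paper's citation count is divided by the world average for its own field, so a humanities paper with 3 citations can outscore a life-sciences paper with 18.

```python
# Illustrative sketch of subject (field) normalization of citations.
# Invented world averages -- not QS's actual figures.
world_avg_citations = {"life_sciences": 12.0, "humanities": 1.5}

# Two hypothetical papers from one university.
papers = [
    {"field": "life_sciences", "citations": 18},  # 1.5x its field average
    {"field": "humanities", "citations": 3},      # 2.0x its field average
]

def normalized_impact(papers, world_avg):
    """Mean ratio of each paper's citations to its field's world average."""
    ratios = [p["citations"] / world_avg[p["field"]] for p in papers]
    return sum(ratios) / len(ratios)

print(round(normalized_impact(papers, world_avg_citations), 2))  # -> 1.75
```

Without normalization, the life-sciences paper would dominate on raw counts; normalized, the humanities paper actually contributes more, which is why universities strong in low-citation fields can rise under the new method.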

NOTES: 50% of QS rankings are based on reputation, with only 20% using bibliometrics. National University of Singapore and Nanyang Technological University ranked 12 and 13 in the world in 2015, a miraculous rise from 22 and 39 in 2014. QS provides a separate Asia ranking using a different methodology. Ruth’s Rankings 5.

THE Times Higher Education World University Rankings 2015-2016, 30 September 2015


BIBLIOMETRIC DATA Source - MAJOR CHANGE: Elsevier/Scopus data, including journal articles, conference proceedings and reviews. Citations to these papers are counted over the six years from 2010 to 2015 and normalized by subject.

NUMBER of Universities: increased to 800, with 200 ranked; World #1: Caltech

NUMBER of Asian universities in top 100: nine; Asia #1: NUS

FIRST ranking: see QS above for the 2004 ranking; in 2015, THE has 2 each from China, Hong Kong, Japan, Singapore and South Korea, with a different South Korean university than QS.

See Table 15 THE: Compared with QS and with THE 2014 Rankings

OUTPUT: Ranking and composite score for the top 200; 5 individual performance scores and 4 institution-supplied scores, with sorting and re-ranking for all 800. Filter by country.

NOTES: 33% of the ranking is based on reputation (only 100 scores displayed)

THE recommends that comparisons not be made with past years because of the change in underlying data, but that did not stop me. Eight out of the top ten are the same; Caltech is still number one, and two more European universities have taken the place of U.S. ones. Seven of the original joint 2004 universities are still in the top 10, the major change being Harvard's drop from first in 2004 to sixth in 2015.

In addition to the change in underlying data, a “very small number of papers” with over 1,000 authors were removed from the data. Ruth’s Rankings 5.

US NEWS:  Best Global University Rankings 6 October 2015



BIBLIOMETRIC DATA Source: Thomson Reuters InCites™; for 2015, using publications from 2009-2013

NUMBER of Universities: 750, all ranked; World #1: Harvard

NUMBER of Asian institutions in top 100: 8; Asia #1: Tokyo; 4 from China, 2 each from Japan and Singapore and one from Hong Kong and South Korea; the total number for all of Asia is 185, including some not in the top 750 ranking

FIRST RANKING: This is the second edition

See Table 15 US News: Compared with THE and QS. These three rankers use reputation as a key metric and have a broad target market.

CHANGES: Added books and conferences to article publications

OUTPUT: Filter by region, country, city and subject; US News displays only the rank. To see the global score and individual indicator ranks, select one institution at a time.

NOTES: 25% of the ranking is based on global and regional reputation. US News is frustrating because of the inability to do multiple comparisons at one time. On the positive side, its bibliometrics use a combination of size-dependent and size-independent metrics, and you can look at an individual institution's scores and see the difference in ranks between the total number of publications and citations and the number of publications in the top 10% of highly cited, or the normalized citation impact. Ruth’s Rankings 11.
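The distinction between size-dependent and size-independent metrics is worth making concrete. In this toy sketch, with entirely invented numbers for two hypothetical universities, a large institution wins on total publications (size-dependent) while a small one wins on citations per paper (size-independent):

```python
# Invented figures for two hypothetical universities.
universities = {
    "Big U":   {"papers": 10000, "citations": 90000, "faculty": 4000},
    "Small U": {"papers": 1500,  "citations": 21000, "faculty": 300},
}

def size_dependent(u):
    """Raw output: rewards sheer institutional size."""
    return u["papers"]

def size_independent(u):
    """Citations per paper: lets a small, high-impact university compete."""
    return u["citations"] / u["papers"]

for name, u in universities.items():
    print(name, size_dependent(u), round(size_independent(u), 1))
# Big U leads on total papers (10000 vs 1500),
# but Small U leads on citations per paper (14.0 vs 9.0).
```

Rankers that mix both kinds of metric, as US News does, capture both scale and intensity; rankers using only size-dependent counts systematically favor large institutions.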

National Taiwan University Rankings– Performance Rankings of Scientific Papers for World Universities 2015 – 10 October 2015



BIBLIOMETRIC DATA Source: Thomson Reuters Essential Science Indicators, Science Citation Index and Social Science Citation Index for the past 11 years and the past year; for 2015, using 2004-2014 and 2014 publications; the specific Thomson Reuters source and years vary per indicator

NUMBER of Universities: 495 ranked for the composite, plus 29 universities without composite scores but ranked on the size-independent publications/faculty metric; World #1: Harvard

NUMBER of Asian Universities: All of Asia includes 110 of the 500, plus an additional 15; Asia #1: U Tokyo

FIRST RANKINGS: 2007, as HEEACT; first National Taiwan ranking, 2012

See Table 15 NTU for Performance Ranking of Scientific Papers for 2015, 2014 and HEEACT

OUTPUT:  Filter by continent and country; displayed in rank order; all universities are ranked and the 11 other scores can all be re-ranked.

NOTES: Except for the “Reference Rank”, all other NTU metrics are size dependent and include 11 years of publications. An advantage of this ranking is the ability to re-rank all of the metrics. Ruth’s Rankings 6.


Reuters Top 100 World’s Most Innovative Universities



This new ranking fits into our bibliometric focus since “the process began by identifying the 500 academic and government organizations that published the greatest number of articles in scholarly journals from 2008 to 2013, as indexed in the Thomson Reuters Web of Science Core Collection database.” Data are extracted from InCites, Web of Science Core Collection, Derwent Innovations Index, Derwent World Patents Index, and Patents Citation Index.

The universities come from 14 countries, with half from the United States. Japan is second with nine, South Korea fourth with eight, and China (12th) and Singapore (14th) have one each. The list uses university “systems” rather than the individual flagship universities, and it only provides a score. Eleven of the 19 Asian innovative universities appear on one of our newest top ten lists. Only two of the world’s top ten universities in our latest updates did not appear on the innovation list. Thomson Reuters also compiles a list of the 100 top innovative companies, using patents cited by other companies as one of the metrics.

Scimago Institutions Rankings calculated innovation knowledge rankings from 2009 to 2014; although the ranking is no longer being produced, those editions are still available to the public, just not updated. It ranks over 500 universities and uses PATSTAT for its data.

New Metric: Web of Science article level  “Usage Count”

Altmetric, Google Scholar and growing repositories such as Mendeley and ResearchGate provide data based on views, downloads or social media activity. Scopus incorporates some of these measures for its articles.

Web of Science released “Usage Count” in September 2015. Thomson Reuters defines this as non-citation “researcher interest” in WOS-indexed articles. Qualifying actions include clicking through from WOS records to the full text of an item, direct exports of records to bibliographic management tools, and exports of records into formats that can be imported into bibliographic management tools. Data are counted since 1 February 2013, along with a daily-updated rolling 180-day count, for all records on the WOS platform. At this time, these data have not been incorporated into the InCites analytical tools. A video is available in multiple languages from our Web of Science training page.
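The two-count structure described above, a cumulative count from a fixed start date plus a rolling 180-day window, can be sketched as follows. This is my own illustrative model of the counting logic, with made-up event dates, not Thomson Reuters' implementation:

```python
# Illustrative sketch of an article-level "usage count" with two tallies:
# events since a fixed start date, and events in a rolling 180-day window.
from datetime import date, timedelta

START = date(2013, 2, 1)       # counts accumulate from this date
WINDOW = timedelta(days=180)   # rolling window for the second count

def usage_counts(event_dates, today):
    """Return (count since START, count within the last 180 days)."""
    since_start = sum(1 for d in event_dates if d >= START)
    rolling = sum(1 for d in event_dates if today - WINDOW <= d <= today)
    return since_start, rolling

# Made-up qualifying events (full-text click-throughs, record exports).
events = [date(2013, 5, 1), date(2015, 9, 1), date(2015, 10, 20)]
print(usage_counts(events, today=date(2015, 10, 26)))  # -> (3, 2)
```

The cumulative count only grows, while the 180-day count rises and falls, which is why the two figures can tell quite different stories about current versus historical interest in an article.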

Figure 15.1 provides examples of “Usage Count” for a pre-publication article and for a highly cited article. A third example highlights the relationship between citations and usage, another aspect of bibliometrics for scientometricians to analyze.

In comparing “Usage Count” to the Scopus metrics, I noticed that Scopus had changed its article-level metrics, incorporating Snowball Metrics (see Ruth’s Rankings 14), as shown in Figure 15.1, example 4.

In November, I will be talking about Ruth’s Rankings at a meeting of the Library Association of Singapore and at Collnet 2015.


Analysis of 2004 THE/QS rankings: World University Rankings, 2004, modified from the Times Higher Education Supplement 2004 World University Rankings (entry by Prachayani Praphamontripong and Daniel Levy), University at Albany, undated, retrieved 17 October 2015.

Ruth’s Rankings

A list of Ruth’s Rankings is here.

*Ruth A. Pagell is currently an adjunct faculty member teaching in the Library and Information Science Program at the University of Hawaii. Before joining UH, she was the founding librarian of the Li Ka Shing Library at Singapore Management University. She has written and spoken extensively on various aspects of librarianship, including contributing articles to ACCESS.