By Ruth A. Pagell*
(26 October 2015) Mid-August to mid-October is a busy time for those of us trying to keep up with the updates in global rankings. Five of our primary sources for composite rankings posted new rankings during this time period. Waiting until mid-year gives the rankers time to incorporate 2014 data.
THE World University Rankings' use of data from Scopus rather than from Thomson Reuters is the major change in methodology, but it does not result in major changes in the rankings. Other minor movements in the rankings may be due to changes in methodology by the ranker or by the underlying data provider. For example, in 2014, Thomson Reuters changed its highly cited researchers list, which reduced the number of researchers on it.
In Part 1 of this article, I update the changes in the composite rankings ARWU, QS, THE, U.S. News and NTU-Taiwan. For more in-depth information about each ranking, refer to the original Access articles. In Part 2, I introduce Reuters Top 100 World’s Most Innovative Universities and Thomson Reuters Web of Science “Usage Count.”
Part 1: UPDATED RANKINGS
For each of the five rankers, I provide similar information and tables. For those of you not interested in the individual rankers, I have created overview Tables 15.1 and 15.2, listing the top World and Asian universities for these five rankers, using their composite scores. Only Harvard, Stanford and Massachusetts Institute of Technology (MIT) are in the top ten for all world rankings. National University of Singapore (NUS), Peking University, University of Tokyo and Kyoto University are all in the top ten for Asia. Table 15.3 provides more detailed information on the type of data you can find for an individual university, including examples from THE and U.S. News.
The purpose of this article and those that preceded it is not to tell you what your organization should use for its analysis but to provide the tools to select the most applicable ranking sources. Also remember that we have examined other rankers, such as Nature, which updates monthly; the Leiden rankings, which provide the actual underlying scholarly data; Ranking Web of Universities, with over 20,000 institutions; and U-Multirank, in a league of its own.
ARWU: Academic Ranking of World Universities, 15 August 2015
RANKING URL: http://www.shanghairanking.com/
METHODOLOGY URL: http://www.shanghairanking.com/ARWU-Methodology-2015.html
BIBLIOMETRIC DATA Source: article-type publications, Thomson Reuters Science Citation Index Expanded (SCIE) and Social Science Citation Index (SSCI) for the past year, i.e. 2014
FIRST Year Available: 2003
NUMBER of universities – 500; 100 individually ranked; World #1: Harvard
NUMBER of Asia/Pacific universities – 114; in top 100: 4; Asia #1: Tokyo
FIRST ranking: 2003; in 2003, nine of the top ten Asian universities were Japanese; five are Japanese in 2015
See Table 15 ARWU: Top Ten World and Asian Universities: 2003, 2014, 2015
OUTPUT: World and national rank and score; displays six metric scores but offers no re-sorting, downloading or filtering by geography or subject
NOTES: 30% of the ranking is based on Nobel Prize and other award winners; 90% of the metrics are size dependent; only the top 100 have individual ranks. Ruth's Rankings 6.
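A composite score of this kind is simply a weighted sum of indicator scores. The sketch below uses ARWU's published weights (Alumni 10%, Award 20%, HiCi 20%, N&S 20%, PUB 20%, PCP 10%, which is where the 30% award figure and the 10% size-independent share come from); the indicator values themselves are invented for illustration.

```python
# ARWU's published indicator weights; indicator scores are on a 0-100 scale.
ARWU_WEIGHTS = {
    "alumni": 0.10,  # alumni winning Nobel Prizes / Fields Medals
    "award": 0.20,   # staff winning Nobel Prizes / Fields Medals
    "hici": 0.20,    # highly cited researchers
    "ns": 0.20,      # papers in Nature and Science
    "pub": 0.20,     # papers indexed in SCIE and SSCI
    "pcp": 0.10,     # per-capita performance (the only size-independent metric)
}

def composite_score(indicators: dict) -> float:
    """Weighted sum of indicator scores."""
    return sum(ARWU_WEIGHTS[k] * v for k, v in indicators.items())

# Invented scores for a hypothetical university.
example = {"alumni": 40.0, "award": 50.0, "hici": 60.0,
           "ns": 55.0, "pub": 70.0, "pcp": 45.0}
print(round(composite_score(example), 1))  # 55.5
```

Because 90% of the weight sits on size-dependent indicators, a large university with mediocre per-capita output can still outscore a small, highly productive one.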
QS: World University Rankings®, 15 September 2015
Registration required for full access
BIBLIOMETRIC DATA Source: Scopus used for citations, normalized by faculty and subject.
NUMBER of Universities – over 700; 400 with individual ranks; World #1: MIT
NUMBER of Asian Universities (excluding the Middle East) in top 100 – 19; Asia #1: NUS
FIRST ranking: 2004 (jointly with THE), with 200 ranked – two each from China, Hong Kong and Singapore, three from Japan and one from India; in 2015, two each from China, Hong Kong, Japan, Singapore and South Korea
See Table 15 QS: Top Asian Universities comparing 2004 and 2015-2016
OUTPUT: World rank and score; re-sort and re-rank on six indicators; filter by location and faculty (field)
QS provides a digital supplement with more in-depth analysis and scores in chart format (as a PDF for the top 400, with registration)
CHANGES: Citations are now normalized by faculty (subject area) to balance the high number of citations in the life sciences. Universities with strengths in the social sciences, arts and humanities, and engineering and technology may place higher.
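QS's exact normalization is its own, but the general idea of field normalization can be sketched as follows; the field names and world-average baselines here are invented for illustration.

```python
# Hypothetical world-average citations per paper by field. Life-science
# papers accumulate far more citations than humanities papers, so raw
# counts are divided by the field baseline before comparison.
WORLD_AVG_CITES_PER_PAPER = {
    "life_sciences": 12.0,
    "engineering": 5.0,
    "arts_humanities": 1.5,
}

def normalized_impact(field: str, cites: int, papers: int) -> float:
    """Citations per paper relative to the field's world average (1.0 = average)."""
    return (cites / papers) / WORLD_AVG_CITES_PER_PAPER[field]

# 3 cites/paper in the humanities outscores 15 cites/paper in life sciences.
print(round(normalized_impact("arts_humanities", 300, 100), 2))  # 2.0
print(round(normalized_impact("life_sciences", 1500, 100), 2))   # 1.25
```

This is why, after the change, a strong humanities or engineering school can place higher than it did under raw citation counts.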
NOTES: 50% of QS rankings are based on reputation, with only 20% using bibliometrics. National University of Singapore and Nanyang Technological University ranked 12th and 13th in the world in 2015, a miraculous rise from 22nd and 39th in 2014. QS provides a separate Asia ranking using a different methodology. Ruth's Rankings 5.
THE – Times Higher Education World University Rankings 2015-2016, 30 September 2015
RANKING URL: https://www.timeshighereducation.com/world-university-rankings/2016/world-ranking#!/page/0/length/25
METHODOLOGY URL: https://www.timeshighereducation.com/news/ranking-methodology-2016
BIBLIOMETRIC DATA SOURCE – MAJOR CHANGE: Elsevier/Scopus data, including journal articles, conference proceedings and reviews. Citations to these papers over the six years from 2010 to 2015, normalized by subject.
NUMBER of Universities – increased to 800, with 200 ranked; World #1: Cal Tech
NUMBER of Asian universities in top 100 – nine; Asia #1: NUS
FIRST ranking: see QS above for the 2004 ranking; in 2015, THE has two each from China, Hong Kong, Japan, Singapore and South Korea, with one South Korean university different from QS's list.
See Table 15 THE: Compared with QS and with THE 2014 Rankings
OUTPUT: Ranking and composite score for the top 200; five individual performance scores and four institution-supplied scores; sorting and re-ranking for all 800; filter by country
NOTES: 33% reputation (only 100 scores displayed)
THE recommends that comparisons not be made with past years because of the change in underlying data, but that did not stop me. Eight out of ten are the same; Cal Tech is still number one, and two European universities have replaced U.S. ones. Seven of the original joint 2004 universities are still in the top ten, the major change being Harvard's drop from first in 2004 to sixth in 2015.
In addition to the change in underlying data, a "very small number of papers" with over 1,000 authors were removed from the data. Ruth's Rankings 5.
US NEWS: Best Global University Rankings 6 October 2015
BIBLIOMETRIC DATA Source: Thomson Reuters InCites™; for 2015, using publications from 2009-2013
NUMBER of Universities – 750, all having ranks; World #1: Harvard
NUMBER of Asian institutions in top 100 – 8; Asia #1: Tokyo; 4 from China, 2 each from Japan and Singapore and one from Hong Kong and South Korea; the total for all Asia is 185, including some not in the top 750 ranking
FIRST RANKING: This is the second edition
See Table 15 US News: Compared with THE and QS. These three rankers use reputation as a key metric and have a broad target market.
CHANGES: Added books and conferences to article publications
OUTPUT: Filter by region, country, city and subject; US NEWS only displays the rank. To see the global score and individual ranks, select one institution at a time.
NOTES: 25% global and regional reputation rankings. US News is frustrating because of its inability to do multiple comparisons at one time. On the positive side, its bibliometrics use a combination of size-dependent and size-independent metrics, and you can look at an individual institution's score and see the difference in ranks between the total number of publications and citations, the number of publications in the top 10% of highly cited, and the normalized citation impact. Ruth's Rankings 11.
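The difference between size-dependent and size-independent metrics can be sketched with invented data: re-ranking the same two universities on a total count versus a rate flips their order.

```python
# All figures are invented for illustration. "Big U" wins on raw volume;
# "Small U" wins once output is scaled by faculty size.
universities = [
    {"name": "Big U",   "pubs": 10000, "faculty": 4000, "top10_pubs": 900},
    {"name": "Small U", "pubs": 2000,  "faculty": 500,  "top10_pubs": 300},
]

for u in universities:
    u["pubs_per_faculty"] = u["pubs"] / u["faculty"]     # size independent
    u["pct_top10"] = 100 * u["top10_pubs"] / u["pubs"]   # size independent

by_total = sorted(universities, key=lambda u: u["pubs"], reverse=True)
by_rate = sorted(universities, key=lambda u: u["pubs_per_faculty"], reverse=True)

print([u["name"] for u in by_total])  # ['Big U', 'Small U']
print([u["name"] for u in by_rate])   # ['Small U', 'Big U']
```

A ranker that blends both kinds of metric, as US News does, rewards volume without completely burying small, efficient institutions.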
National Taiwan University Rankings– Performance Rankings of Scientific Papers for World Universities 2015 – 10 October 2015
BIBLIOMETRIC DATA Source: Thomson Reuters Essential Science Indicators, Science Citation Index and Social Science Citation Index for the past 11 years and the past year; for 2015, using 2004-2014 and 2014 for publications; the specific Thomson Reuters source and years vary per indicator
NUMBER of Universities: 495 ranked for the composite, plus 29 universities without composite scores but ranked on the size-independent publications/faculty metric; World #1: Harvard
NUMBER of Asian Universities: all Asia includes 110 of the 500, with an additional 15; Asia #1: U Tokyo
FIRST RANKINGS: 2007, as HEEACT; first National Taiwan ranking, 2012
See Table 15 NTU for Performance Ranking of Scientific Papers for 2015, 2014 and HEEACT
OUTPUT: Filter by continent and country; displayed in rank order; all universities are ranked and the 11 other scores can all be re-ranked.
NOTES: Except for the "Reference Rank," all NTU metrics are size dependent and include 11 years of publications. An advantage of this ranking is the ability to re-rank all of the metrics. Ruth's Rankings 6.
Part 2: NEW RANKING and NEW METRIC
Reuters Top 100 World’s Most Innovative Universities
This new ranking fits into our bibliometric focus since "the process began by identifying the 500 academic and government organizations that published the greatest number of articles in scholarly journals from 2008 to 2013, as indexed in the Thomson Reuters Web of Science Core Collection database." Data are extracted from InCites, the Web of Science Core Collection, the Derwent Innovations Index, the Derwent World Patents Index, and the Patents Citation Index.
The universities come from 14 countries with half from the United States. Japan is second with nine, South Korea fourth with eight and China 12th and Singapore 14th with one each. The list uses university “systems” rather than the individual flagship universities and it only provides a score. Eleven of the 19 Asian innovative universities appear on one of our newest top ten lists. Only two of the world’s top ten universities in our latest updates did not appear on the innovation list. Thomson Reuters also compiles a company list of the 100 top innovators, using patents cited by other companies as one of their metrics.
Scimago Institutions Rankings calculated innovation knowledge rankings from 2009 to 2014; the rankings remain available to the public but are no longer updated. They cover over 500 universities and use PATSTAT for their data.
New Metric: Web of Science article level “Usage Count”
Altmetric, Google Scholar and growing repositories such as Mendeley and ResearchGate provide data based on views, downloads or social media activity. Scopus incorporates some of these measures into its article-level metrics.
Web of Science released "Usage Count" in September 2015. Thomson Reuters defines this as non-citation "researcher interest" in WOS-indexed articles. These actions include "clicking through from WOS records to the full-text of an item; direct exports of records to bibliographic management tools; exports of records into formats that can be imported into bibliographic management tools. Data are counted since 1 February 2013 or as a daily change for a rolling 180 days for all records on the WOS platform." At this time, these data have not been incorporated into the InCites analytical tools. A training video is available in multiple languages from the Web of Science training page: http://wokinfo.com/training_support/training/web-of-science/
Figure 15.1 provides examples of "Usage Count" for a pre-publication article and for a highly cited article. A third example highlights the relationship between cites and usage, another aspect of bibliometrics for scientometricians to analyze.
In comparing “Usage Count” to the Scopus metrics, I noticed that Scopus had changed its article level metrics, incorporating Snowball Metrics (see Ruth’s Rankings 14) as shown in Figure 15.1 example 4.
In November, I will be talking about Ruth’s Rankings at a meeting of the Library Association of Singapore and at Collnet 2015.
Analysis of 2004 THE/QS Rankings: World University Rankings, 2004, modified from the Times Higher Education Supplement 2004 World University Rankings (entry by Prachayani Praphamontripong and Daniel Levy), University at Albany, undated, retrieved 17 October 2015. http://www.albany.edu/dept/eaps/prophe/data/International_Data/WorldUniversityRanking2004_ModifiedFromTHES.pdf
- Introduction: Unwinding the Web of International Research Rankings
- A Brief History of Rankings and Higher Education Policy
- Bibliometrics: What We Count and How We Count
- The Big Two: Thomson Reuters and Scopus
- Comparing Times Higher Education (THE) and QS Rankings
- Scholarly Rankings from the Asian Perspective
- Asian Institutions Grow in Nature
- Something for Everyone
- Expanding the Measurement of Science: From Citations to Web Visibility to Tweets
- Do-It-Yourself Rankings with InCites
- U S News & World Report Goes Global
- U-Multirank: Is it for “U”?
- A look back before we move forward
- SciVal – Elsevier’s research intelligence – Mastering your metrics
- Analyzing 2015-2016 Updated Rankings and Introducing New Metrics
- The much maligned Journal Impact Factor
- Wikipedia and Google Scholar as Sources for University Rankings – Influence and popularity and open bibliometrics
- Rankings from Down Under – Australia and New Zealand
- Rankings from Down Under Part 2: Drilling Down to Australian and New Zealand Subject Categories
- World Class Universities and the New Flagship University: Reaching for the Rankings or Remodeling for Relevance
- Flagship Universities in Asia: From Bibliometrics to Econometrics and Social Indicators
- Indian University Rankings – The Good the Bad and the Inconsistent
- Are Global Higher Education Rankings Flawed or Misunderstood? A Personal Critique
- Malaysia Higher Education – “Soaring Upward” or Not?
- THE Young University Rankings 2017 – Generational rankings and tips for success
- March Madness – The rankings of U.S. universities and their sports
- Reputation, Rankings and Reality: Times Higher Education rolls out 2017 Reputation Rankings
- Japanese Universities: Is the sun setting on Japanese higher education?
- From Bibliometrics to Geopolitics: An Overview of Global Rankings and the Geopolitics of Higher Education edited by Ellen Hazelkorn
- Hong Kong and Singapore: Is Success Sustainable?
- Road Trip to Hong Kong and Singapore – Opening new routes for collaboration between librarians and their stakeholders
- The Business of Rankings – Show me the money
- Authors: People and processes
- Authors: Part 2 – Who are you?
- Come together: May updates lead to an investigation of Collaboration
- Innovation, Automation, and Technology Part 1: From Scholarly Articles to Patents; Innovation, Automation, and Technology Part 2: Innovative Companies and Countries
- How Important are Journal Quality Metrics in the Era of Predatory Journals? Part 1: Journal Citation Metrics; Part 2: How Important are Journal Quality Metrics in the Era of Potential/Possible/Probable Predatory Publishers and Publications?
- Coming Attractions: The UN Sustainable Development Goals and Times Higher Education Innovation and Impact Rankings Demystified
- Business School Rankings: Monkey Business for an Asia/Pac audience
- Deconstructing QS Subjects and Surveys
- THE’s University Impact Rankings and Sustainable Development Goals: Are these the most impactful universities in the world?
- ASEAN – a special analysis of ASEAN nations
- Predatory practices revisited – misunderstandings and positive actions
*Ruth A. Pagell is currently adjunct faculty teaching in the Library and Information Science Program at the University of Hawaii. Before joining UH, she was the founding librarian of the Li Ka Shing Library at Singapore Management University. She has written and spoken extensively on various aspects of librarianship, including contributing articles to ACCESS. https://orcid.org/0000-0003-3238-9674