By Ruth A. Pagell*
(26 December 2014) I began researching the intersection of bibliometrics and university research rankings in preparation for a talk I presented at CONCERT, the CONsortium on Core Electronic Resources in Taiwan, in 2008. At that time, there were only 74 articles in all of Web of Science (WOS) about bibliometrics or citations and universities and research rankings. In the past six years, the number of articles has more than tripled. In 2008, there were only three world-ranking options: the commercial THE-QS World University Rankings; the groundbreaking Academic Ranking of World Universities (ARWU); and the rankings from HEEACT, the Higher Education Evaluation and Accreditation Council of Taiwan. We will be featuring at least 10 different options in this series.
Last month in Ruth’s Rankings 5, we examined the two now highly recognized commercial global rankings headquartered in the UK, THE and QS. This month we shift our attention to two rankings from Asian institutions that apply quantitative scholarly metrics: the Academic Ranking of World Universities (ARWU) from Shanghai Jiao Tong University and the Performance Ranking of Academic Papers, now compiled and published by National Taiwan University as the NTU Rankings.
There are similarities in the purpose of these rankings but not in the metrics. Both measure academic performance and both use Thomson Reuters data. Both methodologies favor institutions in the sciences and both have separate rankings for different scientific disciplines. Only the metric for research output, measured by articles in WOS, is similar. We will compare the two on methodology, content and rankings for the world and Asia, excluding the Middle East.
The Center for World-Class Universities at Shanghai Jiao Tong University has compiled ARWU since 2003. It was the first attempt at a global ranking methodology based on scientific performance. In addition to ranking the top 500 world universities and the top 200 in selected fields and subjects, the Center compiles data on 1,200 universities through its Global Research Universities Profiles (GRUP).
ARWU rankings include all universities with alumni and staff winning Nobel Prizes and Fields Medals, highly cited researchers, authors of papers published in Nature and Science, and universities with significant numbers of papers indexed by Science Citation Index-Expanded (SCIE) or Social Science Citation Index (SSCI).
In 2014, the top 500 came from 42 different countries, with just three countries (the U.S., U.K. and Switzerland) making up the entire top 20. See Figure 1 for the number of Asian universities by country included in the top 500.
ARWU assigns a score of 100 to the highest-ranked institution in each category and calculates scores for the other institutions as a percentage of that top score. An institution’s rank reflects the number of institutions that sit above it. Harvard is ranked number one overall, with a score 28 points higher than its nearest competitor.
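As a concrete illustration, the percentage-of-top scoring works like this in a few lines of Python (the institution names and raw counts below are hypothetical, not ARWU data):

```python
# ARWU-style scaling: the top institution in a category scores 100;
# every other institution scores a percentage of the top raw value.
raw = {"Univ A": 540, "Univ B": 380, "Univ C": 120}  # hypothetical raw counts

top = max(raw.values())
scores = {u: round(100 * v / top, 1) for u, v in raw.items()}

# Rank reflects how many institutions sit above each one.
ranking = sorted(scores, key=scores.get, reverse=True)
print(scores)  # {'Univ A': 100.0, 'Univ B': 70.4, 'Univ C': 22.2}
```

Note how the scaling preserves the size of the gaps: a 28-point spread between first and second place means the runner-up's raw indicator value is a little under three-quarters of the leader's.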
Unique to ARWU is basing 30% of its ranking on Nobel Prizes and Fields Medals in mathematics.
The interface displays the world rank, national rank and total score, and you can view the score for each of six individual categories. It is possible to download the whole list, including all six categories: Alumni, Awards, Highly Cited, Nature & Science, PUBlications and Publications Per Capita. You can then re-sort or re-weight your spreadsheet on the categories that are important for your institution. There are no separate rankings for countries or regions, although a national rank is provided. You can also click on the “Universities” tab to search for a profile of an individual university.
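Re-weighting the downloaded categories could look like the following sketch (the universities, scores and weights here are hypothetical, chosen for illustration; ARWU’s own official weights differ):

```python
# Re-weight three downloaded category scores with custom weights
# reflecting one hypothetical institution's priorities.
rows = [
    {"name": "Univ A", "Alumni": 90.0, "Awards": 80.0, "HiCi": 70.0},
    {"name": "Univ B", "Alumni": 60.0, "Awards": 85.0, "HiCi": 95.0},
]
weights = {"Alumni": 0.2, "Awards": 0.3, "HiCi": 0.5}  # must sum to 1

def composite(row):
    """Weighted sum of the selected category scores."""
    return sum(row[cat] * w for cat, w in weights.items())

rows.sort(key=composite, reverse=True)
print([r["name"] for r in rows])  # ['Univ B', 'Univ A']
```

With these weights, an institution strong in highly cited researchers overtakes one strong in alumni prizes, which is exactly why re-weighting to match local priorities can reorder a published league table.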
ARWU modifies its rankings for fields as shown in Table 2.
The NTU Rankings are a continuation of the rankings from HEEACT, first released in 2007. The initial rankings included the top 500 universities worldwide and field rankings. In 2010, NTU took over the field rankings and in 2012, NTU officially took over the execution and release of the rankings. Today the rankings include 6 fields and 14 subjects from four of those fields.
Universities are included based on data from Thomson Reuters Essential Science Indicators (ESI) and comparisons with ARWU, THE, QS, and U.S. News, creating a core group of 903 universities. To get more complete data for the past full year, data are extracted in April and published in October.
Emphasis is on scholarly output as measured by the bibliometrics we discussed in Ruth’s Rankings 3. The rankings use a variety of metrics derived from publication and citation counts for the past 11 years and the current year. Unique to the NTU Rankings is an institution’s h-index for the past two years (see Ruth’s Rankings 3).
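For readers who want to see the h-index mechanics, here is a minimal sketch (the citation counts are made up): an institution has h-index h if h of its papers each have at least h citations.

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    h = 0
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i  # the i-th most-cited paper still has >= i citations
        else:
            break
    return h

# Five hypothetical papers with 10, 8, 5, 4 and 3 citations:
print(h_index([10, 8, 5, 4, 3]))  # -> 4
```

Because a two-year window is used, the NTU h-index rewards recent, sustained impact rather than a long citation tail.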
NTU also uses scores, with the highest-ranking university in each category receiving a score of 100. The composite score for Harvard, the number one university, is 98. The differentials are much smaller in the NTU rankings, with the number 2 university just 7.4 points below Harvard and the number 30 university 28 points below. A separate Reference Ranking normalizes the scores for the number of articles and citations based on the number of full-time faculty. Institutions from 44 countries are in the list of 500, with four countries (the US, UK, Canada and Japan) placing at least one university in the top 20. See Table 3, NTU Methodology and Content.
NTU does not have separate rankings with different weightings by continent or country, but you can filter the list to get the universities in Asia/Pacific, for example. Ten universities from China, Japan, Singapore, South Korea and Taiwan are in the top 100 worldwide. China and Japan are heavily represented in the top 500, with 29 universities from China and 20 from Japan, more than the number from Australia. Other Asian countries and territories in the top 500 are Hong Kong, India, Malaysia and Thailand.
The interface displays only the total score and the normalized reference rank; however, I discovered that by copying and pasting the data into a spreadsheet I could capture all 10 individual scores. Several smaller institutions, not in the top 20 for the region, score highly on average citations per paper.
There are separate listings for the subfields in each of the six fields and a list of all the specific subjects within a subject category. The categories in these rankings are heavily weighted toward those universities that are strong in the sciences. There is nothing in the methodology to indicate that scores are adjusted to account for the varying number of citations among disciplines.
Table 4 displays the comparison rankings of ARWU and NTU for the world and Asia. Table 5 (ARWU) and Table 6 (NTU) display the individual rankings, with the placement of the Asian universities in the world rankings. Given the different scholarly characteristics these two rankings measure, only two of the top five are the same (Harvard and Stanford), and five of the top ten are the same in 2014, the other three being the University of California Berkeley, the Massachusetts Institute of Technology and Columbia University. The Asian rankings show more similarity, with eight of NTU’s top Asian universities in ARWU’s top 12 list, as shown in Table 4.
Table 5 also displays the initial 2003 rankings. There is no change in the universities making up ARWU’s world top 11 between the 2003 rankings and the 2014 rankings. There is likewise no change in the top 11 from the original HEEACT rankings to the 2014 NTU rankings (Table 6). Given the methodologies used by both systems, this is not surprising, especially for the universities at the top. For example, from 1901 to 2014 only 889 individuals received Nobel Prizes across the five disciplines plus Peace that feed into ARWU. For NTU, three of its metrics compile data over an 11-year period.
A broader range of universities is represented in the field and subject rankings, as shown in Table 7. What is notable in these field rankings is the prominence of Asian universities in the scientific categories, primarily engineering. Tsinghua University is number one in three ARWU fields.
The ARWU and NTU rankings are of most interest to policy and government agencies and to research universities and scholars. Because both are based on long-term measurements, there is less fluctuation at the top and less that a government body or an individual institution can do in the short term to change its position.
In the next article we look at how two research institutions, Scimago Labs (Spain) and Leiden University’s Center for Science and Technology Studies (Netherlands), further manipulate the data.
- Introduction: Unwinding the Web of International Research Rankings
- A Brief History of Rankings and Higher Education Policy
- Bibliometrics: What We Count and How We Count
- The Big Two: Thomson Reuters and Scopus
- Comparing Times Higher Education (THE) and QS Rankings
- Scholarly Rankings from the Asian Perspective
- Asian Institutions Grow in Nature
- Something for Everyone
- Expanding the Measurement of Science: From Citations to Web Visibility to Tweets
- Do-It-Yourself Rankings with InCites
- U.S. News & World Report Goes Global
- U-Multirank: Is it for “U”?
- A Look Back Before We Move Forward
- SciVal – Elsevier’s research intelligence – Mastering your metrics
- Analyzing 2015-2016 Updated Rankings and Introducing New Metrics
- The much maligned Journal Impact Factor
- Wikipedia and Google Scholar as Sources for University Rankings – Influence and popularity and open bibliometrics
- Rankings from Down Under – Australia and New Zealand
- Rankings from Down Under Part 2: Drilling Down to Australian and New Zealand Subject Categories
- World Class Universities and the New Flagship University: Reaching for the Rankings or Remodeling for Relevance
- Flagship Universities in Asia: From Bibliometrics to Econometrics and Social Indicators
- Indian University Rankings – The Good the Bad and the Inconsistent
- Are Global Higher Education Rankings Flawed or Misunderstood? A Personal Critique
- Malaysia Higher Education – “Soaring Upward” or Not?
- THE Young University Rankings 2017 – Generational rankings and tips for success
- March Madness –The rankings of U.S universities and their sports
- Reputation, Rankings and Reality: Times Higher Education rolls out 2017 Reputation Rankings
- Japanese Universities: Is the sun setting on Japanese higher education?
- From Bibliometrics to Geopolitics: An Overview of Global Rankings and the Geopolitics of Higher Education edited by Ellen Hazelkorn
- Hong Kong and Singapore: Is Success Sustainable?
- Road Trip to Hong Kong and Singapore – Opening new routes for collaboration between librarians and their stakeholders
- The Business of Rankings – Show me the money
- Authors: People and processes
- Authors: Part 2 – Who are you?
- Come together: May updates lead to an investigation of Collaboration
- Innovation, Automation, and Technology Part 1: From Scholarly Articles to Patents; Innovation, Automation, and Technology Part 2: Innovative Companies and Countries
- How Important are Journal Quality Metrics in the Era of Predatory Journals? Part 1: Journal Citation Metrics; Part 2: How Important are Journal Quality Metrics in the Era of Potential/ possible/ probable predatory publishers and publications?
- Coming Attractions: The UN Sustainable Development Goals and Times Higher Education Innovation and Impact Rankings Demystified
- Business School Rankings: Monkey Business for an Asia/Pac audience
- Deconstructing QS Subjects and Surveys
- THE’s University Impact Rankings and Sustainable Development Goals: Are these the most impactful universities in the world?
*Ruth A. Pagell is currently teaching in the Library and Information Science Program at the University of Hawaii. Before joining UH, she was the founding librarian of the Li Ka Shing Library at Singapore Management University. She has written and spoken extensively on various aspects of librarianship, including contributing articles to ACCESS. https://orcid.org/0000-0003-3238-9674