By Ruth A. Pagell*
(22 June 2016) Most universities will never reach the top 200 in composite rankings; nor are world-class composite rankings relevant for most of the world's universities. If you take the time to investigate the different rankers and their specific subject categories, you may find hidden performance gems at the subject level. You will also find discrepancies among categories, metrics and national evaluations.
Ruth's Rankings 18 presents background information and composite rankings for Australian and New Zealand universities. When evaluating the performance of their higher education institutions at a national level, both countries focus on specific subject areas. See Table 18.3 for composite rankings for Australian universities and Table 18.4 for New Zealand universities.
To prepare Ruth's Rankings 19, I examined subject rankings in three ways: fixed rankings from the global rankers; interactive data from Web of Science and SciVal to create customized rankings; and national rankings or ratings from the Australian and New Zealand oversight bodies.
SUBJECT RANKINGS
Each ranking or rating source uses its own definitions of subject categories, its own set of metrics and calculations, and different qualifying universities, time periods and minimum requirements. They also change methodology from year to year, so a precise comparison among rankers, or a precise year-on-year comparison for the same ranker, may not be possible. A hint for analyzing subject categories: identify areas where your country is outperforming a world average or where your university is outperforming its composite world rank, as in the sketch below.
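As a quick illustration of that hint, here is a minimal Python sketch with hypothetical university names and ranks (not taken from any ranker) that flags subjects where a university's subject rank beats its composite world rank:

```python
# Hypothetical data only: flag subjects where the subject rank
# outperforms the institution's composite world rank.

composite_rank = {"Uni A": 250, "Uni B": 480}          # assumed composite world ranks
subject_ranks = {                                      # assumed subject-level ranks
    "Uni A": {"Agricultural Sciences": 45, "Physics": 310},
    "Uni B": {"Sports Science": 80, "Chemistry": 520},
}

for uni, composite in composite_rank.items():
    for subject, rank in subject_ranks[uni].items():
        if rank < composite:  # a lower rank number means better performance
            print(f"{uni}: {subject} rank {rank} beats composite rank {composite}")
```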
Standard Ranking Sources
Appendix A includes tables (19.A.1 to 19.A.6) covering broad subject categories for these sources. The tables exemplify different aspects of subject rankings.
Table 19.A.1: Times Higher Education World University Rankings uses six broad subjects; in 2016 it ranked the top 100 using Scopus citation data, while in 2011 it ranked the top 50 using Thomson Reuters data. Because of the change in bibliometric data source and the additional universities, meaningful comparisons cannot be made, but the table illustrates the effect of methodological changes. Given the broad subject categories and the limited number of universities ranked per category, there are no surprises in this ranking.
Table 19.A.2: QS Top Universities World University Rankings includes the top 400 in five broad "faculty" categories (2015). The table shows the top ten in each category. Seventeen Australian universities are on the list; the table also shows all New Zealand universities. QS further evaluates universities in 42 subjects (2016), discussed later.
Table 19.A.3: US News Global incorporates InCites Essential Science Indicators' 22 subjects into its global interface. Some of these categories map to the other rankers' broader fields and some to specific subjects. The number of ranked universities per category varies by subject, from 100 to 250.
Table 19.A.4: Academic Ranking of World Universities (ARWU) applies five fields and five subjects, selecting the top 200 from a pool of 1,200 universities. The 2016 rankings became available on 15 June. When using ARWU for subjects, check the methodology to see whether this ranker is the best fit for your university.
Table 19.A.5: National Taiwan University Performance Rankings of Scientific Papers for World Universities has six “fields” and 14 subjects to be examined later.
Table 19.A.6: CWTS Leiden Ranking does not produce composite scores. The 2016 default list is arranged by total publications, based on fractional counting; until 2016, the default was the percentage of papers in the top 10% of their field, also based on fractional counting. All covered universities are ranked by subject field and subject. Under the top-10% methodology, smaller universities rise to the top.
Also included in Table 19.A.6 are the Australian and New Zealand universities in the top 100 of the 2016 Nature Index, which has four broad subject categories. All institutions are listed under their countries by subject; however, the global list includes only the top 100. Nature will release the Nature Index 2016 Australia & New Zealand for the first time in October 2016. According to the website, "It will examine the performance of individual institutions in the Nature Index based on their output of high quality science and their patterns of collaboration with domestic and international universities."
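For readers unfamiliar with the fractional (fractionalized) counting used in the Leiden entry above, the sketch below illustrates the basic idea with made-up papers; the actual CWTS method weights by each institution's share of author addresses, simplified here to an equal split:

```python
# Simplified fractional counting: each paper contributes 1/N of a
# publication to each of its N contributing institutions.
from collections import defaultdict

papers = [
    ["Uni A", "Uni B"],           # co-publication: 0.5 each
    ["Uni A"],                    # sole institution: 1.0
    ["Uni A", "Uni B", "Uni C"],  # three institutions: 1/3 each
]

fractional_output = defaultdict(float)
for institutions in papers:
    share = 1 / len(institutions)
    for inst in institutions:
        fractional_output[inst] += share

for inst, count in sorted(fractional_output.items(), key=lambda x: -x[1]):
    print(inst, round(count, 2))  # Uni A 1.83, Uni B 0.83, Uni C 0.33
```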
INCITES AND SCIVAL
WoS' InCites and Elsevier's SciVal provide interactive subject-level access to institutional and country-level data for a wide range of metrics. InCites schemas include the Essential Science Indicators' 22 subjects used by U.S. News Global; the Web of Science schema with 251 specific subjects, covering all fields from Acoustics to Zoology; and the six-category GIPP (Global Institutions Profiles Project), which is currently used in THE's subject rankings. InCites also maps to the Australian schema at the two- and four-digit description levels. Other mapping options include the following:
- ANVUR (14 categories) – Italian evaluation;
- CHINA SCAD (77 narrow subject categories) – State Council of China;
- FAPESP (9 broad and 81 narrow categories) – used by the Brazilian state of São Paulo;
- OECD (six broad categories, 45 subjects) – maps to OECD data; useful when comparing bibliometric data with country-level statistical data;
- UK REF 2014 (36 categories) – mapped to InCites; this schema includes sports, one of New Zealand's subjects not elsewhere mapped. New Zealand ranked 13th in the world in output using this schema;
- KAKEN (10 broad and 66 specific categories) – Japanese grants database and the newest addition to the schema choices.
SciVal uses 24 Scopus subject categories, expandable to over 300 research areas, and five metrics: publications, authors, citations, citations per publication and field-weighted citation impact. SciVal generates ranked lists for the world, a region or an individual country. The State of Australian University Research evaluation used Scopus citation data for its 2015/2016 report, and it is yet to be determined which data sets will be used for the next round of evaluation.
If you have a subscription to either of these services, you can create your own rankings using your choice of category, metric and time period.
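Neither tool is scripted here directly, since both are interactive subscription services; as a hedged illustration, the sketch below assumes you have exported subject-level metrics to a CSV file (hypothetical file and column names) and builds a simple custom ranking from it:

```python
# Assumes an export named exported_metrics.csv with hypothetical columns:
# institution, subject, publications, cnci.
import csv

with open("exported_metrics.csv", newline="", encoding="utf-8") as f:
    rows = [r for r in csv.DictReader(f)
            if r["subject"] == "Agricultural Sciences"]

# Build a custom ranking for one subject, ordered by CNCI
rows.sort(key=lambda r: float(r["cnci"]), reverse=True)
for rank, row in enumerate(rows, start=1):
    print(rank, row["institution"], row["cnci"])
```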
AUSTRALIA AND NEW ZEALAND
Subject rankings are important for Australia and New Zealand, and both countries' internal evaluations are carried out at a subject-specific level. It is in specific subject areas that some regional and local universities are able to shine. To get a better sense of how these countries and their institutions are performing, I drilled down into the underlying data and compared nationally rated strengths with world rankings. Figures 19.1, 19.2, 19.3 and 19.4 compare the top subjects for Australia and New Zealand against world norms using the InCites Australia Level 1 and ESI schemas and SciVal.
ERA (Excellence in Research for Australia, run by the Australian Research Council) uses a schema of two- and four-digit discipline codes applied to each university. The State of Australian University Research report first examines each subject category in depth and then provides ratings for 41 universities. The ratings are based on "the principle of expert review informed by indicators." Discipline-specific research evaluation committees rate institutions on a five-point scale, from five (well above world standard) to one (well below world standard).
Table 19.2 is in two parts. It includes a chart showing the ERA distribution of ratings for two-digit codes and a comparison of those ratings to Web of Science publications and CNCI (Category Normalized Citation Impact). Appendix B drills down into category 07, Agricultural and Veterinary Sciences, with three examples from the State of Australian University Research and two tables. The examples are codes and university ratings; codes and metrics; and analysis. The tables compare ERA ratings with InCites and SciVal and also include New Zealand.
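Because CNCI does much of the work in these comparisons, a small worked example may help. A paper's citations are divided by the expected (world-average) citations for papers of the same category, publication year and document type, and an institution's CNCI is the mean of those ratios; the baseline values below are hypothetical:

```python
# Worked CNCI sketch with hypothetical baselines: a value of 1.0 means
# citation performance exactly at the world average for the category.
papers = [
    {"citations": 12, "expected": 8.0},   # above the world baseline
    {"citations": 3,  "expected": 6.0},   # below the world baseline
    {"citations": 10, "expected": 10.0},  # at the world baseline
]

cnci = sum(p["citations"] / p["expected"] for p in papers) / len(papers)
print(round(cnci, 2))  # (1.5 + 0.5 + 1.0) / 3 = 1.0
```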
New Zealand
Ruth's Rankings 18 explains the New Zealand bodies involved in evaluating and funding tertiary education. The Tertiary Education Commission has conducted three rounds of evaluation for the PBRF (Performance-Based Research Fund). Institutions receive funding allocations based on the quality evaluation (60%), research degree completions (RDC, 25%) and external research income (ERI, 15%). New Zealand has only eight universities, all of which are included in this article's rankings. Appendix C: Figure 19.C.1 shows the allocation of funding for the universities. Appendix C: Figure 19.C.2 contains all of the PBRF categories.
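To make the 60/25/15 split concrete, here is an illustrative calculation with assumed component scores; it shows only the weighting arithmetic, not the actual PBRF funding formula:

```python
# Illustrative arithmetic only, with assumed component scores on a common
# 0-100 scale.
weights = {"quality_evaluation": 0.60,
           "research_degree_completions": 0.25,
           "external_research_income": 0.15}

scores = {"quality_evaluation": 70,           # assumed quality evaluation score
          "research_degree_completions": 55,  # assumed RDC score
          "external_research_income": 40}     # assumed ERI score

weighted_total = sum(weights[k] * scores[k] for k in weights)
print(weighted_total)  # 0.60*70 + 0.25*55 + 0.15*40 = 61.75
```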
It is difficult to carry out comparisons because New Zealand's 42 subjects do not map neatly to any one schema. Table 19.C.1 (in Appendix C) is a comparison, based on the closest subject match, with the QS categories. QS' methodology includes four indicators with weightings modified by discipline: academic reputation, employer reputation, research citations per paper and h-index. At least 75% of the weighting for each subject is based on reputation. The most recent PBRF quality evaluation was in 2012; the next is scheduled for 2018.
New Zealand's quality scores do not align well with external metrics, and New Zealand universities do not appear in many subject rankings that use a top-100 or top-200 cutoff.
CONCLUSION
Every ranking and rating source uses its own subject classifications, from very broad fields to department-level terminology, and each uses different metrics to determine performance within a category. Countries and institutions need to decide whether being at the top on the world stage is more important than meeting the needs of their populations. This is clear from our examination of New Zealand, where the terminology and scoring used to fund research do not map to the country's comparative advantage on the world stage or to the areas that give countries and institutions higher rankings.
References:
Appendix D – a concordance of ERA, PBRF, InCites ESI and SciVal subject categories.
State of Australian University Research Volume 1 ERA National Report (2015). Commonwealth of Australia.
Performance-Based Research Fund: Evaluating Research Excellence – the 2012 Assessment (October 2013). Tertiary Education Commission.
Ruth’s Rankings
A list of Ruth’s Rankings and News Updates is here.
*Ruth A. Pagell is currently an adjunct faculty member teaching in the Library and Information Science Program at the University of Hawaii. Before joining UH, she was the founding librarian of the Li Ka Shing Library at Singapore Management University. She has written and spoken extensively on various aspects of librarianship, including contributing articles to ACCESS – https://orcid.org/0000-0003-3238-9674.