By Ruth A. Pagell*
Q: What is the highest ranked Asian university for Multidisciplinary and how old is it?
Q: What is the top university in Oceania for multidisciplinary research? (Appendix A)
Q: What Asian university has the most alumni working in a Forbes 2000 company? (Appendix B)
Q: In 2020, what university had the most research income per institutional income in Natural Sciences? (Appendix C)
Q: What university is number one in Library & Information Science? (Appendix D)
(12 Aug 2021) I would never go shopping in a clothing store where all the clothes were one size, one colour, and unisex. That is like expecting to find all the information needed about a university in one ranking. As I proofread this article, and was having an email discussion with Ural Akbulut, head of URAP (University Ranking by Academic Performance), we agreed that decision makers need to consult multiple rankings.
In my personal quest for new metrics, I used IREG approved rankings. Ruth’s Rankings 47 Part 1, “Hunting for New University Rankings Metrics” provides background information on the need to re-evaluate existing rankings. New rankings continue to appear (IREG, Eurasian) and methods of aggregating rankings have been explored (Holmes). Individual criticisms of existing rankings continue, an example being the recent article on global rankings and recolonisation (Wan). The International Network of Research Management Societies (INORMS) Research Evaluation Working Group has been working on a method to rate global university rankings. They developed a framework for evaluating rankings based on Good Governance, Transparency, Measure what Matters, and Rigour (Gadd, Holmes, & Shearer).
I began my research by asking myself three questions:
- Will I uncover any new and relevant measurement areas, especially any based on the expanding role of universities?
- Will there be noticeable differences among the rankings of top universities?
- Will the functionality of the ranking interface allow me to customize the data to my needs?
This article provides brief overviews and a more detailed individual appendix for each ranking. The appendices contain similar details and each ranking’s current world and Asia top 10. See Table 47.2.1 for the top 10 in the world and Table 47.2.2 for the top 10 in Asia across multiple rankings, along with an analysis of the results. MIT, Harvard, and Stanford have the most top 10 appearances in the world, while Tsinghua, Peking, the National University of Singapore, and the University of Tokyo are the leaders in Asia.
“NEW” RANKINGS USING CLARIVATE METRICS:
These rankings use data from Clarivate for their bibliometrics. The bibliometric share of each methodology varies from 100% down to 13%.
Ruth’s Rankings 47 Part 1 covered WURR, the World University Research Ranking. The first edition compares QS’ 2020 world top 250 universities with the WURR rankings, which are based on three bibliometric aspects: Research Impact, Research Collaboration, and a new metric, Research Multidisciplinary, with seven equally weighted bibliometric indicators. The dataset includes 108 universities from Europe, 65 from North America, 51 from Asia, 19 from Oceania, and 7 from Africa and South America. Using these metrics results in a list of top universities that differs from other rankings.
Unique Indicator: Multidisciplinary, which produces the most distinctive ranking results.
Top 10 Rankings: Only two of WURR’s world top 10, MIT and ETH Zurich, appear in any other ranking’s top 10. Eight of its Asian top ten appear in other rankings.
Functionality: Re-sort by the three areas, location, size, and age. Cut and paste to download.
CWUR, first published in 2012, includes 2,000 universities from 94 countries in its 2021-2022 edition. Universities are ranked on Quality of Education, Alumni Employment, and Quality of Faculty, each with only one metric, and on Research Performance, with four indicators. 40% of the indicator weighting comes from bibliometric data. The interface includes the world and country ranks, the rank for each indicator, and a total score. No data come from the universities or from surveys.
According to IREG (CWUR), the rankings are unusual. I find the methodology questionable: one metric is insufficient to measure Quality of Education or Quality of Faculty, and holding a senior position in a Forbes 2000 company might be more appropriate for business school rankings. Some universities only have a rank for Research Performance and an overall score.
Unique Indicators: Metrics for Quality of Education, Alumni Employment, and Quality of Faculty
Top 10 Rankings: Eight of the CWUR’s world top 10 are the same as THE and ARWU; none of the world’s top 10 are unique; three of Asia’s are unique, two from Israel and one from Taiwan.
Functionality: Re-rank by individual country. Cut and paste the entire dataset.
RUR, first published in 2013, includes 869 universities from 73 countries in its 2021-2022 world ranking. Universities are ranked in four areas: Teaching, Research, International Diversity, and Financial Stability, with 20 indicators, five per area. 2020 rankings are available for six science areas, including Life, Medical, Natural, Social, and Technical, with an update planned for October 2021. Reputation and Academic Rankings are published separately. The same rankings data are available for each institution, as shown in Appendix C. Russia is second only to the US in number of ranked universities, but 80% of the Russian universities are in the bottom tier.
Data come from Clarivate’s Global Institutional Profiles Project (GIPP), using university-supplied information, surveys, and WOS bibliometrics. 80% of the data are size independent.
Unique Indicator: Financial Stability, which also yields distinctive results.
Top 10 Rankings: All of RUR’s world top ten appear in other rankings; two of its Asian top 10 match two of WURR’s.
Functionality: A user-friendly interface with separate downloadable tables for each indicator, ranking, and country.
Another important factor in evaluating rankings is responsiveness to queries and concerns. I want to thank RUR for the help they provided.
URAP shows academic performance for a large number of universities from a wide range of countries. Founded in 2009 and first online in 2013-2014, the 2020 ranking includes 3,000 universities from 112 countries or regions. The 2021 rankings are scheduled for October. The six bibliometric indicators, drawn from WOS and InCites, are Article, Citation, Total Document, Article Impact Total, Citation Impact Total, and International Collaboration. Most metrics are size dependent. There are no surprises in re-ranking on the different indicators.
In addition to the size of the dataset, 62 up-to-date subject rankings, such as Artificial Intelligence, are a positive contribution of URAP.
Unique Indicators: None; all indicators are based on data from Clarivate.
Top 10 Rankings: One world top 10 university and one Asian top ten university appear only in URAP’s top 10.
Functionality: Capability to re-sort and download the entire dataset.
Thanks to URAP for their assistance.
At first glance, MosIUR is interesting, with a stated purpose of measuring the three core areas of universities: Education, Research, and University and Society. Only 13% of the weighting is bibliometric; the other data come from websites. It includes 1,500 institutions from 97 countries. This ranking is not recommended for any user: there are no underlying data beyond a score. See Orduña-Malea and Pérez-Esparrells (2021) for an in-depth analysis.
Unique Indicators: Most of the indicators are unique
Top 10 Rankings: Although most MosIUR metrics differ from all other metrics, all of the top ten universities are on at least one other ranking list.
Functionality: None, just a list of universities with a score.
In answer to my original questions:
- I did uncover unique indicators, but unique does not translate into relevant. For the researcher, WURR’s Multidisciplinary is more relevant, but it is not clear if or when there will be an update.
- To see whether there are noticeable differences among rankings, see Table 47.2.1 (World) and Table 47.2.2 (Asia) for the top ten rankings. There is a small core of universities that rise to the top regardless of methodology: 32 universities appear in the top ten across 13 rankings.
- My early background was as a Super Searcher (Basch). In drilling down into these rankings, it became obvious to me that functionality is an important aspect of rankings that is not addressed. If decision makers cannot see at least the underlying scores, then a rank is just a number, not information.
Decision makers should look beyond the overall rank and pick metrics from different sources to create their own ranking. The professional rankings are useful for benchmarking and time-series comparisons. For individual institutions that can afford it, the ideal bibliometric approach is to use the raw data from Clarivate or Elsevier and create a ranking of their own for international use.
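To make the "pick metrics from different sources" suggestion concrete, here is a minimal sketch of how an analyst might combine indicators taken from different rankings into a custom composite. The university names, raw scores, and weights below are invented for illustration; real use would substitute licensed data from the rankers or from Clarivate/Elsevier.

```python
# Hypothetical sketch: build a custom composite ranking from metrics
# drawn from different sources. All names and numbers are made up.

def normalize(scores):
    """Min-max normalize {university: raw_score} onto a 0-1 scale,
    so metrics on different scales can be combined."""
    lo, hi = min(scores.values()), max(scores.values())
    if hi == lo:
        return {u: 0.0 for u in scores}
    return {u: (s - lo) / (hi - lo) for u, s in scores.items()}

def composite(metric_tables, weights):
    """Weighted sum of normalized metrics; returns (university, score)
    pairs sorted from best to worst."""
    totals = {}
    for name, table in metric_tables.items():
        for uni, score in normalize(table).items():
            totals[uni] = totals.get(uni, 0.0) + weights[name] * score
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# Illustrative inputs only -- e.g. a research-impact metric from one
# ranking and a financial-stability metric from another.
research_impact = {"Uni A": 92.0, "Uni B": 85.0, "Uni C": 78.0}
financial_stability = {"Uni A": 60.0, "Uni B": 88.0, "Uni C": 70.0}

ranking = composite(
    {"impact": research_impact, "stability": financial_stability},
    weights={"impact": 0.7, "stability": 0.3},  # the decision maker's priorities
)
print(ranking)
```

The point of the sketch is the weights: a teaching-focused institution and a research office would choose different ones, producing different "top" universities from the same underlying data.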
Other rankers will be covered in a later article: Scimago Institutions Rankings and Webometrics Ranking Web of Universities, both of which have updated their metrics to include social metrics, and UniRank, a popular website that ranks universities by web popularity.
Basch, R. (1993). Secrets of the super searchers. 8 Bit Books, pp. 141-149.
Gadd, E., Holmes, R., & Shearer, J. (2021). Developing a method for evaluating global university rankings. Scholarly Assessment Reports, 3(1), https://doi.org/10.29024/sar.31
Holmes, R. (7 Jan 2021). An indisputable ranking scorecard? Not really.
IREG Observatory (2021) Approved international rankings. https://ireg-observatory.org/en/initiatives/ireg-inventory-of-international-rankings/
IREG Observatory (2021). CWUR rankings measure graduate and faculty quality.
IREG Observatory (2021). Eurasian University Rankings (IAAR-EUR) 2021.
Orduña-Malea, E. and Pérez-Esparrells, C. (2021). Moscow International University Ranking: critical review and geopolitical effects. Profesional de la información, 30(2), e300209. https://revista.profesionaldelainformacion.com/index.php/EPI/article/view/84502/62937
Ruth’s Rankings 47 Part 1 (June 2021) Hunting for new university ranking metrics, https://librarylearningspace.com/ruths-rankings-47-hunting-for-new-university-rankings-metrics/
Wan, C.D. (25 May 2021). The role of global rankings in recolonising universities. University World News, https://www.universityworldnews.com/post.php?story=2021052514311953
URLs for Rankings:
WURR: Ruth’s Rankings 47 Part 1: https://librarylearningspace.com/ruths-rankings-47-hunting-for-new-university-rankings-metrics/ with https://librarylearningspace.com/wp-content/uploads/2021/06/APPENDIX-A-WURR.pdf
Updates for other rankings, check Ruth’s Rankings list of articles and news updates https://librarylearningspace.com/list-ruths-rankings/
*Ruth A. Pagell is emeritus faculty librarian at Emory University. After working at Emory, she was the founding librarian of the Li Ka Shing Library at Singapore Management University and then adjunct faculty teaching in the Library and Information Science Program at the University of Hawaii. She has written and spoken extensively on various aspects of librarianship, including contributing articles to ACCESS – https://orcid.org/0000-0003-3238-9674