PART ONE: The need for new metrics and an introduction to World University Research Rankings (WURR) with its new metric
PART TWO: Overviews of IREG approved rankings not covered by Ruth’s Rankings: URAP, CWUR, UniRank, and rankings with changes, emphasizing unique metrics, outcomes, and comparisons (coming soon)
(31 May 2021) Everyone loves rankings, whether they are for the best Indian restaurant in Singapore, the most bike-friendly city in the U.S., the best beach in Asia, or the top university in the world. Each ranking has its methodology and its supporters and detractors. Ranks by “experts” or from data differ from rankings by the public.
What does this have to do with our new and existing university rankings? Rankings of universities or individual departments go back to the end of the 19th century (Pagell, 2014). Undergrads and their parents, graduate students, junior or senior researchers, university administrators, and government higher education departments all use rankings for evaluating a university as a place to study or work, and for bragging rights, benchmarking, and policy decisions. Today’s rankings grew out of the need for evaluating universities on teaching and research as higher education became an international industry. Number ones will differ among rankings. The question should not be which is right or wrong, but which best fits the needs of the different user groups.
A January webinar sponsored by CIHE, Boston College’s Center for International Higher Education, asked the question “Are rankings still fit for purpose?” Ellen Hazelkorn and Alex Usher, experts in university rankings and higher education, were the presenters.
Ellen Hazelkorn emphasized geopolitical issues such as the decline of the US and Europe, increasing inequalities, worldwide changes, shifts in the labor market, nationalism, and the effects of COVID. She pointed out the limited number of universities covered in most rankings and reiterated concerns about the Journal Impact Factor and the overemphasis on research. She recognized a need to measure teaching (Hazelkorn, 2017, 2019; Pagell, 2019).
Alex Usher focused on the rankings themselves and said that the international rankings were fit for the purposes for which they were designed almost 20 years ago. He praised THE for tackling the SDGs, which is still a work in progress. He noted that purpose has moved on, but metrics have not kept up and may never catch up. He ended by emphasizing the importance of data. Another takeaway from the discussion is that 21st-century rankings are still based on the 20th-century roles of the university.
IREG Observatory on Academic Ranking and Excellence lists the rankings meeting its standards and has published a second edition of detailed information on each ranking in book format (Siwinski, Holmes, and Kopanska). IREG is also rethinking the rankings and creating a new tool to rate the rankers, based on strengths and weaknesses across four criteria, including measuring what matters. Only six rankers are included in this pilot: THE, QS, ARWU, US News Global, Leiden, and U-Multirank. “The rankers did not perform well with regard to measuring what matters although U-Multirank and Leiden Ranking did fairly well for measuring against missions” (IREG Oct 2020). Adding my personal note, U-Multirank might be measuring what matters, but it has limited impact since most universities outside Europe do not supply the university data.
Hazelkorn and Usher are not the only ones concerned about current rankings and higher education. While Hazelkorn is interested in metrics beyond research, much of the discussion still focuses on research topics.
Covid’s impact on higher education led to THE’s survey of six academics about reforms needed in higher education. One respondent discussed the increased use of social media and asked how it would be measured to evaluate new media scholarship (Chambers).
Another noticeable change is more emphasis on collaboration rather than internationalisation. Several rankers include International Collaboration, but it carries real weight only in CWTS Leiden, which also includes local (<100 km) and industry collaborations.
Ruth’s Rankings 35 covers CWTS Leiden’s Collaboration Indicators. Baker highlights the value of collaboration among universities in close proximity to each other, for example Harvard and MIT.
THE hosted a conference in South Africa highlighting the importance of collaboration among African universities to avoid dominance of the rankings by South African institutions. It recognized the need for institutions to reimagine their role in African societies (McKie). Other recent reports on collaboration include an example of Japanese and South Korean scientists working together on a project. Paterson takes the need for scientific collaboration up a level to greater collaboration among nations. According to Fuyuno, “Despite political tensions, research ties are resilient.”
WORLD UNIVERSITY RESEARCH RANKINGS – WURR
WURR fits into the discussions on changing roles of universities at a time when the issues facing the world cannot be compartmentalized into single categories or locations. Although it has not yet been approved by IREG, it was reviewed by IREG (Sep 2020). The ranking is also in response to the ongoing concerns about current ranking methodologies, especially the use of surveys for measuring quality.
Researchers at Singapore University of Technology and Design, in collaboration with World Scientific Publishing (Singapore), released WURR in late 2020. According to the developers, “It is timely to relook at how universities’ research performance are assessed with respect to how they are able to harness the synergies created from bringing together multi-disciplinary and collaborative teams to tackle the research problems in addition to sustaining research impact” (Yeo, Zhang, and Chang, p. 3).
Multidisciplinarity is the unique aspect of this ranking. The concept has a variety of definitions (NC State). WURR defines a publication as multidisciplinary when it is classified in two or more categories or fields using the OECD schema. See Appendix 47.A for more detail.
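To make the definition concrete, here is a minimal sketch of counting a multidisciplinary share under WURR's stated rule (two or more OECD fields per publication). The paper titles and field assignments below are illustrative, not WURR's actual data:

```python
# A publication counts as multidisciplinary under WURR's definition when
# it carries two or more OECD field classifications.
# All titles and field labels here are hypothetical examples.
publications = [
    {"title": "Paper 1", "oecd_fields": {"Natural Sciences"}},
    {"title": "Paper 2", "oecd_fields": {"Engineering", "Medical Sciences"}},
    {"title": "Paper 3", "oecd_fields": {"Social Sciences", "Humanities", "Natural Sciences"}},
]

# Keep only publications classified into two or more fields.
multidisciplinary = [p for p in publications if len(p["oecd_fields"]) >= 2]

share = len(multidisciplinary) / len(publications)
print(f"Multidisciplinary share: {share:.0%}")  # → 67% for this toy set
```

Because the share is a percentage of a university's own output, the measure stays size independent, in line with the rest of WURR's metrics.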
Italian researchers, led by Abramo and D’Angelo (2018, 2021), have been studying multi- and inter-disciplinary publications. Elsevier’s searches for the UN’s Sustainable Development Goals (SDGs), developed for THE’s Impact Rankings, include search terms from multiple disciplines (Jayabalasingham, et al.). Elsevier’s SciVal provides the capability to use one of its multidisciplinary topics or define one of your own. SciVal data include the percent in different subject categories and types of collaborations, as shown in Example 47.1.
The dataset is the top 250 universities in QS’ 2020 World University Rankings, chosen to demonstrate the difference in ranks for the same set of institutions using the WURR methodology. Table 47.2a below compares the overall ranks of the top five in both rankings. See Table 47.2b for the top 20.
WURR has seven metrics grouped into three components: Research Impact, Collaborative-ness, and the new Multidisciplinary Research. See Appendix 47.A for a complete list of metrics.
WURR uses only third-party data derived from Clarivate, covering a ten-year period, 2009-2018. The seven indicators are weighted equally at 1/7 each, so Collaborative-ness and Multidisciplinary Research are each worth a little less than 30% and Research Impact a little more than 40%. All the indicators are size independent, which addresses a common criticism of existing rankings: size-dependent metrics let larger universities, with more publications and citations, rise to the top.
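The component shares follow directly from the equal 1/7 indicator weights. A minimal sketch of the arithmetic, assuming (as the stated percentages imply) that Research Impact holds three of the seven indicators and the other two components hold two each:

```python
from fractions import Fraction

# Each of WURR's seven indicators is weighted equally at 1/7.
# The indicator counts per component below are inferred from the text's
# "a little more than 40%" (3/7) vs "a little less than 30%" (2/7) shares.
INDICATOR_COUNTS = {
    "Research Impact": 3,
    "Collaborative-ness": 2,
    "Multidisciplinary Research": 2,
}

weight_per_indicator = Fraction(1, 7)

# A component's share of the composite score is just its indicator count
# times the common per-indicator weight.
for component, n in INDICATOR_COUNTS.items():
    share = n * weight_per_indicator
    print(f"{component}: {share} ≈ {float(share):.1%}")
# Research Impact: 3/7 ≈ 42.9%
# Collaborative-ness: 2/7 ≈ 28.6%
# Multidisciplinary Research: 2/7 ≈ 28.6%
```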
Which university is the most impactful for international collaboration?
- University A: 24,000 international collaboration documents; 83% of all documents; rank 3
- University B: 128,000 international collaboration documents; 39% of all documents; rank 84
From InCites, 5 May 2021, using WURR’s time period of 2009-2018.
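The reversal in the example above can be reproduced with a short sketch: a size-dependent metric ranks by raw counts, a size-independent one ranks by share of output. The numbers come from the InCites example; the university names remain anonymized placeholders:

```python
# Numbers from the InCites example above (2009-2018 time period).
universities = [
    {"name": "University A", "intl_docs": 24_000, "intl_share": 0.83},
    {"name": "University B", "intl_docs": 128_000, "intl_share": 0.39},
]

# Size-dependent view: raw count of internationally collaborative documents.
by_count = sorted(universities, key=lambda u: u["intl_docs"], reverse=True)

# Size-independent view: international collaborations as a share of all output.
by_share = sorted(universities, key=lambda u: u["intl_share"], reverse=True)

print("Size-dependent (raw count):", [u["name"] for u in by_count])
print("Size-independent (share):  ", [u["name"] for u in by_share])
```

University B leads on raw counts while University A leads on share, which is exactly why WURR's all-size-independent metrics can reorder a ranking built on volume.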
Research Collaborative-ness: WURR’s Collaborative-ness scores are calculated with equal weightings for international and industry collaboration. I decided to drill down into the Leiden rankings for more detail. Leiden has no composite ranking, only individual rankings for the two categories. Only 43 universities out of 1171 had over 10% industry collaboration, while only 13 had less than 10% international collaboration (CWTS Leiden 2020 rankings). Given the discrepancy in the underlying data, the equal weighting of the two categories is questionable.
The regional distribution of the dataset includes 108 universities from Europe, followed by North America with 65 and Asia with 51. Seven European universities are in the top 10 overall; five Asia-Pacific universities are in the top 10 for Multidisciplinary Research; all 10 in Research Impact are from the US; and all 10 in International Collaborative-ness are from Europe. See Table 47.3 for the top 20 for each metric. In most rankings, top universities tend to rank at the top across all metrics; as Table 47.3 shows, that is not the case for WURR. Each of the three categories captures a different aspect of a university’s performance. Advantages and disadvantages are in Appendix 47.A.
WURR adds a new metric to the rankings toolbox. The complexity of 21st-century issues requires research bringing together multiple disciplines. The first limited edition of WURR has demonstrated that its set of metrics produces results that differ not only from QS but from other traditional rankings as well. A second iteration, with 500 universities, is in the planning stage. There is no information yet on how universities will be selected or what, if any, new metrics will be added.
Abramo, G., D’Angelo, C.A., & Costa, F. (July 2018). The effect of multidisciplinary collaborations on research diversification. Scientometrics, https://doi.org/10.1007/s11192-018-2746-2
Abramo, G., D’Angelo, C.A., & Zhang, L. (March 2021). A comparison of two approaches for measuring interdisciplinary research output: the disciplinary diversity of authors vs the disciplinary diversity of the reference list. https://arxiv.org/abs/2103.14856
Baker, S. (25 March 2021). Universities as neighbours: close collaborators or fierce rivals? Some of the world’s leading universities appear to benefit from having another top-ranked institution in their backyard. Times Higher Education, accessed at https://www.timeshighereducation.com/news/universities-neighbours-close-collaborators-or-fierce-rivals
Chambers, C. et al. (18 Feb 2021). Moving mountains: The reforms that would push academia to new heights. Times Higher Education, accessed at https://www.timeshighereducation.com/features/moving-mountains-reforms-would-push-academia-new-heights
CIHE, Center for International Higher Education, Boston College (25 Jan 2021). Are rankings still fit for purpose? Accessed at https://www.youtube.com/watch?v=ZyGjRdEht00
Fuyuno, I. (18 March 2021). Japan and South Korea pursue shared interests: Despite political tensions, research ties are resilient. Nature index News Blog, accessed at https://www.natureindex.com/news-blog/japan-and-south-korea-pursue-shared-research-science-interests
Hazelkorn, E., editor. (2017). Global rankings and the geopolitics of higher education.: Understanding the influence and impact of rankings on higher education policy and society. Routledge (London and New York).
Hazelkorn, E. (19 Sep 2019). The ‘best universities’ in the world: Can global university systems identify quality education? WENR, posted by E. Roach at https://wenr.wes.org/2019/09/the-best-universities-in-the-world-can-global-university-ranking-systems-identify-quality-education/
IREG (22 Sep 2020). World University Research Rankings: Europe Excels for multidisciplinary and collaborative research. IREG Observatory Ranking News accessed at https://ireg-observatory.org/en/bez-kategorii/world-university-research-rankings-europe-excels-for-multidisciplinary-and-collaborative-research/
IREG (27 Oct 2020). Rethinking the rankings: The development of a tool for rating rankings. https://ireg-observatory.org/en/ranking-news/rethinking-the-rankings-the-development-of-a-tool-for-rating-rankings/
Jayabalasingham, B. et al. (2019). “Identifying research supporting the United Nations Sustainable Development Goals”, Mendeley Data, V1, doi: 10.17632/87txkw7khs.1, or accessed at https://data.mendeley.com/datasets/87txkw7khs/1
McKie, A. (11 Mar 2021). Collaboration ‘key’ for African universities to share excellence. Times Higher Education Rankings accessed at https://www.timeshighereducation.com/news/collaboration-key-african-universities-share-excellence
NC State Research Development Office (Aug 2020). The difference between [sic] multidisciplinary, interdisciplinary, and convergent. https://research.ncsu.edu/rdo/2020/08/the-difference-between-multidisciplinary-interdisciplinary-and-convergence-research/
Pagell, R. (8 August 2014). Ruth’s Rankings 2: A brief history of rankings and higher education policy, accessed at https://librarylearningspace.com/ruths-rankings-2-brief-history-rankings-higher-education-policy/
Pagell, R. (1 Sep 2019). Ruth’s Rankings 29: From bibliometrics to geopolitics: An overview of global rankings and the geopolitics of higher education, edited by Ellen Hazelkorn, https://librarylearningspace.com/ruths-rankings-29-bibliometrics-geopolitics-overview-global-rankings-geopolitics-higher-education-edited-ellen-hazelkorn/
Pagell, R. (11 Jun 2018). Ruth’s Rankings 35: Come together: May updates lead to an investigation of collaboration https://librarylearningspace.com/ruths-rankings-35-come-together-may-updates-lead-investigation-collaboration/
Paterson, M. (2 April 2021). Global vaccine divide – More science collaboration needed. University World News Global Edition accessed at https://www.universityworldnews.com/post.php?story=20210402151906225
Siwinski, W., Holmes, R., & Kopanska, J. (18 March 2021). Inventory of international rankings. https://ireg-observatory.org/en/wp-content/uploads/2021/03/IREG-Inventory-2021-final-report-2021-03-19.pdf
University Rankings Guide (12 Feb 2021). A closer look for research leaders. Elsevier, accessed at https://www.elsevier.com/__data/assets/pdf_file/0017/1113920/university-rankings-guide.pdf
World University Research Rankings Launch (Sep 2020), accessed at https://www.youtube.com/watch?v=y9wwsrgc_6o
Yeo, K.S., Yang, Z., & Chan, A. (2021). Research assessment framework for global universities 2020. World Scientific Publishing, Singapore. https://worldscientific.com/worldscibooks/10.1142/1207 (book must be purchased)
A list of Ruth’s Rankings and News Updates is here.
*Ruth A. Pagell is emeritus faculty librarian at Emory University. After working at Emory, she was the founding librarian of the Li Ka Shing Library at Singapore Management University and then adjunct faculty [teaching] in the Library and Information Science Program at the University of Hawaii. She has written and spoken extensively on various aspects of librarianship, including contributing articles to ACCESS – https://orcid.org/0000-0003-3238-9674