By Ruth A. Pagell*
(4 December 2017) I am beginning to think of global university rankings as an invasive species. Two years ago, a talk I gave in Singapore attracted an indifferent audience of 25. On this trip, over 200 librarians, faculty, administrators and students attended the talks in person.
- Students wanted to know how to use the rankings to choose a university
- Faculty were torn between how to get more citations and how to focus on teaching
- Administrators were caught in the middle, trying to explain the reality of the rankings to university and national leaders while trying to get faculty to cooperate, for example by registering with name disambiguation tools such as ORCID or ResearcherID
- Librarians wanted to know where they fit in. (Pagell, 25 Jan 2017)
My editor Clive Wing requested that I communicate with his friend in the Linguistics Department at City University of Hong Kong (CityU) about university rankings. This resulted in my road trip to Hong Kong and Singapore. I contacted my colleague Diana Chan at The Hong Kong University of Science and Technology (HKUST) to get the name of the University Librarian at CityU. In addition to providing the name, HKUST sponsored my trip to Hong Kong to talk about Ruth’s Rankings. Learning of the Hong Kong trip, my colleagues in Singapore arranged a similar talk. My presentation to HKUST is streamed here.
The entire work package included the following, all of which expanded my outlook on the topic:
- Ruth’s Rankings 30 – Hong Kong and Singapore – Is Success Sustainable?, containing background information and data.
- Scholarly Communications seminar at HKUST, World University Rankings: How to be Number One, attracting over 80 librarians, faculty, administrators and students.
- Professional development workshop for HKUST library staff, who created their own rankings based on data from InCites and analyzed the CWUR rankings.
- Conversation with HKUST’s Office of Institutional Research, where I learned of the pressure placed on universities by their funding sources to participate in all rankings, even if that means collecting data not normally collected. They also struggle to find ways to get faculty to provide the data that is collected in-house for those funding sources.
- Informal talk with CityU’s Linguistics Department, where we went off on a tangent to discuss the quality of the rankings, asking such questions as: Which is best? Which do we believe? Which is most reliable? We concluded that, given the different metrics and their weightings, results are arbitrary, and we need to focus on the individual metrics that are most relevant (a simple illustration follows this list).
- Seminar jointly sponsored by the Library Association of Singapore and the National University of Singapore, retitled World University Rankings: How to be Ranked. The over 100 attendees had a similar makeup to HKUST’s audience. Singapore’s new university landscape includes two world-class universities plus four more autonomous public universities. I emphasized rankings beyond the popular three of QS, THE and ARWU.
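The tangent about metrics and weightings can be made concrete in a few lines of code. Below is a minimal sketch, assuming three invented universities and two invented indicators (none of this is data or methodology from any actual ranker), showing that the same scores produce a different order when the weights change.

```python
# Minimal sketch with invented universities and 0-100 indicator scores.
# No real ranking data or ranker's methodology is used; the point is only
# that composite rank order depends on the chosen weights.

scores = {
    "University A": {"citations": 90, "teaching": 60},
    "University B": {"citations": 70, "teaching": 85},
    "University C": {"citations": 80, "teaching": 75},
}

def rank(weights):
    """Order the universities by their weighted composite score."""
    composite = {
        name: sum(weights[ind] * value for ind, value in indicators.items())
        for name, indicators in scores.items()
    }
    return sorted(composite, key=composite.get, reverse=True)

# A citation-heavy scheme puts University A first ...
print(rank({"citations": 0.7, "teaching": 0.3}))
# ... while a teaching-heavy scheme puts University B first.
print(rank({"citations": 0.3, "teaching": 0.7}))
```

The toy numbers do not matter; the mechanism does. Every composite ranking embeds editorial judgments in its weights, which is why comparing institutions on the individual indicators that matter to you is usually more informative than comparing overall ranks.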
While I was in Hong Kong, the pro-democracy Apple Daily broke a story on CityU’s under-reporting of its student numbers to QS (and THE), thus improving its faculty-student ratio. The English-language ejinsight (2017) reported the story. An Apple Daily reporter asked me to comment; I declined. The student numbers reported to QS differed from the University Grants Committee’s data.
The discrepancies in the CityU data are not as important as the broader questions they raise about how institutions can manipulate data or results. Also during the trip, the BBC (Coughlan) reported on the UK’s Advertising Standards Authority citing six universities for misleading rankings claims. I fact-checked one. The University of Leicester claimed to be in the top 1% in the world. CWUR reports a “top percent” figure; Leicester ranked 226 of the 1,000 universities listed, which CWUR gives as the top 0.9% in the world. Checking Webometrics, which covers over 26,000 tertiary institutions of ALL kinds, Leicester at 246 was comfortably within the top 1%. See Appendix 31 for the Advertising Standards Authority guidelines.
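For readers who want to repeat this kind of fact-check, the arithmetic is a simple percentage-of-population calculation. The sketch below uses the Webometrics figures quoted above; the total of roughly 26,000 institutions is approximate and changes with each edition, so treat the output as indicative only.

```python
# Rough "top X%" check using the Webometrics figures quoted above.
# The population size (~26,000 tertiary institutions) is approximate.

def top_percent(rank, total_institutions):
    """Return an institution's rank as a percentage of the whole population."""
    return 100 * rank / total_institutions

# About 0.95%, i.e. inside the top 1% of all tertiary institutions.
print(f"Leicester (Webometrics): top {top_percent(246, 26000):.2f}%")
```

As the Leicester example shows, a “top 1%” claim depends entirely on the denominator: measured against CWUR’s published list of 1,000 universities it reads very differently than measured against the full population of tertiary institutions.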
Issues and Answers
Attendees in both venues raised interesting issues.
- The business model of the rankers. I have dusted off my business librarian hat and am researching this for an upcoming article. A related question is whether universities pay to be ranked. Universities may pay Elsevier or Clarivate Analytics for in-depth analysis of their performance, and universities pay QS to be audited in order to receive stars (QS, November 2017).
- The ascendance of NTU and the role of government funding. A recent article from Malaysia (Lim, 2017) highlighted the government’s inconsistent funding of its top universities, reporting the restoration of some of the funding cuts discussed in Ruth’s Rankings 24 on Malaysian higher education.
- Collaboration and “international outlook”. I had read that neither Clarivate Analytics nor SciVal counted collaboration between mainland China and Hong Kong as international. I have since discovered that this is true for Clarivate Analytics but not for SciVal. I will do a follow-up article in 2018 looking at how the different rankers handle collaborations, including the new PISA Volume V ranking (OECD, 2017) on collaborative problem-solving among pre-tertiary students.
- The role of Google Scholar and other open sites, and which publications and citations count. Webometrics uses Google Scholar citations; the other rankings include only publications indexed in WOS or Scopus.
- Who ranks the rankers? IREG (International Ranking Expert Group) Observatory on Academic Ranking and Excellence is the official body providing guidelines for rankers. See the Berlin Principles in Appendix 31. Enserink (2007) presents an interesting analysis of the different rankers, especially ARWU and THE.
- Do the rankers’ methodologies change over time? Most rankings are consistent in their indicators but may change how those indicators are calculated. THE World Rankings had two major changes: in 2011 it broke away from QS, and in 2015-16 it changed its underlying data source from WOS to Scopus. For time-series analysis within one ranking, check the methodology for changes; when comparing your institution across rankers, read each underlying methodology for its purpose and target audience.
- Small liberal arts colleges. These institutions are not excluded because they are “small liberal arts colleges” but because they do not meet the criteria to be ranked. They are included in the Wall Street Journal/Times Higher Education College Rankings, which uses different metrics; undergraduate students interested in studying in the U.S. should use this ranking. Webometrics covers over 26,000 tertiary institutions, arranged geographically, with separate lists of colleges for the U.S.; its metrics include citations in Google Scholar.
Conclusion
I want to thank HKUST, LAS and NUS, and CityU for arranging these talks. The large turnout of librarians, faculty, administrators and students reflects the wide interest in the topic across higher education stakeholders. That mixed audience was as important as knowing how to be number one or how to get ranked: librarians heard the concerns of their users beyond the narrow scope of the classroom, and librarians are the keepers of the Scopus and Web of Science subscriptions. Hopefully the talks opened a door to more collaboration among faculty, administrators and students, with the library taking on the role of providing resources and an understanding of rankings’ metrics. “There is a role for the library in awakening to the power and potential of institutional metrics for research” (Interview, 2014).
I also want to thank Maggie Lam, the Hong Kong representative from Clarivate Analytics, and Sheeren Hanafi, Elsevier’s Head of Marketing for Research Intelligence.
Resources
CityU is said to have submitted inaccurate information to Quacquarelli Symonds to boost its rankings. (13 November 2017). Ejinsight on the pulse, accessed at http://www.ejinsight.com/20171113-cityu-said-to-have-shrunk-student-numbers-to-boost-qs-rankings/.
Coughlan, Sean. (15 November 2017). Six universities told to change advertising claims. BBC News, accessed at http://www.bbc.com/news/education-41984465.
Coughlan, Sean. (15 November 2017). Why can’t a university claim to be in the top 1%? BBC News, accessed at http://www.bbc.com/news/education-41996973.
Elsevier Research Intelligence (March 2015). Usage Guidebook Version 1.01 accessed at https://www.elsevier.com/data/assets/pdf_file/0007/53494/ERI-Usage-Guidebook-1.01-March-2015.pdf.
Enserink, Martin. (24 August 2007). Who ranks the university rankers? Science 317(5841), 1026-1028. DOI: 10.1126/science.317.5841.1026.
Interview with Prema Arasu: A role for the library in awakening to the power and potential of institutional metrics for research. (4 December 2014). Library Connect accessed at https://libraryconnect.elsevier.com/articles/role-library-awakening-power-and-potential-institutional-metrics-research.
Lim, Ida. (28 October 2017). After 2017’s 20pc cut, public universities’ operating budget goes up. Malay Mail Online, accessed at http://www.themalaymailonline.com/malaysia/article/after-2017s-20pc-cut-public-universities-operating-budget-goes-up#ig3qfUamBi90Ue7O.97.
Pagell, Ruth A. (25 January 2017). Ruth’s Rankings 23: Are Global Higher Education Rankings Flawed or Misunderstood? Access eNewsletter, accessed at https://librarylearningspace.com/ruths-rankings-23-global-higher-education-rankings-flawed-misunderstood-personal-critique/. This article includes some recommendations.
QS Stars, explained in emails from Shiloh Rose and Jason Newman (17 October 2017 and 14 November 2017). Universities pay about US$41,000 for a QS audit. They may publish all or none of their results on TopUniversities.com. If they wish to use any of the badges to advertise their results they must agree to publish ALL results. “Essentially, we want to prevent universities from advertising their best results while hiding areas where they performed poorly.”
Ruth’s Rankings
A list of Ruth’s Rankings and News Updates is here.
*Ruth A. Pagell is emeritus faculty librarian at Emory University. After working at Emory, she was the founding librarian of the Li Ka Shing Library at Singapore Management University and then adjunct faculty teaching in the Library and Information Science Program at the University of Hawaii. She has written and spoken extensively on various aspects of librarianship, including contributing articles to ACCESS – https://orcid.org/0000-0003-3238-9674.