Ruth’s Rankings 40: Deconstructing QS Subjects and Surveys

By Ruth A. Pagell*

(12 April 2019) February marked the end of QS's 2019 ranking cycle with the release of its subject rankings. This article reviews the results dated 2019.

I had planned a short news flash updating the subject rankings, but email requests led me to look at the QS Academic and Employer surveys and the Graduate Employability rankings. My conclusion, stated up front, is to take the time to analyze carefully the tables in this article and online. Since we are not privy to the underlying data, it is important to:

  • Look at the scores and their differentials. The top 10 scores for Academic Reputation are separated by 0.1 points.
  • Re-rank according to metrics that are important to you.
  • Check the methodology for indicator definitions. QS uses three different bibliometric indicators: citations per faculty, citations per paper, and papers per faculty.
  • Check the methodology for indicator weightings, which are recalibrated by ranking, as QS notes in the quote below:

QS quote

QS SUBJECT RANKINGS 2019

The QS Subject Rankings 2019 cover five broad categories, 48 subjects and 78 locations, with 1,222 institutions drawn from a survey of 112,500 departments; 160 institutions are listed for the first time. U.S. universities hold 28 of the number-one positions, followed by 13 for the U.K. No Asian university holds a number-one position. Top-ranked schools dominate most top subject lists, while new names appear in niche subject areas.

See Table 40.1 for individual Asia/Pac country representation. Fifteen Asia/Pac countries have at least one school ranked in a major subject category, and ten countries have schools ranked in the top 100. Japan has the most universities in the QS World rankings, China has the most in the subject rankings, and Australia leads with universities in the top 100.

Table_40.1._Asia-Pac_countries_in_subjects

The QS news release lists the top school in each of the 48 subjects. The methodology uses four indicators: the Academic and Employer reputation surveys (discussed below), research citations per paper over a five-year period, and H-index. Weightings vary by subject, and niche subjects rely heavily on the surveys: Performing Arts uses 100% surveys, while in Dentistry surveys count for 40%. See Table 40.2 (in pdf) for indicators and top universities for the five categories and selected subjects.

Example 40.1: Value Add – Rank it yourself: Download the spreadsheet with data for all 48 subjects. For example, my field is Library Science, with QS weightings of 70% for Academic reputation, 5% for Employer reputation, 15% for Citations and 10% for H-index. University of British Columbia is number one. I can re-rank on an individual metric, or change the weightings to better reflect my institution's priorities or to get a better overall score. Click here for the Library & Information Management spreadsheet with my rankings and the U.S. News rankings. Change the weights and re-rank for yourself, or try this with another subject.
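The re-ranking in Example 40.1 is a weighted sum that anyone can reproduce in a spreadsheet or a few lines of code. The sketch below uses QS's published Library & Information Management weights, but the three universities and their indicator scores are entirely hypothetical; QS's downloadable spreadsheet supplies the real values.

```python
# Sketch of QS-style re-ranking with custom weights.
# The institutions and indicator scores below are hypothetical;
# only the QS weights (70/5/15/10) come from the published methodology.

def weighted_score(scores, weights):
    """Combine 0-100 indicator scores using fractional weights."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(scores[k] * w for k, w in weights.items())

# QS's published Library & Information Management weights
qs_weights = {"academic": 0.70, "employer": 0.05,
              "citations": 0.15, "h_index": 0.10}

# Hypothetical indicator scores for three institutions
data = {
    "University A": {"academic": 95, "employer": 80, "citations": 70, "h_index": 75},
    "University B": {"academic": 85, "employer": 90, "citations": 95, "h_index": 90},
    "University C": {"academic": 90, "employer": 75, "citations": 85, "h_index": 80},
}

# Alternative weights that favor citation performance instead of reputation
my_weights = {"academic": 0.30, "employer": 0.10,
              "citations": 0.40, "h_index": 0.20}

for label, w in (("QS weights:", qs_weights), ("My weights:", my_weights)):
    ranked = sorted(data, key=lambda u: weighted_score(data[u], w), reverse=True)
    print(label, ranked)
```

With these made-up scores, University A leads under QS's reputation-heavy weights, while University B leads once citations carry 40% — the point of the exercise: changing weights changes rankings.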

QS Subject Summary and Observations: According to QS, students say that subject rankings are the most useful rankings for them. I have recommended subject rankings for universities that are not strong in highly cited categories. Since QS uses not only different weightings but also subject-specific reputation scores, lesser-known universities appear on subject-specific lists.

REPUTATION

Reputation surveys play a major role in both the QS and THE rankings. A new organization, the World 100 Reputation Network, describes itself as "Managing the reputations of the world's leading universities." It currently has 51 members from 12 countries: 32 members are European (18 from the UK), 10 are from Asia/Pac, seven are from Canada and two from the U.S.

QS reputation surveys

Ruth’s Rankings 23 looked at QS surveys. Ruth’s Rankings 27 compared THE’s and QS’ Academic reputation surveys. Researchers have expressed concern about the surveys (Marginson, 2014; Jaschik, 2018). QS bases 40% of its world rankings on the Academic Reputation survey and another 10% on the Employer survey. The surveys are used in all of the QS rankings; Table 40.3 shows the weightings by ranking.

Table_40.3

I have not tracked the reputation surveys on a regular basis. An email I received in January (Sample 40.1, in pdf) along with the variations of weighting in the Subject surveys led me back down the reputation path.

Academic reputation survey

QS first compiled its Academic Reputation Survey in 2004. For background on the survey and how QS promotes it, click on Summary information.

QS published QS Academic Reputation Insights: 2019 Regional Breakdown in February, with reports for Europe, Latin America, the Middle East and Africa, and Asia and the Pacific. I read QS Analytics: Academic Reputation: Spotlight on Asia and the Pacific.

Australia, China and Japan have the most nominations; they also have the most ranked universities in the region. For the past eight years, the University of Tokyo’s Academic Reputation rank has held steady at seven in the world despite the ups and downs of its overall ranking, which has ranged from a low of 39 to a high of 23.

Employer reputation survey

National rankings often include an institution’s graduation rate. Business school rankings have metrics on time to get a job and salaries. This QS survey and its companion Graduate Employability rankings are unique. They may not be of interest to scholars, but they are important for students and university recruiters.

The Employer survey began in 1990 as part of the MBA rankings and in 2005 became part of the world rankings. Employers are asked to list up to ten domestic institutions they consider best for recruitment and 30 institutions from the region(s) with which they have expressed familiarity. To learn more about the survey, click Summary information.

Conclusion for Reputation Surveys

A survey of professional school deans ranking the top five in their fields was published in 1974 (Blau and Margulies, 1974) to much criticism. That approach has morphed into today’s reputation surveys. Academics and employers from around the world, selected from lists and recommendations, get to vote in the QS surveys. Is the voting based on performance or image? I have expressed my skepticism about the relationship between reputation surveys and reality, since these are opinion indicators. However, there is a high correlation (r=.8) between the QS Academic Reputation survey and the QS university rankings. This is not surprising, since the reputation survey has the highest weighting of all indicators. See Table 40.4 (in pdf): 2019 Asian Universities Rankings for Reputations and Research.
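A correlation like the r=.8 cited above is straightforward to check when two score columns are available. The sketch below computes a Pearson correlation coefficient from scratch; the six reputation and overall scores are hypothetical, invented only to show the mechanics, not QS's actual data.

```python
# Sketch: Pearson correlation between reputation scores and overall
# scores. All data points here are hypothetical illustrations.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical reputation vs. overall scores for six universities
reputation     = [100, 92, 88, 75, 60, 55]
overall_scores = [ 98, 90, 85, 80, 62, 50]
print(round(pearson(reputation, overall_scores), 2))
```

When the reputation indicator itself carries the largest weight inside the overall score, a high correlation between the two is partly built in, which is exactly the caveat raised above.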

GRADUATE EMPLOYABILITY RANKINGS

QS started this ranking in 2015 after concluding that the standard measurement, graduate employment rate, did not work for international rankings. The 2019 edition examined 650 universities and ranked 500, with 41 new entries; 42,000 employers participated in the survey. Topping the list is MIT, also number one in the QS World rankings; number one in Asia/Pac is the University of Sydney. Over a quarter of the institutions are from Asia/Pac: China leads with 26, followed by Australia with 18, South Korea with 16, Japan with 14 and India with 13. I also received a request to fill in this survey. Click here to see instructions.

Methodology

The indicators in this ranking are listed below, with the first-ranked university in the world and in Asia/Pac. 'Partnerships with employers per faculty' has a bibliometric component. See Table 40.5 (in pdf) for the top ten universities for each indicator and for expanded indicator definitions.

  • Employer reputation – 30%: Cambridge is first; Peking is the Asia/Pac leader at 12.
  • Alumni outcomes (“high achievers”) – 25%: Cambridge is first; Seoul National University is the Asia/Pac leader at 15.
  • Partnerships with employers per faculty – 25%: uses Scopus data to identify faculty/corporate publications and work-placement partnerships. Stanford is first, followed by Zhejiang.
  • Employer/Student connections – 10%: counts individual employers who have been on campus in the past 12 months. Six of the top 10 are from China, with Huazhong University of Science and Technology number one.
  • Graduate employment rate – 10%: job-seeking students employed within 12 months. Moscow State Institute of International Relations is first; Taylor’s (MY) is fourth.
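The five indicators above combine into one overall Graduate Employability score as a weighted sum. The sketch below uses the published 30/25/25/10/10 weights; the example institution's indicator scores are hypothetical.

```python
# Sketch: combining the five Graduate Employability indicators into an
# overall score. Only the weights come from the published methodology;
# the example institution's scores are hypothetical.

WEIGHTS = {
    "employer_reputation":          0.30,
    "alumni_outcomes":              0.25,
    "partnerships_per_faculty":     0.25,
    "employer_student_connections": 0.10,
    "graduate_employment_rate":     0.10,
}

def overall(indicator_scores):
    """Weighted sum of 0-100 indicator scores."""
    return sum(WEIGHTS[name] * score for name, score in indicator_scores.items())

# A hypothetical institution: strong on reputation, weaker on connections
example = {
    "employer_reputation":          100.0,
    "alumni_outcomes":               95.0,
    "partnerships_per_faculty":      90.0,
    "employer_student_connections":  80.0,
    "graduate_employment_rate":      85.0,
}
print(overall(example))  # 92.75
```

Because reputation and alumni outcomes together carry 55% of the weight, two survey-driven indicators dominate this ranking much as they do the world rankings.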

CONCLUSION

QS and THE continue to introduce new rankings and universities continue to search for more ways to promote themselves. There is a positive relationship between academic reputation and overall ranking. Since we do not see the underlying data, we do not know the real differences among groups of schools.  Changes in weightings change rankings.

The new rankings expand the definition of what makes a top global university and move us further away from our roots in bibliometrics. Other countries are slowly chipping away at U.S. leadership, China continues to rise, Japan continues to stagnate, and we all seem to become more addicted to rankings.

The next new rankings, THE’s Innovation and Impact, should be released in April.

References

Blau, P.M. & Margulies (1974). A research replication: The reputations of American professional schools. Change, 6(10), 42-47. DOI: https://doi.org/10.1080/00091383.1974.10568796 (first page free; full text on JSTOR with subscription)

Jaschik, S. (19 Feb 2018). Who votes in QS rankings? People who are not academics are invited to vote; QS says it weeds them out after they do so. Inside Higher Ed, accessed at https://www.insidehighered.com/admissions/article/2018/02/19/qs-admits-people-who-dont-fit-criteria-vote-its-rankings

Marginson, S. (2014). University rankings and social science. European Journal of Education, 49(1), see page 8. Accessed at: http://repositorio.minedu.gob.pe/bitstream/handle/123456789/2905/University%20Rankings%20and%20Social%20Science.pdf?sequence=1

Ruth’s Rankings

  1. Introduction: Unwinding the Web of International Research Rankings
  2. A Brief History of Rankings and Higher Education Policy
  3. Bibliometrics: What We Count and How We Count
  4. The Big Two: Thomson Reuters and Scopus
  5. Comparing Times Higher Education (THE) and QS Rankings
  6. Scholarly Rankings from the Asian Perspective 
  7. Asian Institutions Grow in Nature
  8. Something for Everyone
  9. Expanding the Measurement of Science: From Citations to Web Visibility to Tweets
  10. Do-It-Yourself Rankings with InCites 
  11. U S News & World Report Goes Global
  12. U-Multirank: Is it for “U”?
  13. A Look Back Before We Move Forward
  14. SciVal – Elsevier’s research intelligence – Mastering your metrics
  15. Analyzing 2015-2016 Updated Rankings and Introducing New Metrics
  16. The much maligned Journal Impact Factor
  17. Wikipedia and Google Scholar as Sources for University Rankings – Influence and popularity and open bibliometrics
  18. Rankings from Down Under – Australia and New Zealand
  19. Rankings from Down Under Part 2: Drilling Down to Australian and New Zealand Subject Categories
  20. World Class Universities and the New Flagship University: Reaching for the Rankings or Remodeling for Relevance
  21. Flagship Universities in Asia: From Bibliometrics to Econometrics and Social Indicators
  22. Indian University Rankings – The Good the Bad and the Inconsistent
  23. Are Global Higher Education Rankings Flawed or Misunderstood?  A Personal Critique
  24. Malaysia Higher Education – “Soaring Upward” or Not?
  25. THE Young University Rankings 2017 – Generational rankings and tips for success
  26. March Madness – The rankings of U.S. universities and their sports
  27. Reputation, Rankings and Reality: Times Higher Education rolls out 2017 Reputation Rankings
  28. Japanese Universities:  Is the sun setting on Japanese higher education?
  29. From Bibliometrics to Geopolitics:  An Overview of Global Rankings and the Geopolitics of Higher Education edited by Ellen Hazelkorn
  30. Hong Kong and Singapore: Is Success Sustainable?
  31. Road Trip to Hong Kong and Singapore – Opening new routes for collaboration between librarians and their stakeholders
  32. The Business of Rankings – Show me the money
  33. Authors:  Part 1 – People and processes
  34. Authors: Part 2 – Who are you?
  35. Come together:  May updates lead to an investigation of Collaboration 
  36. Innovation, Automation, and Technology Part 1: From Scholarly Articles to Patents; Innovation, Automation, and Technology Part 2: Innovative Companies and Countries
  37. How Important are Journal Quality Metrics in the Era of Predatory Journals? Part 1: Journal Citation Metrics; Part 2: How Important are Journal Quality Metrics in the Era of Potential/possible/probable predatory publishers and publications?
  38. Coming Attractions: The UN Sustainable Development Goals and Times Higher Education Innovation and Impact Rankings Demystified
  39. Business School Rankings: Monkey Business for an Asia/Pac audience
  40. Deconstructing QS Subjects and Surveys


*Ruth A. Pagell is emeritus faculty librarian at Emory University.  After working at Emory, she was the founding librarian of the Li Ka Shing Library at Singapore Management University and then adjunct faculty [teaching] in the Library and Information Science Program at the University of Hawaii.  She has written and spoken extensively on various aspects of librarianship, including contributing articles to ACCESS – https://orcid.org/0000-0003-3238-9674