Ruth’s Rankings 5. Comparing Times Higher Education (THE) and QS Rankings

By Ruth A. Pagell*

(26 November 2014) Which university is #1 in the world in 2014/2015?  Massachusetts Institute of Technology?  California Institute of Technology?  Harvard University? Cambridge University? Princeton?

In Asia? University of Tokyo?  National University of Singapore?  University of Hong Kong? 1

Read on and find out who the top universities are and why they are tops.

Ruth’s Rankings 1 to 4 introduced institutional research rankings.  We focused on the importance of policy on country and institutional performance; the key bibliometrics used in the rankings; and the two companies that provide the bibliometric data, Thomson Reuters and Elsevier. This month we drill down into the rankings themselves, starting with the two widely publicized commercial rankings: THE (Times Higher Education) World University Rankings, using Thomson Reuters data, and QS (Quacquarelli Symonds) Top World Universities, using data from Elsevier’s Scopus. THE and QS differ from most of the other rankings we will examine, as parents and students have always been a key part of their target audiences.

The History of THE and QS Rankings

Shanghai Jiao Tong University initiated international rankings of universities with Academic Ranking of World Universities in 2003, based completely on research indicators. We will examine ARWU and other ranking services in following articles.

Also in 2003, Lambert’s Review of Business-University Collaboration in the UK recommended the compilation of international university rankings.  By November 2004, The Times Higher Education Supplement (“US dominates worldwide league tables”, 5 November, No. 1665) released The Times Higher-QS World University Rankings to sample the views of academics across the world. The first rankings surveyed “1,300 academics in 88 countries, spanning the academic disciplines.”  The methodology also included the number of citations for research papers, staff-to-student ratios and the number of students and staff recruited from overseas.


From 2004 to 2009, QS collected and compiled the data, creating the rankings tables that the Times Higher Education Supplement printed.  For the first three years, QS used data from Thomson Reuters’ “Essential Science Indicators”.  In 2007, the methodology changed: 2 SCOPUS data, with a larger publications base, replaced ESI. 3  Following the 2009 rankings, THE split with QS, returned to Thomson Reuters data and began working closely with Thomson Reuters on the Global Institutional Profiles Project (GIPP) to improve individual institutional profile data.  Today both THE and QS release their own rankings with their own methodologies. Some commonalities between the methodologies still exist, as shown below.  Both use scoring systems rather than presenting the underlying data.

In late November 2014, THE announced revisions to its methodology, starting with the 2015/2016 World Rankings.  Institutional data will be brought in-house, meaning that GIPP will no longer be used, and all research data will once again be drawn from Elsevier’s SCOPUS database.


Comparison of World Ranking Methodologies



QS (Quacquarelli Symonds) is a British consultancy that is a “global provider of specialist higher education and careers information and solutions.”  The QS World University Rankings® aims “to help students make informed comparisons between their international study options.”

QS bases 50% of its rankings on two surveys: a survey of over 63,000 academics worldwide, spanning a variety of disciplines, combined with the opinions of about 28,000 international graduate employers.  See Table 1. QS Methodology for a complete list of QS metrics and their weightings.
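The way a ranking service combines weighted indicators into a single score can be sketched as a simple weighted sum. The weights and institution values below are hypothetical placeholders for illustration, not the actual QS figures:

```python
# Illustrative sketch of a weighted composite score, the general
# technique behind rankings like QS and THE.
# All weights and indicator values here are hypothetical.

def composite_score(indicators: dict, weights: dict) -> float:
    """Weighted sum of 0-100 indicator scores."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(indicators[name] * w for name, w in weights.items())

# Hypothetical weights loosely modeled on a survey-heavy scheme
weights = {
    "academic_survey": 0.40,
    "employer_survey": 0.10,
    "citations_per_faculty": 0.20,
    "faculty_student_ratio": 0.20,
    "international_mix": 0.10,
}

# Hypothetical indicator scores for one university
university = {
    "academic_survey": 92.0,
    "employer_survey": 85.0,
    "citations_per_faculty": 70.0,
    "faculty_student_ratio": 88.0,
    "international_mix": 95.0,
}

print(round(composite_score(university, weights), 1))  # 86.4
```

Because half the weight sits on the two surveys in this sketch, a strong reputation can offset a weak bibliometric indicator, which is one reason the same institution can move sharply between rankings that weight surveys differently.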

QS ranks five broad faculties and 25 specific subjects which can be further filtered by region and country.  Asia, Latin America, and BRICS (Brazil, Russia, India, China and South Africa) have their own rankings using different weightings of metrics.  See the Intelligence Unit section for more in-depth information about the methodology.  Unique to QS is a pilot 2014 ranking of the top 50 universities in the Arab Region.

You must register to get access to all the QS features.


THE’s first rankings without QS appeared in 2010/2011. THE ranks 400 universities, based on their willingness to participate, and assigns scores to the top 200.  See Table 2. THE Methodology for THE methodology. THE also ranks by broad subject category using different weightings.  Teaching accounts for 37.5% of the score for Arts & Humanities and 20% of the score for Engineering and Technology.

THE’s ranking of the Top 100 Asian Universities appears after the World Rankings. Forty-six additional universities were included in the 2014 ranking, expanding country coverage. THE has a special supplement for the Asian Rankings with ratings for each category as shown in Figure 1.


Figure 1. THE Asia Ratings by Category

There are also separate rankings for the Top 100 BRICS and Emerging Economies, further expanding coverage of institutions and countries beyond the initial 400.


The focus of this series of articles has been bibliometrics.  Bibliometrics are not the major component of the QS and THE rankings.  WOS papers and citations make up 36% of THE’s world rankings and vary by subject:

Arts & Humanities: 18.8%
Clinical, Preclinical, Health, Life and Physical Sciences: 39.1%
Engineering & Technology: 32%
Social Sciences: 29.9%

THE extracts data from the 12,000 academic journals indexed in the WOS database. For 2014/2015, citation counts cover papers published in indexed journals between 2008 and 2012, with citations counted between 2008 and 2013.  The scores are normalized to account for variations in subject areas.

QS uses the latest five complete years of SCOPUS data and the total citation count. For its world rankings, rather than using total citations, the number reported is citations divided by the number of academic faculty members at the university.  Larger institutions, therefore, do not have an unfair advantage.  For its regional rankings, QS uses different bibliometrics: citations per paper published in SCOPUS and papers per faculty.  Subject data use the two surveys, citations per paper and an h-index.  Weightings vary by individual subject. See the QS Supplement on World Rankings by Subject for individual scores for each subject area (registration is required to view all the features).  For example, as seen in Figure 2, Asian universities do well internationally in engineering categories, with five Asian universities in the top 12 for Civil Engineering.
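The difference between QS's world metric (citations per faculty) and its regional metric (citations per paper) can be sketched with made-up figures; all numbers below are hypothetical:

```python
# Sketch of the two citation normalizations QS describes:
# citations per faculty (world rankings) and citations per paper
# (regional rankings). All figures are hypothetical.

def citations_per_faculty(total_citations: int, faculty: int) -> float:
    # Dividing by faculty size removes the raw-size advantage
    # of large institutions.
    return total_citations / faculty

def citations_per_paper(total_citations: int, papers: int) -> float:
    # Dividing by output measures average impact per publication.
    return total_citations / papers

# A large and a small hypothetical university
big = {"citations": 500_000, "faculty": 4_000, "papers": 25_000}
small = {"citations": 150_000, "faculty": 1_000, "papers": 5_000}

print(citations_per_faculty(big["citations"], big["faculty"]))      # 125.0
print(citations_per_faculty(small["citations"], small["faculty"]))  # 150.0
print(citations_per_paper(big["citations"], big["papers"]))         # 20.0
print(citations_per_paper(small["citations"], small["papers"]))     # 30.0
```

In this sketch the small university has far fewer total citations yet scores higher on both per-capita metrics, which is exactly the size-neutralizing effect the article describes.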


Figure 2. QS Subject Civil Engineering

For articles with multiple authors, both THE and QS give full weight to each individual author and institution.


Harvard, Oxford and Cambridge are in the top five of both rankings.  California Institute of Technology (Cal Tech) and Stanford round out THE’s top five, while Massachusetts Institute of Technology (MIT), Imperial College and University College London (UCL) round out the top five for QS.  Nine of the top ten schools were the same on both lists.  For comparison, nine of the 2004 THE-QS top ten are still in the THE top ten and eight are in the QS top ten.

QS and THE both weight their Asian rankings differently from their world rankings. The order of institutions in their 2014 Asian Rankings is NOT the same as the order of the same institutions in the 2014/2015 world lists.  See Table 3. THE QS World Rankings for the THE and QS world rankings and Table 4. THE and QS Top Asian Universities 2014-2015 for the Asian Rankings.

Geographical coverage is expanding and shifting. For 2014/2015, THE’s top 200 includes universities from 28 countries, with over half of the institutions from the US.  QS has 31 countries in its top 200, with only 25% of the institutions from the US.


Most top universities remain top universities but the order may change.  How much difference does the order make?  As we can see from the various tables, order may change based on different weights and different categories within the overall rankings.

QS and THE list universities by their rank, based on the underlying scoring systems. Table 5. Score Differentials shows the importance of checking underlying scores to get a better understanding of what it means to be #1 or #100. It shows scores and the percent of separation from 1st to 100th. For example, only 3.5% separates the scores of the QS top ten, and 7.2% separates THE’s top ten. THE’s top ten had differentials of 10 or fewer points for all categories except international impact. Twenty US universities in the top 200 North American schools have a score over 70 in this category, while only two UK universities in the top 200 European list have scores below 70.  The higher concentration of international students and faculty helps European universities in the rankings.
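The "percent of separation" that Table 5 reports can be computed directly from the underlying scores; the top-ten scores below are hypothetical, not the actual table entries:

```python
# Sketch of the score-spread calculation behind Table 5: how far
# apart, as a percent of the top score, are rank 1 and rank N.
# The scores are hypothetical.

def percent_separation(scores: list) -> float:
    """Percent gap between the first and last score, relative to the first."""
    return (scores[0] - scores[-1]) / scores[0] * 100

# Hypothetical top-ten scores on a 0-100 scale
top_ten = [100.0, 99.4, 99.0, 98.7, 98.3, 97.9, 97.5, 97.1, 96.8, 96.5]

print(round(percent_separation(top_ten), 1))  # 3.5
```

A small separation means neighboring ranks are nearly interchangeable, which is why a one- or two-place move near the top rarely reflects a meaningful change in the underlying scores.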

Universities concerned about their world rankings should shop around. Look at the special regional rankings which use different metrics and compare rankings between the services.  A country such as Malaysia, which is not listed in THE’s World or Asian rankings does have a university included in THE’s BRICS and Emerging Economies.  Seven Malaysian universities are in the QS 800 list.

Universities also need to look at the results of the reputation rankings and their impact on overall rankings. Note in Figure 3 the drop of over 70% in score between #1 Harvard and #11 University of Tokyo.


Figure 3. THE World Reputation Rankings 2014

Improving research quality is not the only key to success in these two rankings.

Next month we look at two Asian-based scholarly rankings.


1.  Who is number 1 in the world?  MIT in the QS 2014/2015 World Rankings; Cal Tech in the THE 2014/2015 World Rankings; Harvard in the THE Reputation Rankings; Cambridge in the QS Reputation Rankings; Princeton in THE Physical Sciences.

Who is number 1 in Asia?  NUS in the QS 2014 Asian Rankings; Tokyo in the THE 2014 Asian Rankings; University of Hong Kong in THE Social Sciences.

2.  Ince, M.  2007.  Ideas without borders as excellence goes global.  The Times Higher Education Supplement World University Rankings 2007, p. 2.

3. Pagell, R. A. 2014.   Insights into Incites:  Journal Citation Reports and Essential Science Indicators.  Online Searcher, Vol. 38, No. 6, pp. 16-19.

Ruth’s Rankings

  1. Introduction: Unwinding the Web of International Research Rankings
  2. A Brief History of Rankings and Higher Education Policy
  3. Bibliometrics: What We Count and How We Count
  4. The Big Two: Thomson Reuters and Scopus
  5. Comparing Times Higher Education (THE) and QS Rankings
  6. Scholarly Rankings from the Asian Perspective 
  7. Asian Institutions Grow in Nature
  8. Something for Everyone
  9. Expanding the Measurement of Science: From Citations to Web Visibility to Tweets
  10. Do-It-Yourself Rankings with InCites 
  11. U.S. News & World Report Goes Global
  12. U-Multirank: Is it for “U”?
  13. A Look Back Before We Move Forward
  14. SciVal – Elsevier’s research intelligence –  Mastering your metrics
  15. Analyzing 2015-2016 Updated Rankings and Introducing New Metrics
  16. The much maligned Journal Impact Factor
  17. Wikipedia and Google Scholar as Sources for University Rankings – Influence and popularity and open bibliometrics
  18. Rankings from Down Under – Australia and New Zealand
  19. Rankings from Down Under Part 2: Drilling Down to Australian and New Zealand Subject Categories
  20. World Class Universities and the New Flagship University: Reaching for the Rankings or Remodeling for Relevance
  21. Flagship Universities in Asia: From Bibliometrics to Econometrics and Social Indicators
  22. Indian University Rankings – The Good the Bad and the Inconsistent
  23. Are Global Higher Education Rankings Flawed or Misunderstood?  A Personal Critique
  24. Malaysia Higher Education – “Soaring Upward” or Not?
  25. THE Young University Rankings 2017 – Generational rankings and tips for success
  26. March Madness – The rankings of U.S. universities and their sports
  27. Reputation, Rankings and Reality: Times Higher Education rolls out 2017 Reputation Rankings
  28. Japanese Universities:  Is the sun setting on Japanese higher education?
  29. From Bibliometrics to Geopolitics:  An Overview of Global Rankings and the Geopolitics of Higher Education edited by Ellen Hazelkorn
  30. Hong Kong and Singapore: Is Success Sustainable?
  31. Road Trip to Hong Kong and Singapore – Opening new routes for collaboration between librarians and their stakeholders
  32. The Business of Rankings – Show me the money
  33. Authors:  People and processes
  34. Authors: Part 2 – Who are you? 
  35. Come together:  May updates lead to an investigation of Collaboration
  36. Innovation, Automation, and Technology Part 1: From Scholarly Articles to Patents; Innovation, Automation, and Technology Part 2: Innovative Companies and Countries
  37. How Important are Journal Quality Metrics in the Era of Predatory Journals? Part 1: Journal Citation Metrics; Part 2: How Important are Journal Quality Metrics in the Era of Potential/Possible/Probable Predatory Publishers and Publications?
  38. Coming Attractions: The UN Sustainable Development Goals and Times Higher Education Innovation and Impact Rankings Demystified
  39. Business School Rankings: Monkey Business for an Asia/Pac audience
  40. Deconstructing QS Subjects and Surveys
  41. THE’s University Impact Rankings and Sustainable Development Goals: Are these the most impactful universities in the world?
  42. ASEAN – a special analysis of ASEAN nations
  43. Predatory practices revisited – misunderstandings and positive actions

*Ruth A. Pagell is currently teaching in the Library and Information Science Program at the University of Hawaii.   Before joining UH, she was the founding librarian of the Li Ka Shing Library at Singapore Management University.  She has written and spoken extensively on various aspects of librarianship, including contributing articles to ACCESS.