Ruth’s Rankings 7: Asian Institutions Grow in Nature

By Ruth A. Pagell*

Twelve Asian institutions are in the top 50 worldwide, and three Asian countries are in the top 10, for publishing articles in the “best” journals.

Ruth’s Rankings 7 and 8 follow up on previous rankings by looking at other methodologies that drill further down into scholarly output and provide different options for analysis. This month, we focus on the Nature Publishing Group, which provides two different indexes. I was especially interested in the Nature rankings, since ARWU, which we examined in Ruth’s Rankings 6, uses publications in Nature as 20% of its weighting criteria.

Since 2008, Nature has been calculating an Asia-Pacific Index (NPI) based on Nature Publishing Group’s own 18 publications, with a 19th added in February 2015. In November 2014, Nature released its first annual Nature Index (NI), using data from calendar year 2013. This global index first appeared as a supplement to Nature and is available for free download (Campbell, N. & Grayson, M. (12 November 2014) Index 2014 Global, Nature 515, S49 – S108). There is a monthly update that covers a rolling 12-month period; as of today, 2 February 2015, the date range is 1 December 2013 to 30 November 2014. The Global Index is also available in print, and the print version includes additional analysis for each of the regions covered.


The Nature Index is “a global indicator of high quality research”. It is a database of author affiliations and institutional relationships that are used to track contributions to articles published in 68 highly selective journals.  An independent group of active scientists selected the journals, based on where authors would want to see their best works published.

Methodology for Nature Index (See Table 7.1.)

NI uses three measures, all of which are based on number of articles published:

Article count (AC): Each article is counted once for each author’s institution or country; an article with 100 authors would be counted 100 times.

Fractional count (FC): All authors are assumed to have contributed equally, so each author receives 1 divided by the number of authors.

Weighted fractional count (WFC):  A weighting is applied to the FC in order to adjust for the over-representation of papers from astronomy and astrophysics.

Only four fields are included:  Chemistry, Earth and Environmental sciences, Life Sciences, and Physical Sciences. Institutions that are strong in social sciences are under-represented.
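As a rough illustration, the three measures can be sketched in Python. This is a hedged sketch, not Nature’s implementation: the 0.2 weight applied to astronomy and astrophysics papers is an assumption for illustration only (Nature defines the actual adjustment), and the institution names are hypothetical.

```python
# Sketch of the three Nature Index measures: AC, FC and WFC.

def article_count(author_institutions):
    """AC: each institution on the paper is counted once, regardless of
    how many of the paper's authors it employs."""
    return {inst: 1 for inst in dict.fromkeys(author_institutions)}

def fractional_count(author_institutions):
    """FC: the paper's single point is split equally among its authors,
    and each author's share is credited to his or her institution."""
    share = 1 / len(author_institutions)
    fc = {}
    for inst in author_institutions:
        fc[inst] = fc.get(inst, 0) + share
    return fc

def weighted_fractional_count(author_institutions, is_astronomy=False):
    """WFC: FC scaled down for astronomy/astrophysics papers
    (the 0.2 weight is an assumed value, for illustration only)."""
    weight = 0.2 if is_astronomy else 1.0
    return {inst: s * weight
            for inst, s in fractional_count(author_institutions).items()}

# A hypothetical paper with four authors, two sharing an institution:
paper = ["Univ A", "Univ A", "Univ B", "Univ C"]
ac = article_count(paper)     # each institution counted once
fc = fractional_count(paper)  # Univ A: 0.5, Univ B and Univ C: 0.25 each
```

The same paper thus contributes a full point to each institution under AC but only one point in total under FC, which is why the two measures can rank institutions differently.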

The database provides rankings for institutions and countries. There is no single weighted score, as in the rankings covered in Ruth’s Rankings 5 and 6, although NI uses WFC as its key score. There is no fixed number of institutions or countries, and institutions include any organization with at least one author affiliation.

Today (2 February 2015) there are two universities from our core Asian countries in the top 10, four in the top 20 and 12 in the top 50 of NI. This compares favorably with Europe, which has 11 in the top 50.

The interface is interactive. In Institution Outputs, select a region or country, a specific scientific field and a metric of your choice. For example, you can generate a list of the 24 institutions in Thailand that published in Life Sciences in the selected package of 68 journals from 1 December 2013 to 30 November 2014. If you select Life Sciences, Mahidol University is number one in Thailand; change the field to Physical Sciences, and Chulalongkorn University is number one.

In Country Outputs, you get an overall score for the country and a breakdown by field. You can also see the authors’ top collaborating countries. For example, Thai authors collaborate most with the US and UK, while Singaporean authors collaborate most with the US and China. The printed index further analyzes the collaboration data, showing that only Thailand collaborates internationally above the global average (Figure 7.1).

I recently received an email from a librarian at a funding body asking advice on how to allocate money based on the metrics. My role is only to highlight different ways of measuring output and impact. The print Nature Index 2014 Global has a chart calculating the efficiency of institutions using WFC per 1,000 researchers (Figure 7.2). In the print analysis, Nature uses publicly available sources for funding levels, number of researchers, size of population, etc., and is also incorporating altmetrics (S94).


NPI Asia-Pacific bases its weekly rankings on research articles, letters, brief communications and reviews in Nature and the monthly Nature research journals. In our discussion of metrics in the Ruth’s Rankings series, we talked about how to handle multiple authors with multiple affiliations. Nature is very clear: NPI uses just two metrics, number of articles (A) and Corrected Count (CC).

Each paper gets a total score of 1, which is distributed on a per-author and per-institution basis. Assume an article with four authors from four different institutions: each author and his or her institution would get 0.25 points. If one author listed two affiliations, each of those institutions would receive 0.125 points. The article’s one point is also distributed for the country rankings. (See Table 7.2, Methodology for NPI.)
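The worked example above can be sketched in a few lines of Python. This is an illustrative implementation of the split described in the text, with hypothetical institution names, not Nature’s own code.

```python
# Sketch of NPI's Corrected Count (CC): each paper's single point is
# split equally among its authors, and each author's share is split
# equally among that author's listed affiliations.

def corrected_count(affiliations_per_author):
    """affiliations_per_author: one list of affiliations per author."""
    scores = {}
    author_share = 1 / len(affiliations_per_author)
    for affiliations in affiliations_per_author:
        inst_share = author_share / len(affiliations)
        for inst in affiliations:
            scores[inst] = scores.get(inst, 0) + inst_share
    return scores

# Four authors from four institutions, the last listing two affiliations:
paper = [["Inst 1"], ["Inst 2"], ["Inst 3"], ["Inst 4a", "Inst 4b"]]
cc = corrected_count(paper)
# Inst 1-3 each receive 0.25; Inst 4a and 4b each receive 0.125,
# so the scores still sum to the paper's single point.
```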

Using the CC method in NPI, Suranaree University of Technology is number one in Physical Sciences in Thailand. Two weeks ago, Chulalongkorn ranked highest on corrected count in Life Sciences, even though Mahidol had more articles; when new rankings were distributed two weeks later, Mahidol was number one in Thailand again. Using the CC method overall (which Nature prefers over number of articles), Kasetsart University is ranked number one in Thailand.

NPI deals with a much smaller universe of authors than other rankers or data providers such as Thomson Reuters or Scopus, which cover millions of authors and thousands of publications. Nature also does not need to filter for high-quality journals: 10 Nature publications are in the top 20 by impact in Journal Citation Reports, and 10 are in the top 30 in SJR (which we will examine as part of SCImago next month).

There is a print supplement for the 2013 Asia-Pacific Publishing Index. One chapter of special interest (p. 30) includes a regional roundup focusing on countries outside the top five in the Asia-Pacific region (the top five being China, Japan, Singapore, South Korea and Australia).


In comparing the two Nature rankings, it is important to check the date range. The article base of the Global 2014 index covers 2013. The online Institutional Rankings cover 1 December 2013 to 30 November 2014, and the latest Asian weekly rankings (as of 2 February 2015) cover 3 February 2014 to 2 February 2015. The dates, and therefore the rankings, changed slightly as I was writing this article.

I had expected our core Asian countries to fare better in the Nature rankings, despite the fact that the rankings include only a small elite group of publications. The number and percentage of publications from leading Asian research countries has risen over 100% in 10 years, except for Japan, which has seen a slight drop. To further verify this documented trend, I ran a general search in Web of Science on these countries: China (including Hong Kong), South Korea, Singapore, Taiwan, India, Malaysia and Thailand. The percentage of articles from these countries is rising in Web of Science and is now over 26% of all publications in WOS for 2014, an increase of over 50% in the past 10 years. Asian countries should therefore do better in all rankings going forward as they become a larger percentage of the overall article dataset. Figure 7.3 illustrates the change in the number and percentage of articles from Asia in NPI.

There are four tables of rankings:

Table 7.3: NI top 20 institutions in the world, from the 2014 print rankings (using 2013 data) and the most current online rankings.

Table 7.4: Comparison of NI universities with the top ten rankings from ARWU and NTU-Taiwan.

Table 7.5: Top 20 institution rankings from NPI, with their Asian and world rankings.

Table 7.6: Compares the Asian university rankings from the three sources in 7.4, using both WFC and CC for Nature.


Many of you are not affiliated with institutions that publish even infrequently in these journals. On the other hand, some of you are affiliated with institutions that may appear in these rankings and in no other scholarly rankings. The Nature rankings are worth exploring by everyone. The limited number of journals, metrics and years makes this a microcosm of the rankings we have visited before and of those that will follow. There is only one annual global ranking, using three metrics, all based on article count, and an Asia-Pacific ranking with only two metrics. The number of journals is limited and fixed, while the number of institutions and countries, and even the ranks themselves, can change on a weekly basis. This clearly illustrates the importance of analyzing the data on a macro level rather than reacting to every change of one or two points in a ranking. The comparative NPI Table 7.5 illustrates the differences that occur in rankings over a short period of time. This supports our message: make sure you understand the underlying data, the impact of the publications package, the way of counting and the date range.

1 Only East and Southeast Asia are included. Research was conducted from 20 January to 2 February 2015, and rankings changed over that period.

Ruth’s Rankings

  1. Introduction: Unwinding the Web of International Research Rankings
  2. A Brief History of Rankings and Higher Education Policy
  3. Bibliometrics: What We Count and How We Count
  4. The Big Two: Thomson Reuters and Scopus
  5. Comparing Times Higher Education (THE) and QS Rankings
  6. Scholarly Rankings from the Asian Perspective 
  7. Asian Institutions Grow in Nature
  8. Something for Everyone
  9. Expanding the Measurement of Science: From Citations to Web Visibility to Tweets
  10. Do-It-Yourself Rankings with InCites 
  11. U S News & World Report Goes Global
  12. U-Multirank: Is it for “U”?
  13. A Look Back Before We Move Forward
  14. SciVal – Elsevier’s research intelligence –  Mastering your metrics
  15. Analyzing 2015-2016 Updated Rankings and Introducing New Metrics
  16. The much maligned Journal Impact Factor
  17. Wikipedia and Google Scholar as Sources for University Rankings – Influence and popularity and open bibliometrics
  18. Rankings from Down Under – Australia and New Zealand
  19. Rankings from Down Under Part 2: Drilling Down to Australian and New Zealand Subject Categories
  20. World Class Universities and the New Flagship University: Reaching for the Rankings or Remodeling for Relevance
  21. Flagship Universities in Asia: From Bibliometrics to Econometrics and Social Indicators
  22. Indian University Rankings – The Good the Bad and the Inconsistent
  23. Are Global Higher Education Rankings Flawed or Misunderstood?  A Personal Critique
  24. Malaysia Higher Education – “Soaring Upward” or Not?
  25. THE Young University Rankings 2017 – Generational rankings and tips for success
  26. March Madness –The rankings of U.S universities and their sports
  27. Reputation, Rankings and Reality: Times Higher Education rolls out 2017 Reputation Rankings
  28. Japanese Universities:  Is the sun setting on Japanese higher education?
  29. From Bibliometrics to Geopolitics:  An Overview of Global Rankings and the Geopolitics of Higher Education edited by Ellen Hazelkorn
  30. Hong Kong and Singapore: Is Success Sustainable?
  31. Road Trip to Hong Kong and Singapore – Opening new routes for collaboration between librarians and their stakeholders
  32. The Business of Rankings – Show me the money
  33. Authors:  People and processes
  34. Authors: Part 2 – Who are you? 
  35. Come together:  May updates lead to an investigation of Collaboration
  36. Innovation, Automation, and Technology, Part 1: From Scholarly Articles to Patents; Part 2: Innovative Companies and Countries
  37. How Important are Journal Quality Metrics in the Era of Predatory Journals? Part 1: Journal Citation Metrics; Part 2: How Important are Journal Quality Metrics in the Era of Potential/Possible/Probable Predatory Publishers and Publications?
  38. Coming Attractions: The UN Sustainable Development Goals and Times Higher Education Innovation and Impact Rankings Demystified
  39. Business School Rankings: Monkey Business for an Asia/Pac audience
  40. Deconstructing QS Subjects and Surveys
  41. THE’s University Impact Rankings and Sustainable Development Goals: Are these the most impactful universities in the world?
  42. ASEAN – a special analysis of ASEAN nations
  43. Predatory practices revisited – misunderstandings and positive actions

*Ruth A. Pagell is currently adjunct faculty in the Library and Information Science Program at the University of Hawaii. Before joining UH, she was the founding librarian of the Li Ka Shing Library at Singapore Management University. She has written and spoken extensively on various aspects of librarianship, including contributing articles to ACCESS.