By Ruth A. Pagell*
(11 June 2018) The following three international ranking updates were released in May.
- 2018 CWTS Leiden Rankings, where I will focus on the Collaborations module
- 2018 Times Higher Education, with its ever growing list of Emerging Economies
- 2018 U21 National Higher Education system rankings, drilling down in “Connectivity”
In addition, Malaysia updated and revised its internal university rankings, SETARA, an update to Ruth’s Rankings 24.
In preparation for the May release of the 2018 Leiden rankings, Waltman and van Eck's CWTS blog post (30 April 2018) reported on how users access the rankings; a more complete article is available on arXiv. The CWTS rankings have two modules, Impact and Collaboration, and the authors report that only 7.2% of website visitors use the Collaboration module. Following up on Ruth's Rankings 33 and 34 on authors, where we noted the increase in collaboration and its growing importance in rankings, I am featuring Leiden's Collaboration module and other rankers' handling of collaboration.
According to CWTS, "there is no best indicator for ranking universities" (Waltman & van Eck, 11 April 2018). Since most readers want traditional rankings, see Table 35.1 for the top universities for Impact in 2018.
Collaboration has three indicators: total number of papers, number of collaborative papers, and the proportion of papers that are collaborative. The five ways to describe the collaborations are listed below with the top university in “proportion of papers” for each category.
- Overall collaboration (all inter-institutional collaborations) – Université Paris IV Paris-Sorbonne. France dominates this list with 17 universities in the top 25!
- International collaboration – King Abdulaziz University; Saudi Arabia has three of the top five.
- Collaboration within 100 km (62 miles) – Peking Union Medical College. Three Taiwanese universities are in the top eight.
- Collaboration greater than 5,000 km (3,107 miles) – University of Hawaii Manoa.
- Collaboration with industry – China University of Petroleum, Beijing. Three Chinese petroleum universities are in the top five.
Table 35.2 compares impact and collaboration. In the Impact module the default is fractional counting: if a paper has three authors from three institutions, each institution receives a 1/3 count. In the Collaboration module, every author's affiliation receives a full count. Two universities with the highest proportion of top 10% publications are also in the top ten in proportion of collaborative publications. Tsinghua University, China's top-ranked university for proportion in the top ten percent, is ranked 746 for proportion of collaborative papers.
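The difference between the two counting schemes can be sketched in a few lines of code. This is a minimal illustration, not CWTS's actual implementation, and the institution names are hypothetical:

```python
from collections import defaultdict

def count_papers(papers, fractional=True):
    """Credit institutions for a list of papers.

    papers: a list of papers, each given as a list of the authors'
    institution names. With fractional counting, a paper with n distinct
    institutions gives each 1/n credit; with full counting, each
    institution receives a full count.
    """
    credit = defaultdict(float)
    for institutions in papers:
        insts = set(institutions)  # count each institution once per paper
        share = 1 / len(insts) if fractional else 1.0
        for inst in insts:
            credit[inst] += share
    return dict(credit)

# One paper with authors from three institutions (hypothetical names):
papers = [["Univ A", "Univ B", "Univ C"]]
print(count_papers(papers, fractional=True))   # each institution credited 1/3
print(count_papers(papers, fractional=False))  # each institution credited 1.0
```

Fractional counting keeps the total credit per paper at 1, so heavily collaborative institutions are not over-counted in the Impact module; full counting, used in the Collaboration module, credits every participating institution equally.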
Table 35.3 lists the top ten in number and proportion of international papers. It also lists the proportion of collaborations over 5,000 km (3,107 miles). It is no surprise that most are universities far from scholarly hubs, such as my local University of Hawaii. In addition to UH, three are from Chile, three from South Africa, and one each from New Zealand and Australia.
The indicators that are used are those agreed upon between CWTS and data supplier Clarivate Analytics (Waltman 2018 email).
OTHER SOURCES OF COLLABORATION – Underlying Data Sources
Clarivate Analytics (CA) – InCites
Articles on individual fields of study in individual countries concluded that collaboration resulted in more highly cited papers. Based on the data in Table 35.2, that does not seem to be the trend for overall collaborations. Persson (2010) concluded that "international papers are not well represented among high impact papers."
In its Organizations, Regions and Research Areas modules, InCites has metrics for the number and percent of international collaborations and the percent of highly cited papers. For authors, it has the two collaboration metrics but only the number of highly cited papers. I drilled down in the Organizations module, using the time range 2015–present with a minimum of 5,000 papers. Forty-eight different institutions comprise the list of the top 25 in percent of either highly cited papers or percent of international collaborations; see Table 35.4 for the top ten. Only one institution was on both lists. Of the dataset of 640 institutions, 78% are universities. Universities comprise 70% of the top collaborative institutions and 58% of the highly cited. Ten institutions involved in medical research are in the top 50 for highly cited papers; only one is on the collaborative list.
SciVal's Overview module has limited collaboration data. Select an institution and drill down to find the percent that is highly cited and the percent co-authored with institutions in other countries. SciVal also has a stand-alone Collaboration module with detailed data for institutions or countries. Select one institution and generate a list of collaborating sites ranked by number of publications. Preselected data include co-authored publications, co-authors at the institution, and co-authors at the other institution. See Example 35.1 for Peking University's profile. The Collaboration module suggests potential collaborating universities; there are no comparisons across institutions or countries.
OTHER SOURCES OF COLLABORATION – Ranking Organizations.
Times Higher Education – Emerging Economies Update with Collaborations
In May, THE issued its fifth Emerging Economies rankings, covering 42 countries and 378 universities. Renamed from BRICS, the ranking uses the FTSE classification, based on stock exchanges, to classify countries. This differs from the World Bank income classification, in which 14 of these economies are considered high income and another 15 upper middle income. [NOTE: The World Bank Excel file is not compatible with Office 365.]
Like the other THE spinoffs, Emerging Economies uses the same universities and the same data elements as the world rankings, recalibrated. In the world rankings, international collaboration is worth 2.5% under the International Outlook indicator, which is worth 7.5% overall. In Emerging Economies, International Outlook is worth 10% and international collaboration is raised to 3.4%. Seven of the top 10 overall are from China, and the top five in International Outlook are from the Middle East.
U-Multirank 2017 Readymade Research and Research Linkages
U-Multirank includes international publications as part of International Outlook, using Web of Science (WOS) data, as explained in its methodology. Institutions receive grades from A to E. Customize a university set and re-rank by metric; 27 institutions received an A in international publications. Despite my usual frustration with U-Multirank, it is a site where both international collaborations and top-cited papers are displayed in one table.
In the U.S. News Best Global Universities rankings, international collaboration comprises 10% of the total. Five percent is the proportion of a university's publications with international co-authors, based on WOS data. According to the methodology, the other five percent is that proportion divided by the proportion for the institution's country. U.S. News Global only displays the rank and the total score; the only way to find a university's collaborative rankings is to click on each university.
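The two U.S. News collaboration indicators can be illustrated with a short calculation. This is a hypothetical sketch of the arithmetic as described in the methodology, with made-up numbers:

```python
def collaboration_indicators(intl_papers, total_papers, country_proportion):
    """Compute the two collaboration indicators described above.

    proportion: share of the institution's papers with international
    co-authors (the first 5% component).
    relative: that proportion divided by the country-wide proportion
    (the other 5% component), normalizing for national context.
    """
    proportion = intl_papers / total_papers
    relative = proportion / country_proportion
    return proportion, relative

# Hypothetical institution: 400 of 1,000 papers internationally co-authored,
# in a country where 25% of all papers are internationally co-authored.
p, r = collaboration_indicators(intl_papers=400, total_papers=1000,
                                country_proportion=0.25)
print(p, r)  # 0.4 and 1.6
```

The second indicator rewards institutions that collaborate internationally more than their national average: here the value 1.6 means the institution's international share is 1.6 times its country's.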
SCImago Institutions Rankings (SIR) includes international collaboration in its research factor with a weight of 2%, but no underlying data are available.
QS and Webometrics do not include collaboration in their metrics.
U21 released its seventh ranking of the top 50 national higher education systems on 11 May 2018. One of its metrics is "Connectivity": four percent covers articles co-authored with international collaborators and another four percent articles co-authored with industry. Table 35.5 shows the top 10 countries overall for collaboration and all the Asian countries in the ranking, along with the rank for international collaborations. There has been little change since we last looked at U21 in the appendix of Ruth's Rankings 20. Saudi Arabia made the biggest gain, going up five places, and was number one in international collaborations. China and the United States have the biggest negative gap between their overall rank and their connectivity rank; U21 notes that larger countries often have lower connectivity scores. The full report is the source for the connectivity data.
The proportion of collaborative papers keeps growing along with the overall number of publications. Much more research is needed for a definitive answer to whether more collaboration leads to more highly cited papers.
Malaysian University Rankings – SETARA
When I was writing Ruth's Rankings 24 last year, I was frustrated by the absence of an updated internal SETARA ranking. The Ministry of Education skipped 2015 and recently released SETARA 2017, which focuses on teaching, research, and service. Universities are divided into three categories, evaluated on the same three areas of focus but with different requirements and weightings: mature universities (15 or more years old), emerging universities (under 15 years old), and university colleges. No university received a six-star rating in SETARA 2013; six mature universities received that rating this year, including one off-shore branch. Twenty-one universities, from all three categories, received five stars.
Clarivate Analytics InCites
CA reorganized the way it presents its data in InCites, dividing its bibliometrics into five categories, which makes it easier to find relevant metrics.
- Productivity – percent of documents in Q1 to Q4 JIF journals
- Impact – citation metrics
- Collaboration – international and with industry
- Reputation – surveys and organization-supplied data
- Other – filters
The next article will be on ASEAN universities with emphasis on Thailand, Indonesia
and the Philippines. If any of you have information about the organization of your country’s higher education system, other than Singapore and Malaysia which were covered in earlier articles, please send me an email at email@example.com.
Bothwell, E. (9 May 2018). THE Emerging economies university rankings 2018: Results announced. https://www.timeshighereducation.com/world-university-rankings/emerging-economies-university-rankings-2018-results-announced
Persson, O. (2010). Are highly cited papers more international? Scientometrics, 82(2). Abstract at https://link.springer.com/article/10.1007/s11192-009-0007-0
Waltman, L. & van Eck, N. J. (11 April 2018). Analyzing the activity of visitors of the Leiden Ranking website. arXiv.org. Accessed 6 May 2018 at https://arxiv.org/ftp/arxiv/papers/1804/1804.03869.pdf
- Introduction: Unwinding the Web of International Research Rankings
- A Brief History of Rankings and Higher Education Policy
- Bibliometrics: What We Count and How We Count
- The Big Two: Thomson Reuters and Scopus
- Comparing Times Higher Education (THE) and QS Rankings
- Scholarly Rankings from the Asian Perspective
- Asian Institutions Grow in Nature
- Something for Everyone
- Expanding the Measurement of Science: From Citations to Web Visibility to Tweets
- Do-It-Yourself Rankings with InCites
- U S News & World Report Goes Global
- U-Multirank: Is it for “U”?
- A Look Back Before We Move Forward
- SciVal – Elsevier’s research intelligence – Mastering your metrics
- Analyzing 2015-2016 Updated Rankings and Introducing New Metrics
- The much maligned Journal Impact Factor
- Wikipedia and Google Scholar as Sources for University Rankings – Influence and popularity and open bibliometrics
- Rankings from Down Under – Australia and New Zealand
- Rankings from Down Under Part 2: Drilling Down to Australian and New Zealand Subject Categories
- World Class Universities and the New Flagship University: Reaching for the Rankings or Remodeling for Relevance
- Flagship Universities in Asia: From Bibliometrics to Econometrics and Social Indicators
- Indian University Rankings – The Good the Bad and the Inconsistent
- Are Global Higher Education Rankings Flawed or Misunderstood? A Personal Critique
- Malaysia Higher Education – “Soaring Upward” or Not?
- THE Young University Rankings 2017 – Generational rankings and tips for success
- March Madness –The rankings of U.S universities and their sports
- Reputation, Rankings and Reality: Times Higher Education rolls out 2017 Reputation Rankings
- Japanese Universities: Is the sun setting on Japanese higher education?
- From Bibliometrics to Geopolitics: An Overview of Global Rankings and the Geopolitics of Higher Education edited by Ellen Hazelkorn
- Hong Kong and Singapore: Is Success Sustainable?
- Road Trip to Hong Kong and Singapore – Opening new routes for collaboration between librarians and their stakeholders
- The Business of Rankings – Show me the money
- Authors: People and processes
- Authors: Part 2 – Who are you?
- Come together: May updates lead to an investigation of Collaboration
- Innovation, Automation, and Technology Part 1: From Scholarly Articles to Patents; Innovation, Automation, and Technology Part 2: Innovative Companies and Countries
- How Important are Journal Quality Metrics in the Era of Predatory Journals? Part 1: Journal Citation Metrics; Part 2: How Important are Journal Quality Metrics in the Era of Potential/ possible/ probable predatory publishers and publications?
- Coming Attractions: The UN Sustainable Development Goals and Times Higher Education Innovation and Impact Rankings Demystified
- Business School Rankings: Monkey Business for an Asia/Pac audience
- Deconstructing QS Subjects and Surveys
- THE’s University Impact Rankings and Sustainable Development Goals: Are these the most impactful universities in the world?
- ASEAN – a special analysis of ASEAN nations
- Predatory practices revisited – misunderstandings and positive actions
*Ruth A. Pagell is emeritus faculty librarian at Emory University. After working at Emory, she was the founding librarian of the Li Ka Shing Library at Singapore Management University and then adjunct faculty teaching in the Library and Information Science Program at the University of Hawaii. She has written and spoken extensively on various aspects of librarianship, including contributing articles to ACCESS – https://orcid.org/0000-0003-3238-9674