Ruth’s Rankings 18: Rankings from Down Under – Australia and New Zealand

By Ruth A. Pagell*

(6 April 2016) This article focuses on university education in Australia and New Zealand.  We look at the roles of governments in evaluations, national ratings and rankings, and the international rankings.  Australia and New Zealand have layers of government organizations evaluating quality through metrics and peer review. The focus of both countries’ internal evaluations is subjects, which will be covered in depth in the following article.

AUSTRALIA

Australian university performance is well documented both from government initiatives and commercial sources. 

Overview of Australian Universities

Australia has 41 universities recognized by the government’s Australian Research Council (ARC). These include 12 institutions accredited during the past 25 years, a mix of new universities and conversions of “colleges” to university status. The term “college” generally refers to non-degree institutions called TAFEs (Technical and Further Education), which are administered at the state level. See Table 18.1, Australian Universities, for founding and conversion dates. Some of the universities have formed subgroups; the best known is the Group of Eight.

Australian Government Oversight

The Australian Research Council provides research funding and evaluates performance through Excellence in Research for Australia (ERA). There have been three rounds of ERA: the National Reports of 2010 and 2012 and the State of Australian University Research, Volume 1: Excellence in Research in Australia (2015). Evaluations emphasize output and income at the level of individual fields of study. ERA does not rank universities: “ERA evaluates the research undertaken in institutions by discipline and provides ratings for fields of research at each institution.” The full report runs 440 pages (http://www.arc.gov.au/era-reports) and has an accompanying Evaluation Handbook and Submission Guidelines.

Among the objectives of ERA are to:

  • establish a framework that gives government, industry, business and the wider community assurance of the excellence of research conducted in Australian higher education institutions;
  • identify excellence across the full spectrum of research performance;
  • allow for comparisons of research in Australia, nationally and internationally, for all discipline areas. (ERA Submission Guidelines, 2015; p. 9)

Methodology:

ERA 2015 bases its evaluations on outputs from 2010-2013. The types of outputs have been expanded to include “non-traditional” eligible research types such as “creative works” (Submission Guidelines, section 5.4.9). Indicators fall into the following categories:

1. Research Quality – publishing profile, citation analysis, ERA peer review, and peer-reviewed Australian and international research income.

2. Research Activity – research outputs, income and other items within the context of the profile of eligible researchers.

3. Research Application – research commercialisation income such as patents, Plant Breeder’s Rights, registered designs, and National Health and Medical Research Council (NHMRC) Endorsed Guidelines.

4. Recognition – esteem measures. (ERA Submission Guidelines, p. 11)

The ERA 2015 Evaluation Handbook provides a detailed explanation of the indicators and rating systems. Individual institutions provide the data. In his post questioning the reliability of ERA, Henman (2015) suggests that ARC use publicly available data.

NEW ZEALAND

Overview of New Zealand Universities

New Zealand has eight universities. Four were founded as universities in the 1800s. Only two are less than 50 years old, and one is on the 2016 QS top 100 under 50 list. Most tertiary institutions are classified not as universities but as institutes of technology and polytechnics.

New Zealand Government Oversight

New Zealand has several bodies involved with evaluating and funding tertiary education. 

The Tertiary Education Commission (http://www.tec.govt.nz/) oversees all aspects of tertiary education, including the eight New Zealand universities. TEC issues an annual report that measures education performance based on student participation, course completion and qualification completion. Its research funding and evaluation mechanism is the Performance-Based Research Fund (PBRF), one of whose objectives is that New Zealand remain competitive in international rankings. Each university submits a list of academics and their research records. The academics are then graded and scored, and the scores are aggregated to give a measure of average “quality,” which is reported by subject groups within each university and for each university as a whole (http://www.tec.govt.nz/Funding/Fund-finder/Performance-Based-Research-Fund-PBRF-/).

Objectives of PBRF are to:

  • increase quality of basic and applied research at New Zealand’s degree-granting tertiary education organisations (TEOs)
  • support world-leading teaching and learning at degree and postgraduate levels
  • assist New Zealand’s TEOs to maintain and lift their competitive rankings relative to their international peers.

(http://www.tec.govt.nz/Documents/Reports%20and%20other%20documents/PBRF%20QE%202012%20Final%20Report.pdf). See Table 18.2, New Zealand Universities Founding Date and Quality Rank.
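The aggregation PBRF describes above, in which individual academics are graded and the grades are averaged into a university-level quality score, can be sketched as follows. The grade labels and numeric weights here are illustrative assumptions, not PBRF’s actual weighting scheme:

```python
# Hedged sketch of a PBRF-style aggregation: each academic receives a
# quality grade, grades map to numeric weights, and the weights are
# averaged per university. Weights below are invented for illustration.
from collections import defaultdict

GRADE_WEIGHTS = {"A": 5, "B": 3, "C": 1, "R": 0}  # hypothetical weights

def average_quality(submissions):
    """submissions: list of (university, grade) tuples.
    Returns a dict of university -> average quality score."""
    totals = defaultdict(lambda: [0.0, 0])  # university -> [sum, count]
    for university, grade in submissions:
        totals[university][0] += GRADE_WEIGHTS[grade]
        totals[university][1] += 1
    return {u: s / n for u, (s, n) in totals.items()}

scores = average_quality([
    ("Univ X", "A"), ("Univ X", "B"), ("Univ X", "C"),
    ("Univ Y", "B"), ("Univ Y", "R"),
])
# Univ X: (5+3+1)/3 = 3.0; Univ Y: (3+0)/2 = 1.5
```

Because the output is an average rather than a total, a small department with a few highly graded researchers can outscore a much larger one, which is one reason such averages are debated.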

The representative body for New Zealand’s eight universities is Universities New Zealand – Te Pōkai Tara, under the auspices of the New Zealand Vice-Chancellors’ Committee. It is responsible for the approval and quality assurance of university qualifications.

In early 2015, the New Zealand Parliament’s Education and Science Committee’s annual review of TEC expressed “concern that New Zealand’s universities are reportedly losing ground in international rankings…TEC proposes linking Crown research institutes—which will retain independent governance—with universities to improve this situation.” (http://www.parliament.nz/resource/en-nz/51DBSCH_SCR62715_1/06308fc2e75acb901c3d4b0e0af72a18529e8f5b)

Radio New Zealand’s education correspondent John Gerritsen followed up on the review with the report “Universities plan to boost world rankings” (8 May 2015). In September, soon after the release of the new QS rankings, Gerritsen reported that the universities’ rankings had improved in the QS 2015-2016 rankings, not because of anything the government or universities had done but because of changes in QS’s methodology, which now puts less emphasis on citations. (Gerritsen, Sept 2015; Simon, 2014)

AUTHOR’S NOTE: Using annual data from Web of Science’s InCites, New Zealand did drop from 36 to 39 in total output and from 32 to 37 in total citations between 2010 and 2014-2015. For comparison, Australia remained at either nine or ten on both metrics over the same period. This does not mean that New Zealand’s output growth has been negative or that its share of the world’s output has dropped, as shown in Figure 18.1.

RANKINGS

Both countries promote the QS rankings, which place less emphasis on scholarly indicators than the other rankings and more on international students and faculty and on reputation, or peer review.

Summary of the number of universities in top 100 ranks, QS 2015-2016:

                              AUSTRALIA    NEW ZEALAND

International faculty              17            6

International students             11            1

Top 100 worldwide                   7            1

Academic reputation                 6            1

Citations per faculty               3            0

Both countries’ internal metrics highlight performance at the subject level which will be covered in the next article.

The Australian government, as noted above, only provides ratings. Private organizations provide national rankings, using existing data sources.

The Australian Education Network turns the ERA ratings into rankings (http://www.australianuniversities.com.au/). Its Main Australian Rankings Table provides the latest rankings from the major international sources. Click on any university name to see its world rankings and its local rankings for ERA, student satisfaction and professional schools.
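Turning ratings into rankings, as the Australian Education Network does with ERA data, is mechanically simple but has one wrinkle: universities with identical ratings should share a rank. A minimal sketch with invented data, using standard “competition” ranking:

```python
# Minimal sketch of converting ratings into rankings; universities
# sharing a rating share a rank (standard competition ranking).
# All names and ratings below are invented for illustration.
def ratings_to_ranks(ratings):
    """ratings: dict of university -> numeric rating (higher is better).
    Returns dict of university -> rank, with ties sharing a rank."""
    ordered = sorted(ratings.items(), key=lambda kv: -kv[1])
    ranks, prev_rating, prev_rank = {}, None, 0
    for position, (univ, rating) in enumerate(ordered, start=1):
        if rating != prev_rating:
            prev_rank, prev_rating = position, rating
        ranks[univ] = prev_rank
    return ranks

ranks = ratings_to_ranks({"Univ A": 4.2, "Univ B": 4.2, "Univ C": 3.9})
# {"Univ A": 1, "Univ B": 1, "Univ C": 3}
```

Note that after a tie the next rank is skipped (1, 1, 3), which is why converting fine-grained ratings into ordinal ranks can exaggerate small differences between institutions.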

Australian University Reviews (AUR) calculates its own score using rankings from QS, ARWU and Webometrics, graduate satisfaction ratings and adjusted graduate salary results (https://universityreviews.com.au/australian-rankings/).

The Good Universities Guide extracts data from the Australian Graduate Survey and from the Department of Education and Training. The website does not provide a list. See Barkhausen (2015) for a summary of the latest guide’s ratings: http://www.universityworldnews.com/article.php?story=2015082805450845.

International Rankings

The youth of Australian universities and their proportion of the top 100 under 50 are notable. There are 16 Australian universities on the 2015 QS and THE lists of 100 under 50, with a total of 19 different universities between the two lists. Only one New Zealand university is on the QS list.

See Table 18.3 for international and local Australian rankings.

See Table 18.4 for New Zealand rankings.

The number of Australian and New Zealand universities ranked by QS or THE has more than doubled since the initial combined QS/THE ranking in 2004.  At the same time, the number in the top 200 has dropped and only one university had a slightly better score than in 2004. Results for ARWU are more promising with more universities joining the list and existing universities moving up (Table 18.5).  Without looking at the ranker’s methodology and the underlying data we cannot conclude that these universities are performing better or worse.

CONCLUSION

Not every university can or should be a “world class” university, but most institutions want to do the best they can within the boundaries set by their resources. The rankings as presented by the best-known rankings organizations such as QS, Times Higher Education and ARWU do not tell the whole story for universities at the regional or local level. Usually the only data available are the faculty and student numbers submitted by the universities, not the proprietary underlying data from Web of Science and SciVal. The savvy policy maker or administrator looks beyond the rankings and scores to the underlying data and methodology, from either Elsevier’s SciVal or Web of Science InCites, and beyond composite scores to subject or indicator specialties.

Part 2 of our exploration of the performance of Australian and New Zealand universities focuses on the value and vagaries of subject rankings.

Thanks to John Lamp, emeritus professor at Deakin University, for clarifying the Australian landscape (http://lamp.infosys.deakin.edu.au/), and to John Gerritsen, education correspondent, Radio New Zealand (John.Gerritsen@radionz.co.nz).

Barkhausen, B. (28 August 2015). New guide shines spotlight on regional universities. University World News, retrieved 28 March 2016 at http://www.universityworldnews.com/article.php?story=2015082805450845

Gerritsen, John (8 May 2015). Universities’ plan to boost rankings. Radio New Zealand, retrieved 3 April 2016 at http://www.radionz.co.nz/news/national/272210/universities’-plan-to-boost-world-rankings

Gerritsen, John (15 Sept 2015). NZ universities’ rankings rise. Radio New Zealand, retrieved 3 April 2016 at http://www.radionz.co.nz/news/national/284234/nz-universities’-rankings-rise

Criticism of ERA:

Henman, Paul (7 Dec 2015) Are Australian universities getting better at research or at gaming the system?  The Conversation, retrieved 28 March 2016 at  http://theconversation.com/are-australian-universities-getting-better-at-research-or-at-gaming-the-system-51895

Source of 2004 THE/QS rankings

Praphamontripong, Prachayani and Levy, Daniel (2005). World university rankings 2004, modified from the Times Higher Education Supplement. In International Association of Universities (2005), International Handbook of Universities, 18th ed., New York, NY: Palgrave Macmillan. http://www.albany.edu/dept/eaps/prophe/data/International_Data/WorldUniversityRanking2004_ModifiedFromTHES.pdf

Ruth’s Rankings

  1. Introduction: Unwinding the Web of International Research Rankings
  2. A Brief History of Rankings and Higher Education Policy
  3. Bibliometrics: What We Count and How We Count
  4. The Big Two: Thomson Reuters and Scopus
  5. Comparing Times Higher Education (THE) and QS Rankings
  6. Scholarly Rankings from the Asian Perspective 
  7. Asian Institutions Grow in Nature
  8. Something for Everyone
  9. Expanding the Measurement of Science: From Citations to Web Visibility to Tweets
  10. Do-It-Yourself Rankings with InCites 
  11. U S News & World Report Goes Global
  12. U-Multirank: Is it for “U”?
  13. A look back before we move forward
  14. SciVal – Elsevier’s research intelligence –  Mastering your metrics
  15. Analyzing 2015-2016 Updated Rankings and Introducing New Metrics
  16. The much maligned Journal Impact Factor
  17. Wikipedia and Google Scholar as Sources for University Rankings – Influence and popularity and open bibliometrics
  18. Rankings from Down Under – Australia and New Zealand
  19. Rankings from Down Under Part 2: Drilling Down to Australian and New Zealand Subject Categories
  20. World Class Universities and the New Flagship University: Reaching for the Rankings or Remodeling for Relevance
  21. Flagship Universities in Asia: From Bibliometrics to Econometrics and Social Indicators
  22. Indian University Rankings – The Good the Bad and the Inconsistent
  23. Are Global Higher Education Rankings Flawed or Misunderstood?  A Personal Critique
  24. Malaysia Higher Education – “Soaring Upward” or Not?
  25. THE Young University Rankings 2017 – Generational rankings and tips for success
  26. March Madness –The rankings of U.S universities and their sports
  27. Reputation, Rankings and Reality: Times Higher Education rolls out 2017 Reputation Rankings
  28. Japanese Universities:  Is the sun setting on Japanese higher education?

*Ruth A. Pagell is currently adjunct faculty teaching in the Library and Information Science Program at the University of Hawaii. Before joining UH, she was the founding librarian of the Li Ka Shing Library at Singapore Management University. She has written and spoken extensively on various aspects of librarianship, including contributing articles to ACCESS. orcid.org/0000-0003-3238-9674