Ruth’s Rankings 33: Authors: People and processes

By Ruth A. Pagell*

  1. Which country has the most authors?
  2. Which research organization has the most authors in the world?
  3. Which university in Asia has the most authors?
  4. Which author at Nanjing Tech has the most citations in the field of Physics and Astronomy? (1)

(17 March 2018) Two themes run through this article: the processes Clarivate Analytics and Elsevier use to identify authors and assign the affiliations that are incorporated into rankings; and the growth in the number of authors worldwide.  In SciVal, China leads the USA in number of authors, and in InCites it is growing several times faster than the USA. Part One will answer questions one through three.

I became interested in the relationship among authors, publications, and citations while preparing my seminars for Hong Kong and Singapore. Thinking of scholarly communication as a process, the author provides the input, the publication is the output, and the citation is the outcome.  For accurate rankings, authors’ institutional affiliations must be correct.

The first step in understanding author data is understanding how the data providers count authors, their institutions, and their countries.

Using Scopus data, China leads the world with almost 20% more authors than the USA.  Compared to the US, it has two-thirds the number of publications, half the citations, and a higher author growth rate. Twelve countries make up the top ten, as seen in Table 33.1, and seven are top ten in all three categories.  Four are below the world average in SciVal’s Field Weighted Citation Impact (Colledge, pg. 69).  Taiwan is losing ground to China: it is the only major East Asian country showing a negative growth rate in number of authors, and it also has the largest percentage loss in number of publications. This is attributed to China’s equal status policy (Leung and Sharma, 2018).

See Appendix 33.A.1 for a complete list of Asian countries. The ASEAN countries of Indonesia, Vietnam, Myanmar, Laos and Philippines have high rates of growth along with Sri Lanka and North Korea.

Appendix 33.A includes the following tables, with data from Scopus on institutions with the top ten authors, publications, and citations, as presented in SciVal.

  • Table 33.A.2: World Rankings. China has four institutions for authors and the US has four institutions for citations. The Chinese Ministry of Education has the most authors, the Chinese Academy of Sciences the most publications, and Harvard University the most citations.  Four universities are in the top ten for all three categories.
  • Table 33.A.3: Asia. Shanghai Jiao Tong University has the most authors and publications and the University of Tokyo the most citations.  Six of the top ten for authors are Chinese.  Hong Kong, which has no university in the top 100 in any of the three main categories, has two of the top three field weighted citation impact scores, ranked between 1100 and 1200 in the world.
  • Table 33.A.4: China. Thirteen Chinese institutions make up the top ten in all categories; 29 are in the top 100 in number of authors.

AUTHOR IDENTIFICATION from DATA PROVIDERS

We introduced Thomson Reuters (now Clarivate Analytics, CA) and Elsevier in Ruth’s Rankings 4, with two follow-up articles: Ruth’s Rankings 10 (InCites) and Ruth’s Rankings 14 (SciVal). RR 33 investigates how they handle author names and assign credit to authors, institutions, and countries.  I base the results on my own drilling down into the data and on information directly from the providers.

The providers use the terms authors, researchers, people, and names. When referring to a specific product, I will use the official terminology, and I will use authors for general discussion.

A major issue in the first years of rankings was name disambiguation. This was especially true for Asian names. The legacy Web of Science, based on 1970 printed citation indexes, used “last name” and initials. Although disambiguation of author names is no longer in the forefront of concerns today, drilling down in CA and Elsevier products reveals that there are still problems in attaching names to articles.

Both companies use a similar process when compiling the data.

Each publication is counted one time in Web of Science and Scopus.

  • The publication belongs to the researcher/author and, in today’s scholarly environment, to multiple authors. The publication follows authors as they change affiliations, but
  • institutional and country affiliations are assigned to the publication at the time of publication, which means that
  • one publication appears multiple times, under each author, institution, and country, in InCites and SciVal.
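The counting rules above can be sketched in a few lines of Python. This is a hypothetical mini-dataset; the names, affiliations, and counts are invented for illustration, not drawn from either database:

```python
from collections import Counter

# Hypothetical mini-dataset: each publication is stored once, but carries
# every author with the affiliation and country recorded at publication time.
publications = [
    {"id": "P1", "authors": [("Author A", "Univ X", "China"),
                             ("Author B", "Univ Y", "USA")]},
    {"id": "P2", "authors": [("Author A", "Univ X", "China")]},
]

# In the source database, each publication is counted once.
total_publications = len(publications)

# In the analytics tools, the same publication is credited once to every
# distinct author, institution, and country on its byline.
author_counts, inst_counts, country_counts = Counter(), Counter(), Counter()
for pub in publications:
    for name in {a[0] for a in pub["authors"]}:
        author_counts[name] += 1
    for inst in {a[1] for a in pub["authors"]}:
        inst_counts[inst] += 1
    for country in {a[2] for a in pub["authors"]}:
        country_counts[country] += 1

print(total_publications)         # 2
print(author_counts["Author A"])  # 2: credited for both publications
print(country_counts["China"])    # 2: both papers have a China-affiliated author
```

The totals no longer add up across views: two unique publications produce two author credits for Author A alone, which is why summing author-level counts overstates output.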

Clarivate Analytics

If your last name is Pagell, attaching the right articles to the right author is easy: the databases have only four Pagells.  What if you are looking for an author whose name is HUANG Wei? Searching the name in WoS, I retrieved about 9,000 publications, since the search matches the family name and initials. The author index does not help, since it does not include the personal name. ResearcherID, discussed in more detail in Part Two of this article, allows authors to register and consolidate their articles under one name.  See Appendix 33.B for an interview with CA’s Product Development manager explaining CA’s process.
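The collision is easy to reproduce: once a full name is reduced to family name plus first initial, distinct people collapse onto a single index entry. A minimal sketch, with invented names:

```python
from collections import defaultdict

# Invented full names; legacy citation indexing stored only family name + initials.
full_names = ["HUANG Wei", "HUANG Wen", "HUANG Weijie", "PAGELL Ruth"]

by_key = defaultdict(list)
for name in full_names:
    family, given = name.split(" ", 1)
    by_key[f"{family} {given[0]}"].append(name)  # key like "HUANG W"

print(by_key["HUANG W"])  # three different people share one index entry
```

With common family names and only an initial to distinguish given names, one key accumulates thousands of real people, which is the 9,000-publication result in miniature.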

Within WoS, each record includes all authors’ names and affiliations, the number of citations, and a usage count.  Also provided as filters are Organization-Enhanced affiliations, collaborating authors, subject areas, and types of publications. The database generates an author citation report for a data set of up to 10,000 records.  The report displays total publications, h-index, cites per item, times cited, and citing articles.

I usually go to InCites for comparison data and tables, even though InCites cannot generate tables like those in Appendix 33.A from SciVal. InCites has a module called People.  The People module tracks NAMES: every variation of an author’s name and affiliation is a different entry. The module starts in 1980 and has over 39 million names. The name with the most publications is “Anonymous”. Names with no affiliations are also listed. See Example 33.1.A for an author with multiple name entries from InCites.

InCites does have over 30 different author metrics available in tabular form and visualizations in a Research Report (Figure 33.1):

  • Documents: Number of publications cited one or more times (in WOS publications); count of documents in journals with a JIF in a given time frame (Journal Impact Factor from JCR, Journal Citation Reports); count of documents in Q1, Q2, Q3 or Q4 JIF journals in a given time frame;
  • Papers – From ESI (Essential Science Indicators): Number of highly cited papers (articles and reviews) that rank in the top 1% by field and year; percent of highly cited publications in top 1% by field and year; hot papers, publications in top 0.1% by citations for field and age;
  • Collaboration:  Percent of publications with international and industrial co-authors
  • Demographics:  Affiliations and country/region
  • Citations:   Category normalized citation impact, normalized for subject, year and document type; citation impact, average (mean) citations per paper; journal normalized citation impact, citations per paper normalized for journal, year and document type
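The arithmetic behind the citation metrics in the list above can be made concrete. This is a hedged sketch, not InCites code, and the baseline values are invented: category normalized citation impact divides each paper’s citations by the expected (mean) citations for its subject, year, and document type, then averages the ratios, so 1.0 means world average.

```python
# Invented expected-citation baseline by (subject category, year, document type).
baseline = {
    ("Physics", 2015, "Article"): 8.0,
    ("Physics", 2017, "Article"): 4.0,
}

papers = [
    {"cat": "Physics", "year": 2015, "type": "Article", "citations": 12},
    {"cat": "Physics", "year": 2017, "type": "Article", "citations": 4},
]

# Citation impact: plain mean citations per paper.
citation_impact = sum(p["citations"] for p in papers) / len(papers)

# Category normalized citation impact: mean of per-paper ratios of actual
# citations to the expected value for the paper's category, year, and type.
ratios = [p["citations"] / baseline[(p["cat"], p["year"], p["type"])]
          for p in papers]
cnci = sum(ratios) / len(ratios)

print(citation_impact)  # 8.0
print(cnci)             # (12/8 + 4/4) / 2 = 1.25
```

Normalization matters because a raw mean rewards older papers in highly cited fields; here the 2017 paper is exactly at its (invented) field average even though its raw count is low.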

Elsevier

Elsevier starts its author process in Scopus.  Unlike WoS and InCites, which have no disambiguation or consolidation tools within the databases, Elsevier has tried to disambiguate and then consolidate author names automatically. This leads to fewer records for the same author, as seen in Example 33.1.B. Authors receive a Scopus identifier. Authors who are aware of Scopus and its profiles have the opportunity to make changes using the Scopus Feedback Wizard.  For each profile, Scopus provides a document/citation chart and the total number of documents, citations, and h-index with an h-index graph.
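Elsevier’s actual matching algorithm is proprietary; as a loose illustration of what automatic consolidation does, the naive rule below merges records that share family name, first initial, and affiliation. All records here are invented:

```python
from collections import defaultdict

# Invented raw records: one person under two name variants, plus one record
# whose affiliation differs.
records = [
    {"name": "Huang, W.",  "affiliation": "Univ X", "doc": "P1"},
    {"name": "Huang, Wei", "affiliation": "Univ X", "doc": "P2"},
    {"name": "Huang, Wei", "affiliation": "Univ Y", "doc": "P3"},
]

def profile_key(rec):
    # Naive consolidation rule: family name + first initial + affiliation.
    family, given = rec["name"].split(", ")
    return (family.lower(), given[0].lower(), rec["affiliation"])

profiles = defaultdict(list)
for rec in records:
    profiles[profile_key(rec)].append(rec["doc"])

# P1 and P2 merge into one profile; P3 stays separate because the affiliation
# differs, one reason a single author can still end up with multiple IDs.
print(dict(profiles))
```

Any rule this coarse both over-merges (two real Huang W.s at the same university) and under-merges (one author who moved), which is why a feedback mechanism like the Scopus wizard is still needed.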

Figure 33.2.  Scopus Author Graph

To access author data for downloading and re-sorting, I use SciVal, which has two limitations: it includes only a few preset time periods, and it displays only the top 500 authors based on publications.  The problem with ranking by publications is that three of the top ten authors in the world have a total of one citation.

In SciVal, tabular data includes author names, publications, latest publication date, citations, and h-index. SciVal also has more appealing visualization tools than InCites. Scopus datasets can be imported into SciVal, as shown as part of the following example.

EXAMPLE 33.2:  Multiple Authors

To get a better understanding of the processes, especially articles with multiple authors and affiliations, I sent a simple sample publication to both Clarivate Analytics and Elsevier. Example 2 starts with the article as it appears on the journal website and tracks the steps that CA (Example 33.2A) and Elsevier (Example 33.2.B) use to incorporate the authors’ names into their products.

CONCLUSION:

Ruth’s Rankings 33 explains how an author’s name becomes part of the bibliometrics at the institutional and country affiliation levels. Author data alone are NOT reliable for measuring an institution’s or country’s performance. Issues with name disambiguation continue.  The increase in the number of authors, especially Asian authors, results in more authors with similar family names. The author with the most publications in InCites is “Anonymous”.

There also seems to be a trend toward including more authors per article.  This includes the possibility that well-cited authors’ names are added to an article to attract more citations. I discovered an author in InCites who had over 300 publications in one year.  I ran this by a journal editor, who told me that medical journals, like funding agencies, are now asking for an explicit statement of each author’s contribution.

Ruth’s Ranking 34, Part Two of Authors, will explore the different tools available for authors to control how their names appear. It will also look at lists of highly cited authors and collaborations.

References:

(1)  From SciVal, searched 18 February 2018, for 2014-2018

Leung, M. and Sharma, Y. (2018). China’s equal status policy causes Taiwan brain drain.  University World News, 1 March 2018, Issue No. 495. Accessed at http://www.universityworldnews.com/article.php?story=20180301165622684

Clarivate Analytic Resources

InCites Indicators Handbook

Pagell, Ruth A.  (2015).  InCite’s benchmarking and analytics capabilities. Online Searcher. 39(1) 16-21.

Pagell, Ruth A.  (2014).  Insights into InCites:  Journal citation reports and Essential Science Indicators.  Online Searcher.  38(6).  16-19.

Elsevier resources

Colledge, L. and Verlinde, R. (February 2014). SciVal Metrics Guidebook, version 1.01, pg. 11. Accessed 3/11/18 at http://www.elsevier.com/__data/assets/pdf_file/0020/53327/scival-metrics-guidebook-v1_01-february2014.pdf

SciVal User Guide (updated February 4, 2015). Accessed at http://www.op.mahidol.ac.th/orra/SciVal/SciVal_USER_GUIDE.pdf.

Scopus Content Coverage Guide (August 2017). Pg. 13, accessed March 2018 at https://www.elsevier.com/__data/assets/pdf_file/0007/69451/0597-Scopus-Content-Coverage-Guide-US-LETTER-v4-HI-singles-no-ticks.pdf

Ruth’s Rankings

  1. Introduction: Unwinding the Web of International Research Rankings
  2. A Brief History of Rankings and Higher Education Policy
  3. Bibliometrics: What We Count and How We Count
  4. The Big Two: Thomson Reuters and Scopus
  5. Comparing Times Higher Education (THE) and QS Rankings
  6. Scholarly Rankings from the Asian Perspective 
  7. Asian Institutions Grow in Nature
  8. Something for Everyone
  9. Expanding the Measurement of Science: From Citations to Web Visibility to Tweets
  10. Do-It-Yourself Rankings with InCites 
  11. U S News & World Report Goes Global
  12. U-Multirank: Is it for “U”?
  13. A Look Back Before We Move Forward
  14. SciVal – Elsevier’s research intelligence –  Mastering your metrics
  15. Analyzing 2015-2016 Updated Rankings and Introducing New Metrics
  16. The much maligned Journal Impact Factor
  17. Wikipedia and Google Scholar as Sources for University Rankings – Influence and popularity and open bibliometrics
  18. Rankings from Down Under – Australia and New Zealand
  19. Rankings from Down Under Part 2: Drilling Down to Australian and New Zealand Subject Categories
  20. World Class Universities and the New Flagship University: Reaching for the Rankings or Remodeling for Relevance
  21. Flagship Universities in Asia: From Bibliometrics to Econometrics and Social Indicators
  22. Indian University Rankings – The Good the Bad and the Inconsistent
  23. Are Global Higher Education Rankings Flawed or Misunderstood?  A Personal Critique
  24. Malaysia Higher Education – “Soaring Upward” or Not?
  25. THE Young University Rankings 2017 – Generational rankings and tips for success
  26. March Madness –The rankings of U.S universities and their sports
  27. Reputation, Rankings and Reality: Times Higher Education rolls out 2017 Reputation Rankings
  28. Japanese Universities:  Is the sun setting on Japanese higher education?
  29. From Bibliometrics to Geopolitics:  An Overview of Global Rankings and the Geopolitics of Higher Education edited by Ellen Hazelkorn
  30. Hong Kong and Singapore: Is Success Sustainable?
  31. Road Trip to Hong Kong and Singapore – Opening new routes for collaboration between librarians and their stakeholders
  32. The Business of Rankings – Show me the money
  33. Authors:  People and processes
  34. Authors: Part 2 – Who are you? 
  35. Come together:  May updates lead to an investigation of Collaboration
  36. Innovation, Automation, and Technology Part 1:  From Scholarly Articles to Patents;     Innovation, Automation, and Technology Part 2: Innovative Companies and Countries
  37. How Important are Journal Quality Metrics in the Era of Predatory Journals? Part 1: Journal Citation Metrics; Part 2: How Important are Journal Quality Metrics in the Era of Potential/ possible/ probable predatory publishers and publications?

*Ruth A. Pagell is emeritus faculty librarian at Emory University.  After working at Emory she was the founding librarian of the Li Ka Shing Library at Singapore Management University and then adjunct faculty [teaching] in the Library and Information Science Program at the University of Hawaii.  She has written and spoken extensively on various aspects of librarianship, including contributing articles to ACCESS – https://orcid.org/0000-0003-3238-9674
