By Ruth A. Pagell*
- Who is the best researcher in the world?
- Who has the most citations in Asia?
- Who has the most citations at Shanghai Jiao Tong University?
- Who is credited with the most citations in Physics and Astronomy at Nanjing Tech?
See Conclusion for answers.
(3 May 2018) Ruth’s Rankings 33 describes the processes that connect researchers with their articles and affiliations. This article lists the tools that disambiguate author names and consolidate their publication lists, and it examines ways to identify top researchers.
AUTHOR IDENTIFIERS
Authors should register their names and be consistent in how they present their names and affiliations. Personal identifiers may be non-proprietary, such as ORCID, or proprietary, offered by companies such as Clarivate Analytics or Elsevier. Identifiers vary in how the IDs are created and in which publications are included.
Non-proprietary
ORCID (Open Researcher and Contributor ID)
ORCID serves as a central identifier registry; as of April 28, 2018, there were over 4.7 million ORCID iDs. Authors register themselves to obtain an ORCID iD. The identifier links authors to other identifiers such as the Scopus Author ID and ResearcherID. ORCID numbers are harvested and appended to Web of Science records monthly.
ORCID provides more information on its FAQ page. ORCID now provides a personal QR Code.
Some publishers, such as PLOS, The Royal Society and Wiley require ORCID iDs.
Figure 34.1. Wiley Banner
Search the ORCID database: https://ORCID.org/ORCID-search/search
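Beyond the web search form, the registry can also be queried programmatically. The sketch below is a minimal, hypothetical example: it assumes ORCID's public REST API at pub.orcid.org (version 3.0) accepts unauthenticated JSON requests for public data, and the response field names are taken from the v3.0 schema, so treat them as assumptions. The iD used is the one in the author note at the end of this article.

```python
# Minimal sketch: fetch one public ORCID record.
# Assumes the ORCID public API (pub.orcid.org, v3.0) allows
# unauthenticated JSON reads of public data; field names follow
# the v3.0 JSON schema and may need adjusting.
import requests

ORCID_ID = "0000-0003-3238-9674"  # iD from the author note below
url = f"https://pub.orcid.org/v3.0/{ORCID_ID}/record"

response = requests.get(url, headers={"Accept": "application/json"}, timeout=30)
response.raise_for_status()
record = response.json()

# Print the registered name and any linked external identifiers
# (e.g. Scopus Author ID, ResearcherID), if present.
name = record["person"]["name"]
print(name["given-names"]["value"], name["family-name"]["value"])
for ext in record["person"]["external-identifiers"]["external-identifier"]:
    print(ext["external-id-type"], ext["external-id-value"])
```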
ISNI (International Standard Name Identifier – ISO 27729)
“ISNI is the globally recognized standard approved by ISO for the unique identification of public identities across all fields of creative activity.” It has records for almost 10 million identities, of which 2.9 million are researchers. ISNIs are created automatically and link to other authority files. The ISNI is not searchable. I have an ISNI, based on a book. Mine links to one of the sources that feeds ISNI, the Virtual International Authority File (VIAF), hosted by OCLC. Embedded in my VIAF record are other identifiers, such as ORCID, the ISNI, a French number and an Israeli number.
Proprietary Identifiers
Both Elsevier and Clarivate Analytics (CA) use author identifiers which were introduced in Ruth’s Rankings 33. Their methodologies differ.
The Scopus Author Identifier automatically disambiguates authors based on name, subject and affiliation, and it includes only authors with articles indexed in Scopus; only Scopus articles appear in the profiles. Authors can request corrections using the Scopus Feedback Wizard. Scopus automatically adds articles to the linked ORCID profile. The algorithm may err by assigning more than one ID to the same researcher or by combining more than one researcher under the same ID.
Using the free Scopus author lookup, it is easy to spot the same author listed under different IDs: https://www.scopus.com/freelookup/form/author.uri
ResearcherID requires self-registration and includes about one million authors. It is integrated with Web of Science and ORCID. Articles indexed in Web of Science are automatically added to the profile, and an h-index is calculated. Researchers must enter any publication not covered by their organization’s WOS subscription. Most WOS authors do not have IDs, and the ID does not help consolidate an author’s works under one name. Search the ResearcherID database.
Web-Based Identifiers
Google Scholar Profiles
Authors create their own profiles, which are verified by email address, as explained in the methodology. Articles indexed in Google Scholar are automatically added. Authors can manually add their own publications or set up a profile for someone else, and they can link to their home page. A sortable list is generated including article title, year and number of citations, and an h-index is calculated.
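Both Google Scholar and ResearcherID report an h-index: a researcher has index h when h of his or her papers have at least h citations each. Below is a minimal sketch of the calculation, using hypothetical citation counts rather than any real author's data.

```python
def h_index(citations):
    """Return the h-index: the largest h such that h papers
    each have at least h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for one author's articles
print(h_index([120, 84, 33, 20, 11, 9, 4, 1]))  # -> 6
```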
Mendeley, owned by Elsevier, is a reference management tool and a place to create your personal profile. Anyone can sign up and create a profile. Mendeley links back to ORCID and Scopus. If you use Mendeley as your reference management tool, the articles in your desktop library can be synchronized with your Mendeley web profile. Statistics include the h-index and citations from Scopus, plus the number of Mendeley readers and ScienceDirect views. Search Mendeley.
On ResearchGate (RG), researchers create their own profiles and add their articles. RG states that it currently has over 14 million members. Names are verified by institutional email account. ResearchGate provides statistics on reads by article, country and institution when that information is available. It also tracks citations within ResearchGate.
Personal Websites
Authors should keep their personal webpages up to date and connect to them through ORCID or Google Scholar. In trying to disambiguate names for the section below on rankings, I ended up using the researchers’ own pages.
The problem is not a lack of tools but too many of them. There are more specialized tools not listed here. Keeping up to date with even one tool is time consuming. Many authors do not have any IDs, many do not check them, and certainly many have not read the methodology. For more detailed information, read the article by Mering (2017) listed below.
IDENTIFYING HIGHLY CITED RESEARCHERS
Two lists regularly track highly cited researchers: one from Clarivate Analytics, using Essential Science Indicators (ESI) and WOS data, and the other from Webometrics, using Google Scholar data. It is a mistake to call the lists “rankings”. They have limitations and cannot be replicated. I examined both lists, cross-checked them, and went back to the underlying metrics from CA and Elsevier. The results were frustrating at best!
CA’s Highly Cited Researchers contains no metrics. The 2017 list includes 3,400 researchers with their names, affiliations, and ESI subject areas. Clinical medicine has the most researchers with 390; engineering has 175 and mathematics has the least with 95. Thirty percent are from the U.S. and UK; China comes in third. Harvard has the most authors, followed by Stanford, Germany’s Max Planck Institute and the Chinese Academy of Sciences. Other Asia-Pacific institutions with more than ten researchers are Peking, Tsinghua and Zhejiang universities, Nanyang Technological University and the University of Hong Kong. The University of Melbourne has more than 20.
The methodology uses 11 years of articles and reviews in the top 1% of citations. CA applies an additional calculation to determine how many researchers to include for each field. The methodology was updated in 2013 in conjunction with the ARWU rankings (Thomson Reuters, 2013). Online, names are presented alphabetically, as shown in Figure 34.2. Search by individual or field, or download the list for re-sorting; see Figure 34.3, re-sorted by university and subject.
Google Scholar Citations - 2016 Highly Cited Researchers (h>100) according to their Google Scholar Citations public profiles
Cybermetrics Lab (part of CSIC, the Consejo Superior de Investigaciones Científicas), first introduced to ACCESS readers in Ruth’s Rankings 9, released the eighth edition of its highly cited researchers list in the first week of April 2018. Ranked researchers must have a Google Scholar profile, described above, and an h-index greater than 100. Rankings are calculated first by h-index and then by number of Google Scholar citations. Indicators for each researcher include rank, researcher (personal name first), organization, h-index and citations. Authors with common names are asked to disable automatic updating of their Scholar profiles.
Cybermetrics acknowledges that “This ranking is far from being complete” since many researchers do not have Google profiles. There are no date limitations and the researcher does not have to be alive. Number one is Sigmund Freud. The Google Scholar Profile is available by clicking on an author’s name. The online list cannot be resorted by any of the indicators. It can be copied into a spreadsheet and manipulated as shown in Table 34.1.
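Once copied out, the list can also be re-ranked in a few lines of code using the same two-level ordering Webometrics applies, h-index first and then citations. The sketch below uses hypothetical records, not data taken from the actual list.

```python
# Minimal sketch: rank researchers the way Webometrics does,
# first by h-index, then by Google Scholar citations (both descending).
# The records are hypothetical placeholders.
researchers = [
    {"name": "Researcher A", "org": "University X", "h_index": 142, "citations": 180000},
    {"name": "Researcher B", "org": "University Y", "h_index": 142, "citations": 210000},
    {"name": "Researcher C", "org": "University Z", "h_index": 151, "citations": 160000},
]

ranked = sorted(researchers, key=lambda r: (r["h_index"], r["citations"]), reverse=True)
for rank, r in enumerate(ranked, start=1):
    print(rank, r["name"], r["org"], r["h_index"], r["citations"])
```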
The top five in the world are:
- Sigmund Freud, University of Vienna, (d. 1939) [Psychology]
- Graham Colditz, Washington University, St. Louis [Medicine, Public Health]
- Ronald C. Kessler, Harvard Medical School [Health Care Policy]
- JoAnn E Manson, Harvard Medical School [Endocrinology]
- Shizuo Akira, Osaka University [Immunology]
Countries are not listed. I scanned the list to identify Asian researchers in the top 100:
39. S B (Soo-Bong) Kim, Seoul National University [Physics]
67. H J Kim, Kyungpook National University [Physics]
92. Yusuke Nakamura, U Tokyo until 2013; now U Chicago [Medicine]
The world’s most highly cited are from medical fields. In China, the pattern is different as shown in Figure 34.4. The top Chinese researcher, coming in at 440, is Xurong Chen from the Chinese Academy of Sciences, in physics.
See Appendix 34.A for using SciVal and InCites, which includes Example 34.1, searching for an author in Scopus, and Table 34.2, the relationship between CA’s Highly Cited Researchers and rankings in ARWU. Appendix 34.B highlights an initiative from THE and includes Table 34.3, showing top authors in the world and Asia in specific subject fields.
CONCLUSION: Who is the best researcher in the world?
2. Shizuo Akira has the most citations in Asia (Google Scholar)
3. Yang Haijen has the most at Shanghai Jiao Tong University (Google Scholar).
4. Huang Wei leads Nanjing Tech in Physics and Astronomy and overall for 2012 through April 2018 using either SciVal or InCites data.
Are Highly Cited researchers the most influential? Using our bibliometric tools, the “best” can only be defined within a specific subject area and time period. However, many highly cited researchers publish in multiple fields, sources assign different subjects to the same researcher, and researchers may use different subjects in their own profiles. Another answer is the list of the world’s 923 (through 2017) Nobel Prize winners, many of whom do not appear on highly cited lists. The best researcher in the world is in the eye of the beholder.
INTERESTING ARTICLES AND UPDATES
False investigators and coercive citation are widespread in academic research, says a London School of Economics and Political Science blog post based on an article written in December 2017. http://blogs.lse.ac.uk/impactofsocialsciences/2018/03/05/false-investigators-and-coercive-citation-are-widespread-in-academic-research/. This addresses one of my concerns: as I studied the citation lists, I found recurring names attached to articles from similar organizations on similar subjects.
Hinchliffe, L.J. (2 May 2018). If ResearchGate is where authors connect and collaborate… Scholarly Kitchen, accessed at https://scholarlykitchen.sspnet.org/2018/05/02/if-researchgate-is-where-authors-connect-and-collaborate/?informz=1
Update to Ruth’s Rankings 29 on geopolitics: Read Naidoo, Rajani (20 April 2018). World class systems rather than world class universities. University World News (502), accessed at http://www.universityworldnews.com/article.php?story=20180417162622337
New Rankings from Times Higher Education: The 10 most beautiful campuses in East Asia, accessed 29 April 2018. https://www.timeshighereducation.com/student/best-universities/10-most-beautiful-universities-east-asia
RESOURCES:
Background articles for Clarivate Analytics:
Clarivate Analytics White Paper (2017). Look up to the brightest stars, accessed at https://clarivate.com/hcr/wp-content/uploads/2017/11/2017-Highly-Cited-Researchers-Report-1.pdf
Clarivate Analytics names the world’s 3,300 most impactful scientific researchers with the release of the 2017 Highly Cited Researchers list. Half have secondary affiliations elsewhere. https://www.prnewswire.com/news-releases/clarivate-analytics-names-the-worlds-most-impactful-scientific-researchers-with-the-release-of-the-2017-highly-cited-researchers-list-300556259.html
Mering, Margaret (2017). Correctly linking researchers to their journal articles: An overview of unique author identifiers. Serials Review, 43(3-4), 265-273.
Thanks to Nobuko Miyairi, ORCID Regional Director, Asia-Pacific (February 2018), for her help in understanding ORCID.
Ruth’s Rankings
A list of Ruth’s Rankings and News Updates is here.
*Ruth A. Pagell is emeritus faculty librarian at Emory University. After working at Emory she was the founding librarian of the Li Ka Shing Library at Singapore Management University and then adjunct faculty [teaching] in the Library and Information Science Program at the University of Hawaii. She has written and spoken extensively on various aspects of librarianship, including contributing articles to ACCESS – https://orcid.org/0000-0003-3238-9674