By Ruth A. Pagell*
(12 January 2016) Citation counts are core bibliometric indicators, used by rankers to measure scholarly impact and by administrators to measure faculty output. The questions that follow are:
Is being cited enough, or do you have to be cited in a high impact source?
And if a “high impact” source is desired, how is it determined and should it be a factor in ranking institutions?
The answer is the “Journal Impact Factor”, a center of controversy among authors who write about bibliometrics and authors who are concerned with its use or misuse at the individual level for determining hiring, promotion and tenure.
The Metric Tide (Wilsdon, 2015), a review prepared for HEFCE (Higher Education Funding Council for England), highlights concerns over citation counts and journal impact factors. Section 6.2 deals with university rankings and league tables and raises all the issues that we have discussed throughout this series. Following up on this theme, Karin Wulf (2015) asks in a Scholarly Kitchen post, “If We Don’t Know What Citations Mean, What Does It Mean When We Count Them?”
This article is limited to clarifying journal citation metrics in relation to publication quality and, peripherally, institutional rankings. To be consistent with terminology, I use “articles” for individual documents (journal articles, proceedings papers, books, etc.) and “publications” for the journals, proceedings, books, etc. in which they appear and which are being cited.
To be ranked, institutions need a corpus of articles and more importantly articles that have been cited by some other publication. Garfield (1955, 2006) introduced the concept of Journal Impact in 1955 as a way to select publications for inclusion in Science Citation Index. He re-examined its current usage fifty years later. Since its first edition in 1975, Journal Citation Reports has tracked individual journals within fields, adding publications and metrics over the years.
The following are issues surrounding citations and journal impact:
- Variations in journal lists and ranks year on year
- Impact rank depends on category
- Different providers of journal impact use different categories
- Impact rank depends on number of journals in category
- Annual computation in JCR (but ongoing from Scopus sources)
Questions surrounding citations and journal impact:
- Are we interested in the impact at the time the article is published, or over a variable time period, since some disciplines are cited more quickly than others?
- What about the impact of the citing journals? The reputation of the citing institutions?
- Does publishing in a high impact journal signify a high impact article?
- Are there other measures that better represent the quality of a publication?
Another issue in citation counts at the individual level, identified in a broadcast on U.S. National Public Radio (Vedantam, 2015; Feenberg et al., 2015), is that order matters. Articles or authors’ names that are first in a list (which is often alphabetical) get cited more than those that follow.
Most rankings in Ruth’s Rankings use some measure(s) that incorporates citations into their methodology. The weightings, years covered, type of articles included, the metrics and the definitions may vary.
METHODOLOGY FOR CITATION AND JOURNAL IMPACT
The impact factor for publications is a ratio of the citations received during a fixed time period to the number of documents published during a fixed time period. Using JCR as an example, the 2014 journal impact factor for the Journal of the American Society for Information Science and Technology (JASIST) is 1.846. The impact factor for its category, Information Science & Library Science, is 1.38, while the impact factor for its other category, Computer Science, Information Systems, is 1.56. To calculate the impact for a journal or category, the following formula is used, shown here for JASIST. See Example 16.1 (in pdf) for the effect of subject on two different journals.
Cites in 2014 to articles published in:    Number of items published in:
2013: 232                                  2013: 186
2012: 451                                  2012: 184
Sum: 683                                   Sum: 370

Calculation: Recent cites / Recent articles = 683 / 370 = 1.846
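To make the arithmetic concrete, here is a minimal sketch of the two-year calculation in Python, using the JASIST figures above (the function and variable names are mine, for illustration only):

```python
def two_year_impact_factor(cites_to_recent_items, recent_items_published):
    """Citations received this year to items from the previous two years,
    divided by the number of items published in those two years."""
    return cites_to_recent_items / recent_items_published

# JASIST, JCR 2014 edition
cites = 232 + 451   # 2014 cites to 2013 items + 2014 cites to 2012 items = 683
items = 186 + 184   # items published in 2013 + items published in 2012 = 370
print(round(two_year_impact_factor(cites, items), 3))  # 1.846
```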
SOURCES FOR CITATION AND JOURNAL METRICS
Underlying data come from Thomson Reuters’ Web of Science and Elsevier’s Scopus. Google Scholar entered the “journal impact” arena using a five-year h-index. Each source counts only citations from publications within that source: Scopus counts only citations from publications covered by Scopus, and JCR counts only citations from publications covered in JCR. Google Scholar counts all “citations” retrieved in Scholar. See Table 16.1, Journal Metrics by Analyzing Source, for the different metrics available from these sources. The Table also includes the applicable Snowball metrics.
Thomson Reuters Journal Citation Reports
Web of Science (WOS) data appear in two add-on products, Journal Citation Reports (JCR) and InCites Analytics. WOS provides data for individual authors and publications, including an h-index for the publication (limited to 10,000 articles) and a new metric for an article, usage count. If your institution subscribes to JCR, you can access the impact factor of the publication for any article you retrieve. JCR has migrated to the InCites platform, which interfaces with WOS. WOS links to a second subscription source for publication analysis in its InCites Analytics product under “Publications.” This iteration disaggregates the journal and applies subject categories at the article level; several articles from Nature appear under the Information Science & Library Science category.
With JCR on the InCites platform, you can compare publications by categories and countries. JCR incorporates the Eigenfactor, also available freely on the web, in which journals are “influential” if they are cited often by other influential journals.
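That circular-sounding definition — influential journals are those cited by influential journals — is computed as an eigenvector problem. Below is a minimal Python sketch of the idea using a made-up three-journal citation matrix; the real Eigenfactor algorithm adds refinements such as excluding journal self-citations and damping, so this is an illustration of the principle, not the actual algorithm:

```python
import numpy as np

# Made-up citation counts for three hypothetical journals:
# C[i, j] = citations from journal j to journal i.
C = np.array([[0., 3., 4.],
              [5., 0., 6.],
              [1., 2., 0.]])

# Normalize each citing journal's outgoing citations to sum to 1,
# so every journal casts one "vote" split across the journals it cites.
H = C / C.sum(axis=0)

# Power iteration: a journal's influence is the citation-weighted sum
# of the influence of the journals that cite it.
influence = np.full(3, 1 / 3)
for _ in range(100):
    influence = H @ influence

print(influence / influence.sum())  # relative influence scores
```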
For more in-depth information about Journal Citation Reports, see a reprint of “Insights into InCites” (Pagell, 2014) and “The state of journal evaluation: Understanding the journal impact factor and other key metrics” (Testa, 2015).
Journal Metrics Supported by Elsevier
A Scopus search provides citation analysis at the author level, including total citations and an h-index. At the article level, Scopus includes field-weighted citation impact and rank in field, normalized for field, date and type of publication. It also includes alternative metrics, such as Mendeley readers and Altmetric counts of tweets, blog posts and mass media stories.
Unlike Thomson Reuters, Elsevier does not perform its own journal analytics. It integrates the analyses from SCImago’s Journal Rank (SJR) and CWTS’ SNIP (Source Normalized Impact per Paper) and IPP (Impact per Publication). I advise users to go directly to the open access SJR Journal Ranking or CWTS Journal Indicators websites.
SJR is the most complete free journal ranking site. You can download the dataset of over 12,000 publications with at least 100 cites, out of the total dataset of almost 23,000 publications. SJR includes a composite score, and the algorithm is available for those who are mathematically inclined. You can also search by subject and country.
CWTS calculates two journal rankings, SNIP and IPP, both based on journals in Scopus. SNIP weights citations based on subject field and accounts for differences among journal types, such as basic vs. applied (Moed, 2010). The dataset is available for downloading but does not include the number of publications.
Google Scholar uses a five-year h-index and median h-index for its top 100 journals and for the twenty top journals in a variety of broad and specific subfields, based on what Google considers “citations”. The publications included vary based on how your browser is configured.
The h-index is the intersection of the number of citations, listed in descending order, with the rank of the citation. Over the five-year time period 2011-2015, MIS Quarterly published 287 articles indexed in Web of Science (where I have access to the underlying data) and has an h-index of 30: arrange the articles in descending order by number of citations, and the 30th document has at least 30 citations. Scholar gives MIS Quarterly an h-index of 74 for the same time period because it is using a different citations list.
The h-index has its own issues in addition to varying from source to source: it understates the impact of highly cited articles and disregards uncited ones.
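For readers who want the computation spelled out, here is a minimal sketch in Python of the h-index calculation described above, using made-up citation counts:

```python
def h_index(citations):
    """Largest h such that at least h articles have h or more citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank   # the article at this rank still has >= rank citations
        else:
            break
    return h

# Seven hypothetical articles and their citation counts:
print(h_index([25, 8, 5, 4, 4, 1, 0]))  # 4: four articles have >= 4 citations each
```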
JOURNAL METRICS IN RANKINGS
Nature Index and ARWU do not use any citation metrics although they both include articles from Nature.
Of all the rankers, NTU-Taiwan puts the most emphasis on citations: 75% of its metrics are based on total citation counts, including 15% from the number of articles in high impact journals for the past year of data. High impact journals are defined as those in the top 5% of their field as calculated by JCR. “Social Sciences” is included as a broad ranking category but does not include the JCR subcategories.
CONCLUSION
The metrics used to measure journal quality have their weaknesses. Example 16.2 (in pdf) shows the differences in ranking between the basic metrics from the InCites platform and the Journal Impact Factor from JCR, which include different article types and different time ranges.
Personally, I do not dismiss the process as is suggested in Moustafa’s (2015) cry about the “disaster of the impact factor.” Many of the problems are not in the metrics but in the system, which at an institutional level encourages faculty to publish in high impact journals. Bibliometricians are working on creating metrics that account for the known differences in citation counts. The critics offer no viable alternatives for maintaining quality in an environment of increasing numbers of journals, at the same time that articles are disaggregated from publications.
Only when you can see a wide range of data, such as in the two subscription Thomson Reuters packages or in the free SJR, can you make your own informed decision on which journals are impactful for you, based on metric, subject and country.
NOTES
See Table 16.2, which compares journals across the ranking sources.
For an easy read on journal metrics, see Nature Methods’ editorial “On impact”.
REFERENCES
Used in the article
Ellegaard, O., & Wallin, J. A. (2015). The bibliometric analysis of scholarly production: How great is the impact? Scientometrics, 105(3), 1809-1831.
Feenberg, D., Ganguli, I., Gaule, P., & Gruber, J. (2015). It’s good to be first: Order bias in reading and citing NBER working papers. NBER Working Paper No. 21141, May 2015. Accessed 9 January 2016 at http://www.nber.org/papers/w21141.
Garfield, E. (1955). Citation indexes for science. Science, 122(3159), 108-111.
Garfield, E. (2006). The history and meaning of the journal impact factor. Journal of the American Medical Association, 295(1), 90-93.
González-Pereira, B., Guerrero-Bote, V. P., & Moya-Anegón, F. (2012). A further step forward in measuring journals’ scientific prestige: The SJR2 indicator. Journal of Informetrics, 6(4), 674-688. Retrieved 8 January 2015 from http://arxiv.org/ftp/arxiv/papers/0912/0912.4141.pdf.
Medium.com (2015). Should your academic CV (or résumé) go digital, at last? Writing for Research, 27 April 2015. Accessed 9 January 2016 at https://medium.com/advice-and-help-in-authoring-a-phd-or-non-fiction/should-your-resum%C3%A9-or-cv-go-digital-at-last-23ef784c013b#.z7eqatu4i.
Moed, H. F. (2010). Measuring contextual citation impact of scientific journals. Journal of Informetrics, 4(3), 265-277.
Moustafa, K. (2015). The disaster of the impact factor. Science and Engineering Ethics, 21(1), 139-142. Accessed 9 January 2016 at http://philpapers.org/archive/MOUTDO-2.pdf.
On impact (2015). Nature Methods, 12, 893. Published online 30 June 2015, doi:10.1038/nmeth.3520. Accessed 8 January 2016 at http://www.nature.com/nmeth/journal/v12/n8/full/nmeth.3520.html.
Pagell, R. A. (2014). Insights into InCites. Online Searcher, 38(6), 16-19. Available at https://librarylearningspace.com/wp-content/uploads/2015/05/Insights-Into-InCites.pdf, accessed 9 January 2016.
Testa, J. (2015). The state of journal evaluation: Understanding the journal impact factor and other key metrics. Accessed 26 December 2015 at http://www.diglib.um.edu.my/interaktif/edocs/State%20of%20Journal%20Evaluation.pdf.
Vedantam, S. (2015). No. 1 with a bullet point: To get research cited, make sure it’s listed first. National Public Radio, 15 July 2015. Accessed 9 January 2016 at http://www.npr.org/2015/07/15/423101360/no-1-with-a-bullet-point-to-get-research-cited-make-sure-its-listed-first.
Wilsdon, J., et al. (2015). The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management. DOI: 10.13140/RG.2.1.4929.1363. Accessed 10 January 2016 at http://www.hefce.ac.uk/pubs/rereports/Year/2015/metrictide/Title,104463,en.html.
Wouters, P. et al. (2015). The Metric Tide: Literature Review (Supplementary Report I to the Independent Review of the Role of Metrics in Research Assessment and Management). HEFCE. DOI: 10.13140/RG.2.1.5066.3520.
Wulf, K. (4 August 2015). If we don’t know what citations mean, what does it mean when we count them? Scholarly Kitchen. Accessed 26 December 2015 at http://scholarlykitchen.sspnet.org/2015/08/04/if-we-dont-know-what-citations-mean-what-does-it-mean-when-we-count-them/.
Extracted From Metric Tide Literature Review
Abramo, G., D’Angelo, C. A., & Di Costa, F. (2010). Citations versus journal impact factor as proxy of quality: Could the latter ever be preferable? Scientometrics, 84(3), 821-833.
Archambault, É., & Larivière, V. (2009). History of the journal impact factor: Contingencies and consequences. Scientometrics, 79(3), 635-649.
Bollen, J., Van De Sompel, H., Smith, J. A., & Luce, R. (2005). Toward alternative metrics of journal impact: A comparison of download and citation data. Information Processing & Management, 41(6), 1419-1440.
Campanario, J. M. (2011). Empirical study of journal impact factors obtained using the classical two-year citation window versus a five-year citation window. Scientometrics, 87(1), 189-204.
Elkins, M. R., Maher, C. G., Herbert, R. D., Moseley, A. M., & Sherrington, C. (2010). Correlation between the journal impact factor and three other journal citation indices. Scientometrics, 85(1), 81-93.
Glänzel, W., & Moed, H. F. (2002). Journal impact measures in bibliometric research. Scientometrics, 53(2), 171-193.
Harzing, A. W. & Van der Wal, R. (2009). A Google Scholar h‐index for journals: An alternative metric to measure journal impact in economics and business. Journal of the American Society for Information Science and Technology, 60(1), 41-46.
Leydesdorff, L., & Opthof, T. (2010). Scopus’s source normalized impact per paper (SNIP) versus a journal impact factor based on fractional counting of citations. Journal of the American Society for Information Science and Technology, 61(11), 2365-2369.
Leydesdorff, L., Zhou, P., & Bornmann, L. (2013a). How can journal impact factors be normalized across fields of science? An assessment in terms of percentile ranks and fractional counts. Journal of the American Society for Information Science and Technology, 64(1), 96-107.
Moed, H. F., & Van Leeuwen, T. N. (1995). Improving the accuracy of Institute for Scientific Information’s journal impact factors. Journal of the American Society for Information Science, 46(6), 461-467.
Mutz, R., & Daniel, H. D. (2012a). The generalized propensity score methodology for estimating unbiased journal impact factors. Scientometrics,92(2), 377-390.
Mutz, R., & Daniel, H. D. (2012b). Skewed citation distributions and bias factors: Solutions to two core problems with the journal impact factor. Journal of Informetrics, 6(2), 169-176.
Seglen, P. O. (1994). Causal relationship between article citedness and journal impact. Journal of the American Society for Information Science, 45(1), 1-11.
Ruth’s Rankings
A list of Ruth’s Rankings and News Updates is here.
*Ruth A. Pagell is currently an adjunct faculty [teaching] in the Library and Information Science Program at the University of Hawaii. Before joining UH, she was the founding librarian of the Li Ka Shing Library at Singapore Management University. She has written and spoken extensively on various aspects of librarianship, including contributing articles to ACCESS – https://orcid.org/0000-0003-3238-9674 .