OSTI.GOV, U.S. Department of Energy
Office of Scientific and Technical Information

Title: Do citations and readership identify seminal publications?

Abstract

This work presents a new approach for analysing the ability of existing research metrics to identify research that has strongly influenced future developments. More specifically, we focus on the ability of citation counts and Mendeley reader counts to distinguish between publications regarded by field experts as seminal and publications regarded as literature reviews. The main motivation behind our research is to gain a better understanding of whether, and how well, existing research metrics relate to research quality. For this experiment we created a new dataset, which we call TrueImpactDataset, containing two types of publications: seminal papers and literature reviews. Using the dataset, we conduct a set of experiments to study how citation and reader counts perform in distinguishing these publication types, following the intuition that causing a change in a field signifies research quality. Our research shows that citation counts distinguish important seminal research papers from literature reviews better than a random baseline (by a margin of 10%), while Mendeley reader counts do not perform better than the baseline.
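
As a rough illustration of the evaluation idea described in the abstract, the sketch below (not the authors' code, and using invented data rather than TrueImpactDataset) shows how a metric such as citation count can be scored on pairwise comparisons between the two publication types, where 0.5 corresponds to the random baseline:

```python
# Hypothetical sketch: does a metric (e.g. citation count) separate
# "seminal" papers from "literature reviews" better than chance?
# Pairwise accuracy over all (seminal, review) pairs is equivalent to AUC;
# 0.5 is the random baseline the paper compares against.
import random

def pairwise_accuracy(seminal_counts, review_counts):
    """Fraction of (seminal, review) pairs in which the seminal paper
    has the higher count; ties count as half. Returns a value in [0, 1]."""
    wins = 0.0
    for s in seminal_counts:
        for r in review_counts:
            if s > r:
                wins += 1.0
            elif s == r:
                wins += 0.5
    return wins / (len(seminal_counts) * len(review_counts))

# Synthetic illustration only: these counts are invented, not real data.
random.seed(0)
seminal = [random.randint(50, 500) for _ in range(20)]
reviews = [random.randint(20, 300) for _ in range(20)]
acc = pairwise_accuracy(seminal, reviews)
print(f"pairwise accuracy: {acc:.2f} (random baseline: 0.50)")
```

A margin of 10% over the baseline, as reported for citation counts, would correspond to a pairwise accuracy of roughly 0.60 under this kind of scoring.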

Authors:
Herrmannova, Drahomira [1]; Patton, Robert M. [2]; Knoth, Petr [3]; Stahl, Christopher G. [2]
  1. Open Univ., Milton Keynes (United Kingdom); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
  2. Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
  3. Open Univ., Milton Keynes (United Kingdom)
Publication Date:
February 10, 2018
Research Org.:
Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (United States)
Sponsoring Org.:
USDOE
OSTI Identifier:
1425336
Grant/Contract Number:  
AC05-00OR22725
Resource Type:
Journal Article: Accepted Manuscript
Journal Name:
Scientometrics
Additional Journal Information:
Journal Volume: 115; Journal Issue: 1; Journal ID: ISSN 0138-9130
Country of Publication:
United States
Language:
English
Subject:
99 GENERAL AND MISCELLANEOUS; Information retrieval; Scholarly communication; Publication datasets; Data mining; Research evaluation; Bibliometrics; Altmetrics

Citation Formats

Herrmannova, Drahomira, Patton, Robert M., Knoth, Petr, and Stahl, Christopher G. Do citations and readership identify seminal publications?. United States: N. p., 2018. Web. doi:10.1007/s11192-018-2669-y.
Herrmannova, Drahomira, Patton, Robert M., Knoth, Petr, & Stahl, Christopher G. Do citations and readership identify seminal publications?. United States. https://doi.org/10.1007/s11192-018-2669-y
Herrmannova, Drahomira, Patton, Robert M., Knoth, Petr, and Stahl, Christopher G. 2018. "Do citations and readership identify seminal publications?". United States. https://doi.org/10.1007/s11192-018-2669-y. https://www.osti.gov/servlets/purl/1425336.
@article{osti_1425336,
title = {Do citations and readership identify seminal publications?},
author = {Herrmannova, Drahomira and Patton, Robert M. and Knoth, Petr and Stahl, Christopher G.},
abstractNote = {This work presents a new approach for analysing the ability of existing research metrics to identify research that has strongly influenced future developments. More specifically, we focus on the ability of citation counts and Mendeley reader counts to distinguish between publications regarded by field experts as seminal and publications regarded as literature reviews. The main motivation behind our research is to gain a better understanding of whether, and how well, existing research metrics relate to research quality. For this experiment we created a new dataset, which we call TrueImpactDataset, containing two types of publications: seminal papers and literature reviews. Using the dataset, we conduct a set of experiments to study how citation and reader counts perform in distinguishing these publication types, following the intuition that causing a change in a field signifies research quality. Our research shows that citation counts distinguish important seminal research papers from literature reviews better than a random baseline (by a margin of 10%), while Mendeley reader counts do not perform better than the baseline.},
doi = {10.1007/s11192-018-2669-y},
url = {https://www.osti.gov/biblio/1425336}, journal = {Scientometrics},
issn = {0138-9130},
number = 1,
volume = 115,
place = {United States},
year = {2018},
month = {feb}
}

Journal Article:
Free Publicly Available Full Text
Publisher's Version of Record

Citation Metrics:
Cited by: 6 works
Citation information provided by
Web of Science

Figures / Tables:

Table 1: Size of the dataset used in the study. The field Responses refers to how many responses we received in the data collection survey, and the field Seminal/review/total papers refers to how many papers of each type the responses yielded.


Works referenced in this record:

The Journal Impact Factor Denominator: Defining Citable (Counted) Items
journal, September 2009


Microsoft Academic (Search): a Phoenix arisen from the ashes?
journal, June 2016


The Anatomy of Impact: What Makes an Article Influential?
journal, March 1996


Web indicators for research evaluation. Part 2: Social media metrics
journal, September 2015


Web indicators for research evaluation. Part 1: Citations and links to academic articles from the Web
journal, September 2015


Differences in impact factor across fields and over time
journal, January 2009

  • Althouse, Benjamin M.; West, Jevin D.; Bergstrom, Carl T.
  • Journal of the American Society for Information Science and Technology, Vol. 60, Issue 1
  • https://doi.org/10.1002/asi.20936

What makes articles highly cited?
journal, February 2014


Coverage and adoption of altmetrics sources in the bibliometric community
journal, January 2014


Problems of citation analysis: A study of uncited and seldom-cited influences
journal, October 2009


Factors affecting citation rates of research articles
journal, May 2014


Does quality and content matter for citedness? A comparison with para-textual factors and over time
journal, July 2015


Automatic classification of citation function
conference, January 2006


The invariant distribution of references in scientific articles
journal, May 2015

  • Bertin, Marc; Atanassova, Iana; Gingras, Yves
  • Journal of the Association for Information Science and Technology, Vol. 67, Issue 1
  • https://doi.org/10.1002/asi.23367

Does the h-index for ranking of scientists really work?
journal, December 2005


Measuring Scientific Impact Beyond Citation Counts
journal, September 2016


Who reads research articles? An altmetrics analysis of Mendeley user categories
journal, April 2015

  • Mohammadi, Ehsan; Thelwall, Mike; Haustein, Stefanie
  • Journal of the Association for Information Science and Technology, Vol. 66, Issue 9
  • https://doi.org/10.1002/asi.23286

Citations versus journal impact factor as proxy of quality: could the latter ever be preferable?
journal, February 2010


Characteristics of highly cited papers
journal, December 2003


A review of the literature on citation impact indicators
journal, May 2016


When Knowledge Wins: Transcending the Sense and Nonsense of Academic Rankings
journal, March 2009


Problems with Traditional Science Publishing and Finding a Wider Niche for Post-Publication Peer Review
journal, October 2014


Google Scholar, Scopus and the Web of Science: a longitudinal and cross-disciplinary comparison
journal, November 2015


The power of altmetrics on a CV
journal, April 2013


To better stand on the shoulder of giants
conference, January 2012


Citation Statistics
journal, February 2009


Mendeley readership counts: An investigation of temporal and disciplinary differences
journal, June 2015


What do citation counts measure? A review of studies on citing behavior
journal, January 2008


Priority criteria in peer review of scientific articles
journal, February 2016


The top 100 papers
journal, October 2014


Measuring academic influence: Not all citations are equal
journal, May 2014


The rise and rise of citation analysis
journal, January 2007


Which people use which scientific papers? An evaluation of data from F1000 and Mendeley
journal, July 2015


Works referencing / citing this record:

Which can better predict the future success of articles? Bibliometric indices or alternative metrics
journal, April 2019


Comparing the topological rank of journals in Web of Science and Mendeley
journal, July 2019