Automated citation recommendation tools encourage questionable citations
Serge P J M Horbach, Freek J W Oude Maatman, Willem Halffman and Wytske M Hepkema
Research Evaluation, 2022, vol. 31, issue 3, 321-325
Abstract:
Citing practices have long been at the heart of scientific reporting, playing both socially and epistemically important functions in science. While such practices have been relatively stable over time, recent attempts to develop automated citation recommendation tools have the potential to drastically impact citing practices. We claim that, even though such tools may come with tempting advantages, their development and implementation should be conducted with caution. Describing the role of citations in science's current publishing and social reward structures, we argue that automated citation tools encourage questionable citing practices. More specifically, we describe how such tools may lead to an increase in: perfunctory citation and sloppy argumentation; affirmation biases; and Matthew effects. In addition, a lack of transparency of the tools' underlying algorithmic structure renders their usage problematic. Hence, we urge that the consequences of citation recommendation tools should at least be understood and assessed before any attempts at implementation or broad distribution are undertaken.
Keywords: citation recommendation tools; research evaluation; citing practices; Matthew effect; bias
Downloads: http://hdl.handle.net/10.1093/reseval/rvac016 (application/pdf; access to full text is restricted to subscribers)
Persistent link: https://EconPapers.repec.org/RePEc:oup:rseval:v:31:y:2022:i:3:p:321-325.
Research Evaluation is currently edited by Julia Melkers, Emanuela Reale and Thed van Leeuwen