Convergent validity of several indicators measuring disruptiveness with milestone assignments to physics papers by experts
Lutz Bornmann and Alexander Tekles
Journal of Informetrics, 2021, vol. 15, issue 3
Abstract:
This study focuses on a recently introduced type of indicator for measuring disruptiveness in science. Disruptive research diverges from current lines of research by opening up new ones. In the current study, we included the initially proposed indicator of this new type, DI1 (Funk & Owen-Smith, 2017; Wu, Wang, & Evans, 2019), and several of its variants: DI5, DI1n, DI5n, and DEP. Since indicators should measure what they purport to measure, we investigated the convergent validity of these indicators. We used a list of milestone papers, selected and published by editors of Physical Review Letters, and examined whether this expert-based list is related to the values of the several disruption indicator variants and – if so – which variants show the highest correlation with the expert judgements. We used bivariate statistics, multiple regression models, and coarsened exact matching (CEM) to investigate the convergent validity of the indicators. The results show that the indicators correlate differently with the milestone paper assignments by the editors. It is not the initially proposed disruption index (DI1) that performed best, but the variant DI5, which was introduced by Bornmann, Devarakonda, Tekles, and Chacko (2020a). In the CEM analysis of this study, the DEP variant – introduced by Bu, Waltman, and Huang (in press) – also showed favorable results.
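As a rough illustration of the indicator family the abstract discusses, the original disruption index DI1 (Funk & Owen-Smith, 2017; Wu, Wang, & Evans, 2019) compares papers that cite a focal paper without citing its references against those that cite both. The sketch below is a minimal Python rendering of that definition; the function and variable names are illustrative and not taken from the article, and the article's variants (DI5, DI1n, DI5n, DEP) modify this basic formula in ways not shown here.

```python
# Minimal sketch of the DI1 disruption index:
#   DI1 = (n_f - n_b) / (n_f + n_b + n_r)
# where, for a focal paper,
#   n_f = papers citing the focal paper but none of its references,
#   n_b = papers citing both the focal paper and at least one reference,
#   n_r = papers citing at least one reference but not the focal paper.

def disruption_index(citers_of_focal, citers_of_references):
    """Compute DI1 from two sets of citing-paper identifiers."""
    n_f = len(citers_of_focal - citers_of_references)
    n_b = len(citers_of_focal & citers_of_references)
    n_r = len(citers_of_references - citers_of_focal)
    denominator = n_f + n_b + n_r
    return (n_f - n_b) / denominator if denominator else 0.0

# A maximally disruptive case: every citer ignores the focal
# paper's references entirely.
print(disruption_index({"p1", "p2", "p3"}, set()))  # → 1.0
```

Values range from -1 (purely consolidating: all citers also cite the references) to +1 (purely disruptive: no citer cites the references).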
Keywords: Bibliometrics; Convergent validity; Disruption index; Physical Review Letters
Date: 2021
Full text (ScienceDirect subscribers only): http://www.sciencedirect.com/science/article/pii/S1751157721000304
Persistent link: https://EconPapers.repec.org/RePEc:eee:infome:v:15:y:2021:i:3:s1751157721000304
DOI: 10.1016/j.joi.2021.101159
Journal of Informetrics is currently edited by Leo Egghe