A letter on Ancaiani et al. ‘Evaluating scientific research in Italy: the 2004-10 research evaluation exercise’
Alberto Baccini and Giuseppe De Nicolao
Research Evaluation, 2017, vol. 26, issue 4, 353-357
Abstract:
This letter documents some problems in Ancaiani et al. (2015). Namely, the evaluation of concordance based on Cohen's kappa reported by Ancaiani et al. was not computed on the whole random sample of 9,199 articles, but on a subset of 7,597 articles. The kappas relative to the whole random sample were in the range 0.07–0.15, indicating an unacceptable agreement between peer review and bibliometrics. The subset was obtained by the non-random exclusion of all articles for which bibliometrics produced an uncertain classification; these raw data were not disclosed, so the concordance analysis is not reproducible. The VQR-weighted kappa for Area 13 reported by Ancaiani et al. is higher than that reported by the Area 13 panel and confirmed by Bertocchi et al. (2015), a difference explained by the use, under the same name, of two different sets of weights. Two values of kappa reported by Ancaiani et al. differ from the corresponding ones published in the official report. The results reported by Ancaiani et al. do not support a good concordance between peer review and bibliometrics. As a consequence, the use of both techniques introduced systematic distortions into the final results of the Italian research assessment exercise. The conclusion that the two techniques can be used interchangeably in a research assessment exercise appears to be unsound, as it rests on a misinterpretation of the statistical significance of the kappa values.
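For context on the statistic at issue, the short Python sketch below illustrates how Cohen's kappa and a weighted variant quantify agreement between two raters classifying the same items, here standing in for peer review and bibliometrics. It uses scikit-learn's cohen_kappa_score; the merit classes and ratings are invented for illustration only and are not the VQR data, and the linear weighting shown is just one possible scheme.

    # Illustrative sketch: Cohen's kappa for two raters of the same articles.
    # Labels and counts are hypothetical, NOT the VQR sample.
    from sklearn.metrics import cohen_kappa_score

    peer_review   = ["A", "B", "B", "C", "D", "A", "C", "B", "D", "C"]
    bibliometrics = ["A", "C", "B", "D", "D", "B", "C", "A", "C", "C"]

    # Unweighted kappa: chance-corrected agreement, kappa = (p_o - p_e) / (1 - p_e)
    print(cohen_kappa_score(peer_review, bibliometrics))

    # Weighted kappa: disagreements between adjacent merit classes are penalised
    # less; the value depends on the weighting scheme adopted.
    order = {"A": 0, "B": 1, "C": 2, "D": 3}
    pr = [order[x] for x in peer_review]
    bm = [order[x] for x in bibliometrics]
    print(cohen_kappa_score(pr, bm, weights="linear"))

Because a weighted kappa depends on the weights adopted, two analyses using what is nominally the same statistic under different weight definitions can report different values for the same data, which is the nature of the Area 13 discrepancy discussed in the letter.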
Keywords: informed peer review; research assessment; bibliometric evaluation; Italian VQR; peer review; Cohen's kappa
Date: 2017
Citations: View citations in EconPapers (2)
Downloads: http://hdl.handle.net/10.1093/reseval/rvx013 (application/pdf)
Access to full text is restricted to subscribers.
Persistent link: https://EconPapers.repec.org/RePEc:oup:rseval:v:26:y:2017:i:4:p:353-357.
Research Evaluation is currently edited by Julia Melkers, Emanuela Reale and Thed van Leeuwen
More articles in Research Evaluation from Oxford University Press
Bibliographic data for series maintained by Oxford University Press.