A meta-evaluation of scientific research proposals: Different ways of comparing rejected to awarded applications
Lutz Bornmann,
Loet Leydesdorff and
Peter Van den Besselaar
Journal of Informetrics, 2010, vol. 4, issue 3, 211-220
Abstract:
Combining different data sets containing information on grant and fellowship applications submitted to two renowned funding agencies, we compare their funding decisions (award or rejection) with scientometric performance indicators across two fields of science (life sciences and social sciences). The data sets comprise 671 applications in the social sciences and 668 in the life sciences. In both fields, awarded applicants perform better on average than all rejected applicants. However, when only the most preeminent rejected applicants are considered, these score better than the awardees on citation impact in both fields. With regard to productivity, we find differences between the fields: while the awardees in the life sciences on average outperform the most preeminent rejected applicants, the situation is reversed in the social sciences.
Keywords: Grant allocation; Peer review; Bibliometric quality indicators; Convergent validity and predictive validity; Error; Citation rate; h-Index
Date: 2010
Downloads: http://www.sciencedirect.com/science/article/pii/S1751157709000789 (full text for ScienceDirect subscribers only)
Persistent link: https://EconPapers.repec.org/RePEc:eee:infome:v:4:y:2010:i:3:p:211-220
DOI: 10.1016/j.joi.2009.10.004
Journal of Informetrics is currently edited by Leo Egghe