What makes or breaks competitive research proposals? A mixed-methods analysis of research grant evaluation reports
Darko Hren, David G. Pina, Christopher R. Norman and Ana Marušić
Journal of Informetrics, 2022, vol. 16, issue 2
Abstract:
The evaluation of grant proposals is an essential aspect of competitive research funding. Funding bodies and agencies in many instances rely on external peer reviewers for grant assessment. Most of the available research concerns quantitative aspects of this assessment, and there is little evidence from qualitative studies. We used a combination of machine learning and qualitative analysis methods to analyse the reviewers' comments in evaluation reports from 3667 grant applications to the Initial Training Networks (ITN) of the Marie Curie Actions under the Seventh Framework Programme (FP7). Our results show that the reviewers' comments for each evaluation criterion were aligned with the Action's prespecified criteria and that the evaluation outcome was more influenced by the proposals' weaknesses than by their strengths.
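The abstract names the approach but not its mechanics. As a rough illustration only, the sketch below simulates the headline finding (weaknesses weighing more heavily than strengths in the funding outcome) using synthetic data and a logistic regression; every variable, count and coefficient here is an assumption for illustration, not the authors' pipeline or the paper's data.

```python
# Minimal hypothetical sketch: does the number of weakness comments
# predict funding outcome more strongly than the number of strengths?
# All data below are synthetic; nothing is drawn from the actual study.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500  # hypothetical number of proposals

# Synthetic features: counts of strength and weakness comments per proposal.
strengths = rng.poisson(4, n)
weaknesses = rng.poisson(3, n)

# Simulated outcome in which weaknesses carry more weight than strengths
# (mirroring the paper's finding, with made-up coefficients).
logit = 0.3 * strengths - 0.8 * weaknesses + 1.0
funded = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([strengths, weaknesses])
model = LogisticRegression().fit(X, funded)
print(dict(zip(["strengths", "weaknesses"], model.coef_[0].round(2))))
```

On the simulated data, the fitted coefficient for weaknesses is larger in magnitude than the one for strengths, which is the qualitative pattern the abstract reports; the actual study combined machine learning with qualitative coding of the evaluation reports rather than this toy regression.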
Keywords: European Commission; Machine learning; Marie Curie Actions; Peer review outcome; Qualitative analysis; Research grants
Date: 2022
Citations: 3
Downloads: http://www.sciencedirect.com/science/article/pii/S1751157722000414 (full text for ScienceDirect subscribers only)
Persistent link: https://EconPapers.repec.org/RePEc:eee:infome:v:16:y:2022:i:2:s1751157722000414
DOI: 10.1016/j.joi.2022.101289
Journal of Informetrics is currently edited by Leo Egghe
More articles in Journal of Informetrics from Elsevier
Bibliographic data for series maintained by Catherine Liu.