Making the cut: How panel reviewers use evaluation devices to select applications at the European Research Council
Lucas Brunet and Ruth Müller
Research Evaluation, 2022, vol. 31, issue 4, 486-497
Abstract:
The European Research Council (ERC) receives many high-quality applications, but funds only a few. We analyze how members of ERC review panels assess applications in the first, highly competitive step of evaluations for ERC Starting and Consolidator Grants. Drawing on interviews with ERC panel members in different fields, we show that they adopt a set of evaluation devices that offer pragmatic and standardized ways of evaluating in a time-constrained and highly competitive setting. Through the use of evaluation devices, panel reviewers enact and generate a distinct reviewing expertise that encompasses subject-specific knowledge and knowledge about how to accomplish evaluation within a situated setting. We find that ERC panel reviewers employ four evaluation devices during the first step of ERC reviews: first, reviewers base judgments on applicants’ prior achievements (delegation devices); second, they adjust their evaluations of individual applications to the quality of a given set of applications (calibration devices); third, they combine multiple elements to assess the feasibility of proposals (articulation devices); and finally, they consider the impact of the proposed research on science and society (contribution devices). We show that the current use of these devices generates what we have termed evaluative pragmatism: a mode of reviewing that is shaped by and accommodated to the need to review many high-quality proposals in a short time period with possibly limited expert knowledge. In conclusion, we discuss how the prevalence of evaluative pragmatism in the first step of ERC panel reviews shapes candidate selection, particularly regarding human and epistemic diversity in European research.
Keywords: Peer review; European Research Council (ERC); Judgment devices; Evaluation devices; European Research Governance; Valuation studies; Reviewing expertise
Date: 2022
Full text: http://hdl.handle.net/10.1093/reseval/rvac040 (application/pdf; access restricted to subscribers)
Persistent link: https://EconPapers.repec.org/RePEc:oup:rseval:v:31:y:2022:i:4:p:486-497.
Research Evaluation is currently edited by Julia Melkers, Emanuela Reale and Thed van Leeuwen