SSH researchers make an impact differently. Looking at public research from the perspective of users
Andrea Bonaccorsi, Filippo Chiarello and Gualtiero Fantoni
Research Evaluation, 2021, vol. 30, issue 3, 269-289
Abstract:
With the rise of the impact assessment revolution, governments and public opinion have started to ask researchers to give evidence of their impact beyond the traditional audiences of students and fellow researchers. There is a mismatch between this request to demonstrate impact and the current methodologies for impact assessment, and the mismatch is particularly worrisome for research in the Social Sciences and Humanities. This paper contributes by systematically examining a key element of impact: the social groups that are directly or indirectly affected by the results of research. We apply a text mining approach to the Research Excellence Framework (REF) collection of 6,637 impact case studies in order to identify the social groups mentioned by researchers. Unlike previous studies, we employ a lexicon of user groups that includes 76,857 entries, which saturates the semantic field, permits the identification of all users, and opens the way to normalization. We then develop three new metrics measuring the Frequency, Diversity, and Specificity of user expressions. We find that the Social Sciences and Humanities exhibit a distinctive structure with respect to the frequency and specificity of users.
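The approach summarized above lends itself to a brief illustration. The sketch below is not the authors' code: the toy five-entry lexicon stands in for their 76,857-entry lexicon, the input texts are invented, and the specificity measure (mean inverse document frequency of matched terms) is an assumed proxy, since the paper's formal metric definitions are not reproduced here. Frequency and diversity are read simply as total mentions and distinct user groups per case study.

# Minimal sketch (not the authors' code) of lexicon-based matching of
# user-group expressions in impact case studies, with illustrative
# frequency, diversity, and specificity-style metrics. All names,
# the toy lexicon, and the specificity proxy are assumptions.
import math
import re
from collections import Counter

# Toy lexicon of user-group expressions; the paper's lexicon has 76,857 entries.
LEXICON = ["patients", "nurses", "policy makers", "museum visitors", "teachers"]

def match_users(text, lexicon=LEXICON):
    """Return a Counter of lexicon entries found in one case-study text."""
    text = text.lower()
    counts = Counter()
    for term in lexicon:
        hits = re.findall(r"\b" + re.escape(term) + r"\b", text)
        if hits:
            counts[term] = len(hits)
    return counts

def metrics(case_studies, lexicon=LEXICON):
    """Compute per-document frequency, diversity, and a specificity proxy."""
    matches = [match_users(t, lexicon) for t in case_studies]
    n_docs = len(case_studies)
    # Number of case studies in which each term occurs (for the IDF proxy).
    doc_freq = Counter(term for m in matches for term in m)
    results = []
    for m in matches:
        total = sum(m.values())    # frequency: total user-group mentions
        distinct = len(m)          # diversity: distinct user groups mentioned
        # Specificity proxy: mean inverse document frequency of matched terms;
        # terms appearing in few case studies count as more specific.
        spec = (sum(math.log(n_docs / doc_freq[t]) for t in m) / distinct) if distinct else 0.0
        results.append({"frequency": total, "diversity": distinct, "specificity": spec})
    return results

if __name__ == "__main__":
    docs = [
        "The toolkit was adopted by nurses and patients in three hospitals.",
        "Findings informed policy makers and were exhibited to museum visitors.",
    ]
    for r in metrics(docs):
        print(r)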
Keywords: impact assessment; social sciences and humanities; text mining; lexicon; research users
Date: 2021
Citations: 1 (tracked in EconPapers)
Downloads: http://hdl.handle.net/10.1093/reseval/rvab008 (application/pdf)
Access to full text is restricted to subscribers.
Persistent link: https://EconPapers.repec.org/RePEc:oup:rseval:v:30:y:2021:i:3:p:269-289.
Research Evaluation is currently edited by Julia Melkers, Emanuela Reale and Thed van Leeuwen