Analysing robustness and uncertainty levels of bibliometric performance statistics supporting science policy. A case study evaluating Danish postdoctoral funding
Jesper W. Schneider and Thed N. van Leeuwen
Research Evaluation, 2014, vol. 23, issue 4, 285-297
Abstract:
We present the main results from the bibliometric part of a recent evaluation of two different postdoctoral (postdoc) funding instruments used in Denmark. We scrutinize the results for robustness, stability, and importance, and ultimately question the official conclusions inferred from them. Acknowledging the deficiencies of non-randomized designs and the modelling of such data, we apply matching procedures to establish comparable groups and reduce systematic bias. In the absence of probability sampling, we refrain from using statistical inference. We demonstrate the usefulness of robustness analyses and effect size estimation in non-random, but carefully designed, descriptive studies. We examine whether there is a difference in long-term citation performance between groups of researchers funded by the two instruments, and between the postdocs and a control group of researchers who have not received postdoc funding but are otherwise comparable with the postdoc groups. The results show that all three groups perform well above the database average impact. We conclude that there is no difference in citation performance between the two postdoc groups. There is, however, a difference between the postdoc groups and the control group, but we argue that this difference is ‘trivial’. Our conclusion differs from the official one given in the evaluation report, where the Research Council emphasizes the success of its funding programmes and neglects to mention the good performance of the essentially tenure-track control group.
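The abstract emphasizes effect size estimation over statistical inference when judging whether group differences are ‘trivial’. As a rough illustration only (not the authors' actual procedure, and with hypothetical group names and values), a standardized mean difference such as Cohen's d could be computed between two groups' field-normalized citation scores:

import numpy as np

def cohens_d(group_a, group_b):
    """Standardized mean difference (Cohen's d) using a pooled standard deviation."""
    a = np.asarray(group_a, dtype=float)
    b = np.asarray(group_b, dtype=float)
    n_a, n_b = len(a), len(b)
    pooled_sd = np.sqrt(((n_a - 1) * a.var(ddof=1) + (n_b - 1) * b.var(ddof=1)) / (n_a + n_b - 2))
    return (a.mean() - b.mean()) / pooled_sd

# Hypothetical field-normalized citation scores (1.0 = database average impact)
postdoc_scores = [1.4, 1.6, 1.2, 1.8, 1.5]
control_scores = [1.3, 1.5, 1.1, 1.6, 1.4]
print(f"Cohen's d = {cohens_d(postdoc_scores, control_scores):.2f}")

Under a common rule of thumb, small absolute values of d (e.g. below 0.2) are often read as negligible or ‘trivial’ differences, which mirrors the kind of judgement described in the abstract.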
Date: 2014
Citations: View citations in EconPapers (5)
Downloads: (external link)
http://hdl.handle.net/10.1093/reseval/rvu016 (application/pdf)
Access to full text is restricted to subscribers.
Persistent link: https://EconPapers.repec.org/RePEc:oup:rseval:v:23:y:2014:i:4:p:285-297.
Research Evaluation is currently edited by Julia Melkers, Emanuela Reale and Thed van Leeuwen
More articles in Research Evaluation from Oxford University Press
Bibliographic data for series maintained by Oxford University Press.