Push button replication: Is impact evaluation evidence for international development verifiable?
Benjamin Wood, Rui Müller and Annette Brown
No. n7a4d, OSF Preprints from Center for Open Science
Abstract:
Objective: In past years, research audit exercises conducted across several fields of study have found a high prevalence of published empirical research that cannot be reproduced using the original dataset and software code (replication files). The failure to reproduce arises either because the original authors refuse to make replication files available or because third-party researchers are unable to produce the published results using the provided files. Both causes create a credibility challenge for empirical research, because the published findings are not verifiable. In recent years, increasing numbers of journals, funders, and academics have embraced research transparency, which should reduce the prevalence of failures to reproduce. This study reports the results of a research audit exercise, known as the push button replication (PBR) project, which tested a sample of studies published in 2014 that use similar empirical methods but span a variety of academic fields.

Methods: To draw our sample of articles, we used the 3ie Impact Evaluation Repository to identify the ten journals that published the most impact evaluations (experimental and quasi-experimental intervention studies) from low- and middle-income countries from 2010 through 2012. This set includes health, economics, and development journals. We then selected all articles in these journals published in 2014 that meet the same inclusion criteria. We developed and piloted a detailed protocol for conducting push button replication and for determining the level of comparability of the replication findings to the original findings. To ensure that all materials and processes for the PBR project were transparent, we established a project site on the Open Science Framework. We divided the sample of articles across several researchers, who followed the protocol to request data and conduct the replications.

Results: Of the 109 articles in our sample, only 27 are push button replicable, meaning the provided code run on the provided dataset produces comparable findings for the key results in the published article. The authors of 59 of the articles refused to provide replication files. Thirty of these 59 articles were published in journals that had replication file requirements in 2014, meaning these articles are non-compliant with their journals' requirements. For the remaining 23 articles, we confirmed that three had proprietary data, we received incomplete replication files for 15, and we found minor differences between the replication results and the published results for five. We found open data for only 14 of the articles in our sample.
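The reported results reduce to a simple tally. As an illustration only, the following minimal Python sketch (not part of the paper; the variable and category names are our own shorthand) records the counts given in the abstract and checks that they add up.

```python
# Minimal sketch of the push button replication (PBR) outcome tally
# reported in the abstract. Counts are as stated there; labels are ours.

SAMPLE_SIZE = 109

outcomes = {
    "push_button_replicable": 27,  # provided code run on provided data reproduces key results
    "files_not_provided": 59,      # authors refused to provide replication files
    "remaining": 23,               # not PBR replicable for other reasons (see breakdown)
}

# Breakdown of the 23 remaining articles, per the abstract
remaining_breakdown = {
    "proprietary_data": 3,
    "incomplete_files": 15,
    "minor_differences": 5,
}

# Of the 59 refusals, 30 appeared in journals that already required
# replication files in 2014, i.e. they are non-compliant with journal policy.
non_compliant_with_policy = 30

# Articles with openly available data, per the abstract
open_data_articles = 14

# Consistency checks on the reported figures
assert sum(outcomes.values()) == SAMPLE_SIZE
assert sum(remaining_breakdown.values()) == outcomes["remaining"]
assert non_compliant_with_policy <= outcomes["files_not_provided"]

print(f"Push button replicable: {outcomes['push_button_replicable'] / SAMPLE_SIZE:.0%} of the sample")
```

Running the sketch confirms the figures are internally consistent and prints a replication rate of roughly 25 percent (27 of 109).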
Date: 2018-06-19
New Economics Papers: this item is included in nep-sog
Citations: 4 (tracked in EconPapers)
Downloads: https://osf.io/download/5b29734804fe52000ecb92ac/
Related works:
Journal Article: Push button replication: Is impact evaluation evidence for international development verifiable? (2018) 
Working Paper: Push button replication: Is impact evaluation evidence for international development verifiable? (2018) 
Persistent link: https://EconPapers.repec.org/RePEc:osf:osfxxx:n7a4d
DOI: 10.31219/osf.io/n7a4d