Why so many “rigorous” evaluations fail to identify unintended consequences of development programs: How mixed methods can contribute
Michael Bamberger,
Michele Tarsilla and
Sharlene Hesse-Biber
Evaluation and Program Planning, 2016, vol. 55, issue C, 155-162
Abstract:
Many widely used impact evaluation designs, including randomized control trials (RCTs) and quasi-experimental designs (QEDs), frequently fail to detect what are often quite serious unintended consequences of development programs. This seems surprising, as experienced planners and evaluators are well aware that unintended consequences frequently occur. Most evaluation designs are intended to determine whether there is credible evidence (statistical, theory-based, or narrative) that programs have achieved their intended objectives; the logic of many designs, even those considered the most “rigorous,” does not permit the identification of outcomes that were not specified in the program design. We take the example of RCTs, as they are considered by many to be the most rigorous evaluation designs. We present a number of cases to illustrate how infusing RCTs with a mixed-methods approach (sometimes called an “RCT+” design) can strengthen the credibility of these designs and also capture important unintended consequences. We provide a Mixed Methods Evaluation Framework that identifies nine ways in which unintended consequences (UCs) can occur, and we apply this framework to two of the case studies.
Keywords: Mixed methods; Unintended consequences; Evaluation design; Randomized control trials
Date: 2016
Citations: 10 (in EconPapers)
Downloads: http://www.sciencedirect.com/science/article/pii/S0149718916000021 (full text for ScienceDirect subscribers only)
Persistent link: https://EconPapers.repec.org/RePEc:eee:epplan:v:55:y:2016:i:c:p:155-162
DOI: 10.1016/j.evalprogplan.2016.01.001
Evaluation and Program Planning is currently edited by Jonathan A. Morell
More articles in Evaluation and Program Planning from Elsevier
Bibliographic data for series maintained by Catherine Liu.