Bridging to Action Requires Mixed Methods, Not Only Randomised Control Trials

Wendy Olsen (University of Manchester)

The European Journal of Development Research, 2019, vol. 31, issue 2, no. 1, pp. 139-162

Abstract: Development evaluation refers to the evaluation of projects and programmes in development contexts. Some evaluations are too narrow: within-discipline impact evaluations are weaker than multidisciplinary, mixed-methods evaluations. A two-step process leads toward profoundly better arguments in assessing the impact of a development intervention. The first step is setting out the arena for discussion, including what the various entities are in the social, political, cultural and natural environment surrounding the chosen problem. The second step is that, once this arena has been declared, the project and the triangulation of data can be brought to bear upon logical arguments, with clear, transparent reasoning leading to a set of conclusions. In this second step we do need scientific methods such as peer review and data, but, crucially, the impact evaluation process must not rest upon a single data type, such as survey data. It is dangerous and undesirable to have the entire validity of the conclusions resting upon randomised control trials, or even upon a mixture of data types. Different contributions to knowledge exist within the evaluation process, including the interaction of people during action research, ethnography, case-study methods, process tracing and qualitative methods. The cement holding my argument together is that multiple logics are used (retroductive, deductive and inductive, in particular). Deductive mathematics should not dominate the evaluation of an intervention, as randomised controlled trials on their own lend themselves to worrying fallacies about causality; I show this using Boolean fuzzy set logic. An indicator of high-quality development evaluation is the use of multiple logics in a transparent way.
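
To make the fuzzy-set point concrete, here is a minimal sketch in Python (not taken from the article; the membership scores, variable names and the binarised treatment contrast are all hypothetical). It computes the standard fuzzy-set consistency measure from qualitative comparative analysis, consistency(X => Y) = sum(min(x_i, y_i)) / sum(x_i), which gauges how nearly a condition X is sufficient for an outcome Y across cases, and contrasts it with a crude averaged treatment/control comparison of the same cases.

# A minimal, hypothetical sketch of Boolean fuzzy-set logic in the style
# of qualitative comparative analysis; the article does not supply this
# code or these data.

def fuzzy_and(x, y):
    # Fuzzy intersection: Boolean AND generalised to [0, 1] memberships.
    return min(x, y)

def fuzzy_or(x, y):
    # Fuzzy union: Boolean OR generalised to [0, 1] memberships.
    return max(x, y)

def fuzzy_not(x):
    # Fuzzy negation.
    return 1.0 - x

def consistency(xs, ys):
    # Fuzzy-set consistency of "X is sufficient for Y":
    # sum(min(x_i, y_i)) / sum(x_i). Values near 1.0 mean membership
    # in X rarely exceeds membership in Y across the cases.
    return sum(fuzzy_and(x, y) for x, y in zip(xs, ys)) / sum(xs)

# Hypothetical membership scores for six cases:
# X = "intervention fully implemented", Y = "livelihoods improved".
X = [0.9, 0.8, 0.7, 0.2, 0.1, 0.0]
Y = [1.0, 0.9, 0.8, 0.9, 0.8, 0.9]

print(consistency(X, Y))  # 1.0: X looks fully sufficient for Y

# A single-number averaged comparison of the same cases, binarising
# X at 0.5 as a crude stand-in for a treatment/control contrast:
treated = [y for x, y in zip(X, Y) if x > 0.5]
control = [y for x, y in zip(X, Y) if x <= 0.5]
print(sum(treated) / len(treated) - sum(control) / len(control))  # ~0.03

Under these made-up numbers the averaged contrast is close to zero while the sufficiency consistency is 1.0: every case that implemented the intervention improved, but untreated cases often improved by other routes. A single aggregate effect estimate would read this as "no effect", which is the kind of causal fallacy the abstract attributes to stand-alone randomised controlled trials.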

Keywords: Evaluation; Randomised control trials; Comparative case-study research; Methodology; Retroduction; Mixed-methods; Impact evaluation
Date: 2019
Citations: 7 (as recorded in EconPapers)

Downloads: http://link.springer.com/10.1057/s41287-019-00201-x (abstract, text/html; access to full text is restricted to subscribers)

Persistent link: https://EconPapers.repec.org/RePEc:pal:eurjdr:v:31:y:2019:i:2:d:10.1057_s41287-019-00201-x

Ordering information: This journal article can be ordered from
http://www.springer.com/journal/41287/PS2

DOI: 10.1057/s41287-019-00201-x

The European Journal of Development Research is currently edited by Spencer Henson and Natalia Lorenzoni

More articles in The European Journal of Development Research from Palgrave Macmillan and the European Association of Development Research and Training Institutes (EADI). Contact information at EDIRC.
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Handle: RePEc:pal:eurjdr:v:31:y:2019:i:2:d:10.1057_s41287-019-00201-x