It’s All About MeE: Using Structured Experiential Learning (‘e’) to Crawl the Design Space
Lant Pritchett,
Salimah Samji (salimah_samji@hks.harvard.edu) and
Jeffrey Hammer
Additional contact information
Salimah Samji: Center for International Development at Harvard University
No 249, CID Working Papers from Center for International Development at Harvard University
Abstract:
There is an inherent tension between implementing organizations—which have specific objectives and narrow missions and mandates—and executive organizations—which provide resources to multiple implementing organizations. Ministries of finance/planning/budgeting allocate across ministries and across projects/programmes within ministries; development organizations allocate across sectors (and countries); foundations and philanthropies allocate across programmes/grantees. Implementing organizations typically try to do the best they can with the funds they have and to attract more resources, while executive organizations have to decide what, and whom, to fund. Monitoring and Evaluation (M&E) has always been an element of the accountability of implementing organizations to their funders. There has been a recent trend towards much greater rigor in evaluations that isolate the causal impacts of projects and programmes, and towards more 'evidence-based' approaches to accountability and budget allocations. Here we extend the basic idea of rigorous impact evaluation—the use of a valid counterfactual to make judgments about causality—to emphasize that the techniques of impact evaluation can be directly useful to implementing organizations (as opposed to impact evaluation being seen by implementing organizations only as an external threat to their funding). We introduce structured experiential learning (which we add to M&E to get MeE), which allows implementing agencies to actively and rigorously search across alternative project designs, using monitoring data that provide real-time performance information with direct feedback into the decision loops of project design and implementation. Our argument is that within-project variations in design can serve as their own counterfactual, which dramatically reduces the incremental cost of evaluation and increases the direct usefulness of evaluation to implementing agencies. The right combination of M, e, and E provides the right space for innovation and organizational capability building while at the same time providing accountability and an evidence base for funding agencies.
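Illustration (not part of the paper): the abstract's core mechanism—within-project design variation serving as its own counterfactual, with monitoring data feeding back into design choices—can be sketched as an epsilon-greedy search over candidate designs. The Python toy below is a hypothetical sketch only; the design names, effect sizes, noise level, and site counts are all assumed for illustration and do not come from the paper.

    import random
    import statistics

    # Hypothetical sketch: a project rolls out three candidate design
    # variants ("arms") across sites. Routine monitoring data supplies a
    # noisy per-site outcome, and each period the project exploits the
    # best-performing design so far while deliberately keeping some sites
    # on alternatives, so within-project variation provides the
    # counterfactual. All numbers below are assumptions.
    random.seed(42)

    TRUE_EFFECTS = {"design_A": 0.30, "design_B": 0.45, "design_C": 0.40}
    EPSILON = 0.2          # share of sites assigned to exploration ("e")
    SITES_PER_PERIOD = 50
    PERIODS = 20

    outcomes = {d: [] for d in TRUE_EFFECTS}

    def measure(design):
        """Monitoring data: noisy per-site outcome under a given design."""
        return random.gauss(TRUE_EFFECTS[design], 0.25)

    for period in range(PERIODS):
        # Current best guess = arm with the highest observed mean so far.
        means = {d: statistics.mean(v) for d, v in outcomes.items() if v}
        best = max(means, key=means.get) if means else random.choice(list(TRUE_EFFECTS))
        for _ in range(SITES_PER_PERIOD):
            # Epsilon-greedy: mostly exploit the leading design,
            # but keep experimenting on a fraction of sites.
            explore = random.random() < EPSILON
            design = random.choice(list(TRUE_EFFECTS)) if explore else best
            outcomes[design].append(measure(design))

    for d, v in sorted(outcomes.items()):
        print(f"{d}: n={len(v):4d}, observed mean={statistics.mean(v):.3f}")

In this toy, the contemporaneous comparison across arms is what supplies the counterfactual that a single uniform rollout would lack: because some sites run each design in every period, differences in observed means can be attributed to design rather than to time or context.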
Keywords: Evaluation; Monitoring; Learning; Experimentation; Implementation; Feedback Loops
Date: 2012-12
Citations: 3
Downloads: https://www.hks.harvard.edu/sites/default/files/ce ... rking-papers/249.pdf (application/pdf)
Related works:
Working Paper: It's All About MeE: Using Structured Experiential Learning (“e”) to Crawl the Design Space (2013)
Working Paper: It's All about MeE: Using Structured Experiential Learning ("e") to Crawl the Design Space (2013) 
Working Paper: It's All About MeE: Using Structured Experiential Learning ('e') to Crawl the Design Space (2012) 
Working Paper: It's All about MeE: Using Structured Experiential Learning ('e') to Crawl the Design Space (2012) 
Persistent link: https://EconPapers.repec.org/RePEc:cid:wpfacu:249
More papers in CID Working Papers from Center for International Development at Harvard University, 79 John F. Kennedy Street. Contact information at EDIRC.
Bibliographic data for series maintained by Chuck McKenney (chuck_mckenney@hks.harvard.edu).