RD or Not RD: Using Experimental Studies to Assess the Performance of the Regression Discontinuity Approach
Philip Gleason, Alexandra Resch, and Jillian Berk
Evaluation Review, 2018, vol. 42, issue 1, 3-33
Abstract:
Background: This article explores the performance of regression discontinuity (RD) designs for measuring program impacts using a synthetic within-study comparison design. We generate synthetic RD data sets from experimental data sets from two recent evaluations of educational interventions—the Educational Technology Study and the Teach for America Study—and compare the RD impact estimates to the experimental estimates of the same intervention. Objectives: This article examines the performance of the RD estimator when the design is well implemented and also examines the extent of bias introduced by manipulation of the assignment variable in an RD design. Research design: We simulate RD analysis files by selectively dropping observations from the original experimental data files. We then compare impact estimates based on this RD design with those from the original experimental study. Finally, we simulate a situation in which some students manipulate the value of the assignment variable to receive treatment and compare RD estimates with and without manipulation. Results and conclusion: RD and experimental estimators produce impact estimates that are not significantly different from one another and have a similar magnitude, on average. Manipulation of the assignment variable can substantially influence RD impact estimates, particularly if manipulation is related to the outcome and occurs close to the assignment variable's cutoff value.
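The synthetic within-study comparison the abstract describes can be illustrated with a minimal sketch. The data, cutoff, bandwidth, and true effect below are all hypothetical and not taken from the article: starting from simulated experimental data, we discard treated units above a cutoff on a pretest-style assignment variable and control units below it, which mimics deterministic RD assignment, then compare a local linear RD estimate at the cutoff with the experimental difference in means.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical experimental data: assignment variable x (e.g., a pretest
# score), randomized treatment t, outcome y with a true effect of 2.0.
n = 20_000
x = rng.normal(0.0, 1.0, n)
t = rng.integers(0, 2, n)
y = 1.0 + 0.8 * x + 2.0 * t + rng.normal(0.0, 1.0, n)

# Experimental benchmark: simple difference in treatment/control means.
exp_est = y[t == 1].mean() - y[t == 0].mean()

# Synthetic RD file: keep treated units only below the cutoff and
# control units only above it, as if assignment were determined by x.
cutoff = 0.0
keep = ((t == 1) & (x < cutoff)) | ((t == 0) & (x >= cutoff))
xr, yr = x[keep], y[keep]

# Local linear RD estimate: fit a line on each side of the cutoff
# within a bandwidth h and difference the intercepts at the cutoff.
h = 0.5
left = (xr >= cutoff - h) & (xr < cutoff)     # treated side
right = (xr >= cutoff) & (xr <= cutoff + h)   # control side
slope_l, int_l = np.polyfit(xr[left] - cutoff, yr[left], 1)
slope_r, int_r = np.polyfit(xr[right] - cutoff, yr[right], 1)
rd_est = int_l - int_r  # treated intercept minus control intercept

print(f"experimental estimate: {exp_est:.2f}")
print(f"synthetic RD estimate: {rd_est:.2f}")
```

With a large sample and no manipulation of the assignment variable, both estimates land near the true effect, matching the article's finding that the two estimators agree on average; the article's manipulation scenario would correspond to letting some units shift their x across the cutoff before the sample is split.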
Keywords: methodological development; quasi-experimental design; outcome evaluation; design and evaluation of programs and policies; education
Date: 2018
Downloads: https://journals.sagepub.com/doi/10.1177/0193841X18787267 (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:sae:evarev:v:42:y:2018:i:1:p:3-33
DOI: 10.1177/0193841X18787267