Panel Data Designs and Estimators as Substitutes for Randomized Controlled Trials in the Evaluation of Public Programs
Paul Ferraro and
Juan Jose Miranda
Journal of the Association of Environmental and Resource Economists, 2017, vol. 4, issue 1, 281-317
Abstract:
In the evaluation of public programs, experimental designs are rare. Researchers instead rely on observational designs. Observational designs that use panel data are widely portrayed as superior to time-series or cross-sectional designs because they provide opportunities to control for observable and unobservable variables correlated with outcomes and exposure to a program. The most popular panel data evaluation designs use linear, fixed-effects estimators with additive individual and time effects. To assess the ability of observational designs to replicate results from experimental designs, scholars use design replications. No such replications have assessed popular, fixed-effects panel data models that exploit repeated observations before and after treatment assignment. We implement such a study using, as a benchmark, results from a randomized environmental program that included effective and ineffective treatments. The popular linear, fixed-effects estimator fails to generate impact estimates or statistical inferences similar to the experimental estimator. Applying common flexible model specifications or trimming procedures also fails to yield accurate estimates or inferences. However, by following best practices for selecting a nonexperimental comparison group and combining matching methods with panel data estimators, we replicate the experimental benchmarks. We demonstrate how the combination of panel and matching methods mitigates common concerns about specifying the correct functional form, the nature of treatment effect heterogeneity, and the way in which time enters the model. Our results are consistent with recent claims that design trumps methods in estimating treatment effects and that combining designs is more likely to approximate a randomized controlled trial than applying a single design.
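The "linear, fixed-effects estimator with additive individual and time effects" described in the abstract can be illustrated with a minimal sketch. The code below (hypothetical simulated data, not the article's data) implements the two-way within transformation on a balanced panel and recovers a treatment effect even when treatment take-up is correlated with the unobserved unit effect:

```python
import numpy as np

def twfe_effect(y, d, unit, time):
    """Two-way fixed-effects estimate of a treatment effect.

    Applies the within transformation for a balanced panel,
    x_it -> x_it - mean_i - mean_t + grand mean, to both the
    outcome and the treatment, then runs univariate OLS on the
    demeaned variables (Frisch-Waugh-Lovell).
    """
    def within(x):
        x = np.asarray(x, dtype=float)
        ubar = np.bincount(unit, weights=x) / np.bincount(unit)
        tbar = np.bincount(time, weights=x) / np.bincount(time)
        return x - ubar[unit] - tbar[time] + x.mean()

    yt, dt = within(y), within(d)
    return float(yt @ dt / (dt @ dt))

# Hypothetical balanced panel: N units, T periods, additive unit and
# time effects, and a true treatment effect of 2.0.
rng = np.random.default_rng(0)
N, T = 200, 4
unit = np.repeat(np.arange(N), T)   # unit index for each observation
time = np.tile(np.arange(T), N)     # period index for each observation
alpha = rng.normal(0.0, 1.0, N)     # unobserved unit effects
treated = (alpha > 0).astype(float) # selection on the unit effect
d = treated[unit] * (time >= 2)     # treatment switches on at t = 2
y = alpha[unit] + 0.5 * time + 2.0 * d + rng.normal(0.0, 0.1, N * T)

# Should be close to the true effect of 2.0 despite the correlation
# between treatment and the unit effect, because common trends hold
# in this simulation by construction.
print(twfe_effect(y, d, unit, time))
```

Note the hedge built into the simulation: the within transformation removes bias here only because the simulated time effect is common to treated and untreated units. The article's point is precisely that when such assumptions fail in observational data, this estimator alone does not reproduce the experimental benchmark, while matching on a well-chosen comparison group before applying it can.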
Date: 2017
Citations: 49 (as tracked in EconPapers)
Downloads:
http://dx.doi.org/10.1086/689868 (application/pdf)
http://dx.doi.org/10.1086/689868 (text/html)
Access to the online full text or PDF requires a subscription.
Persistent link: https://EconPapers.repec.org/RePEc:ucp:jaerec:doi:10.1086/689868
More articles in Journal of the Association of Environmental and Resource Economists from University of Chicago Press