The Mechanics of Treatment-effect Estimate Bias for Nonexperimental Data
Roberto V. Penaloza and
Mark Berends
Sociological Methods & Research, 2022, vol. 51, issue 1, 165-202
Abstract:
To measure “treatment” effects, social science researchers typically rely on nonexperimental data. In education, school and teacher effects on students are often measured through value-added models (VAMs) that are not fully understood. We propose a framework that relates to the education production function in its most flexible form and connects with the basic VAMs without using untenable assumptions. We illustrate how, due to measurement error (ME), cross-group imbalances created by nonrandom group assignment cause correlations that drive the models’ treatment-effect estimate bias. We derive formulas to calculate bias, rank the models, and show that no model is better in all situations. The framework and formulas’ workings are verified and illustrated via simulation. We also evaluate the performance of latent variable/errors-in-variables models that handle ME and study the role of extra covariates, including lags of the outcome.
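The following is a minimal illustrative sketch, not the authors' simulation design: it shows how measurement error in a control variable (here, a hypothetical pretest score), combined with nonrandom assignment on the true underlying trait, leaves residual imbalance and biases the treatment-effect estimate from a simple value-added regression. All variable names and parameter values are assumptions made for illustration.

```python
# Illustrative sketch (not the paper's simulation): measurement error plus nonrandom
# assignment biases the treatment-effect estimate in a basic value-added regression.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# True prior achievement; assignment to "treatment" depends on it (cross-group imbalance).
true_pre = rng.normal(0.0, 1.0, n)
treated = (true_pre + rng.normal(0.0, 1.0, n) > 0).astype(float)

# Outcome generated with a known treatment effect and dependence on true prior achievement.
true_effect = 0.20
post = true_effect * treated + 0.8 * true_pre + rng.normal(0.0, 0.5, n)

# The observed pretest carries measurement error, so controlling for it removes
# only part of the cross-group imbalance on the true trait.
observed_pre = true_pre + rng.normal(0.0, 0.7, n)

# OLS value-added model: post ~ intercept + treated + observed_pre
X = np.column_stack([np.ones(n), treated, observed_pre])
beta = np.linalg.lstsq(X, post, rcond=None)[0]
print(f"true effect: {true_effect:.3f}, estimated effect: {beta[1]:.3f}")
# The estimate exceeds the true effect because the mismeasured pretest under-adjusts
# for the advantage of the nonrandomly assigned treated group.
```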
Keywords: treatment-effect estimate bias; value-added models; teacher effectiveness; school effectiveness; measurement error
Date: 2022
Downloads: https://journals.sagepub.com/doi/10.1177/0049124119852375 (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:sae:somere:v:51:y:2022:i:1:p:165-202
DOI: 10.1177/0049124119852375
More articles in Sociological Methods & Research
Bibliographic data for series maintained by SAGE Publications.