Adjusting the U.S. Census of 1990: Loss Functions
David A. Freedman, Kenneth W. Wachter, D. Richard Cutler and Stephen P. Klein
Additional contact information
David A. Freedman: University of California, Berkeley
Kenneth W. Wachter: University of California, Berkeley
D. Richard Cutler: Utah State University
Stephen P. Klein: RAND Corporation
Evaluation Review, 1994, vol. 18, issue 3, 243-280
Abstract:
Considering the difficulties, the Census Bureau does a remarkably good job at counting people. This article discusses techniques for adjusting the census. If there is a large undercount, these techniques may be accurate enough for adjustment. With a small undercount, adjustment could easily degrade the accuracy of the data. The Bureau argued that errors in the census were more serious than errors in the proposed adjustment, using "loss function analysis" to balance the risks. This procedure turns out to depend on quite unreasonable assumptions. With other, more realistic assumptions, the balance favors the census. The story has a broader moral. Statistical models are often defended on grounds of robustness. However, internally generated measures of precision may be critical. If the model is at all complicated, these measures of precision may turn out to be driven by assumptions, not data—the antithesis of robustness.
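The abstract's central point—that a loss function analysis can flip its verdict depending on the assumptions fed into it—can be illustrated with a toy sketch. This is not the paper's actual procedure or data; every number below is hypothetical, and the loss function (squared error on area shares) is only one simple choice among many.

```python
# Toy "loss function analysis": compare the census against a proposed
# adjustment under two different assumption sets. All figures are
# hypothetical; the point is that the verdict is assumption-driven.

def squared_error_loss(estimates, truth):
    """Total squared-error loss of a set of area-share estimates."""
    return sum((e - t) ** 2 for e, t in zip(estimates, truth))

# Hypothetical "true" population shares for three areas.
truth = [0.50, 0.30, 0.20]

# Census slightly understates the first area (small-undercount scenario).
census = [0.490, 0.305, 0.205]

# Adjustment corrects the undercount but adds its own error; how much
# error it adds is precisely the contested assumption.
adjusted_low_noise = [0.498, 0.301, 0.201]    # optimistic assumption
adjusted_high_noise = [0.515, 0.275, 0.210]   # pessimistic assumption

census_loss = squared_error_loss(census, truth)

# Under the optimistic assumption, adjustment beats the census...
print(squared_error_loss(adjusted_low_noise, truth) < census_loss)   # True

# ...under the pessimistic one, the census wins.
print(squared_error_loss(adjusted_high_noise, truth) > census_loss)  # True
```

The conclusion reverses while the data (the census counts) stay fixed—only the assumed precision of the adjustment changes, which is the abstract's warning about internally generated measures of precision.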
Date: 1994
Downloads: https://journals.sagepub.com/doi/10.1177/0193841X9401800301 (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:sae:evarev:v:18:y:1994:i:3:p:243-280
DOI: 10.1177/0193841X9401800301