The Prevalence and Severity of Underreporting Bias in Machine- and Human-Coded Data
Benjamin E. Bagozzi,
Patrick T. Brandt,
John R. Freeman,
Jennifer S. Holmes,
Alisha Kim,
Agustin Palao Mendizabal and
Carly Potz-Nielsen
Political Science Research and Methods, 2019, vol. 7, issue 3, 641-649
Abstract:
Textual data are plagued by underreporting bias. For example, news sources often fail to report human rights violations. Cook et al. propose a multi-source estimator to gauge, and to account for, the underreporting of state repression events within human codings of news texts produced by the Agence France-Presse and Associated Press. We evaluate this estimator with Monte Carlo experiments, and then use it to compare the prevalence and seriousness of underreporting when comparable texts are machine coded and recorded in the World-Integrated Crisis Early Warning System dataset. We replicate Cook et al.’s investigation of human-coded state repression events with our machine-coded events, and validate both models against an external measure of human rights protections in Africa. We then use the Cook et al. estimator to gauge the seriousness and prevalence of underreporting in machine- and human-coded event data on human rights violations in Colombia. We find in both applications that machine-coded data are as valid as human-coded data.
Date: 2019
Persistent link: https://EconPapers.repec.org/RePEc:cup:pscirm:v:7:y:2019:i:03:p:641-649_00
More articles in Political Science Research and Methods from Cambridge University Press (UPH, Shaftesbury Road, Cambridge CB2 8BS, UK).
Bibliographic data for series maintained by Kirk Stebbing.