Verification servers: Enabling analysts to assess the quality of inferences from public use data
Jerome P. Reiter, Anna Oganian and Alan F. Karr
Computational Statistics & Data Analysis, 2009, vol. 53, issue 4, 1475-1482
Abstract:
To protect confidentiality, statistical agencies typically alter data before releasing them to the public. Ideally, although this is generally not done, the agency also provides a way for secondary data analysts to assess the quality of inferences obtained with the released data. Quality measures can help secondary data analysts identify inaccurate conclusions resulting from the disclosure limitation procedures, as well as gain confidence in accurate conclusions. We propose a framework for an interactive, web-based system that analysts can query for measures of inferential quality. As we illustrate, agencies seeking to build such systems must consider the additional disclosure risks posed by releasing quality measures. We suggest some avenues of research on limiting these risks.
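The abstract does not specify which quality measures the verification server would return. As a purely illustrative sketch, the Python snippet below computes one common utility measure from this literature, confidence-interval overlap between an estimate computed on the confidential data and the same estimate computed on the released (altered) data; all function names, the noise-addition example, and the choice of measure are assumptions for illustration, not the authors' implementation.

# Illustrative sketch only: a confidence-interval overlap measure of the kind a
# verification server might report for an analyst's query. Names and the
# specific measure are assumptions, not the paper's implementation.
import numpy as np
from scipy import stats

def ci(data, level=0.95):
    """Normal-theory confidence interval for the mean of `data`."""
    data = np.asarray(data, dtype=float)
    m = data.mean()
    se = data.std(ddof=1) / np.sqrt(len(data))
    z = stats.norm.ppf(0.5 + level / 2)
    return m - z * se, m + z * se

def ci_overlap(ci_orig, ci_rel):
    """Average relative overlap of two intervals (1 = identical, <= 0 = disjoint)."""
    lo = max(ci_orig[0], ci_rel[0])
    hi = min(ci_orig[1], ci_rel[1])
    overlap = hi - lo
    return 0.5 * (overlap / (ci_orig[1] - ci_orig[0]) +
                  overlap / (ci_rel[1] - ci_rel[0]))

# Example: the server holds the confidential data and compares the analyst's
# query on the released data with the same query on the confidential data.
rng = np.random.default_rng(1)
confidential = rng.normal(50, 10, size=1000)
released = confidential + rng.normal(0, 5, size=1000)  # e.g., a noise-added release
print(round(ci_overlap(ci(confidential), ci(released)), 3))

An overlap close to 1 would indicate that inferences about the mean are largely preserved by the disclosure limitation procedure, while values near or below 0 would flag the analysis as unreliable on the released data.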
Date: 2009
Downloads: http://www.sciencedirect.com/science/article/pii/S0167-9473(08)00475-1 (full text for ScienceDirect subscribers only)
Persistent link: https://EconPapers.repec.org/RePEc:eee:csdana:v:53:y:2009:i:4:p:1475-1482