Reducing “I Don’t Know” Responses and Missing Survey Data: Implications for Measurement

Deanna C. Denman, Austin S. Baldwin, Andrea C. Betts, Amy McQueen and Jasmin A. Tiro
Additional contact information
Deanna C. Denman: Department of Psychology, Southern Methodist University, Dallas, TX, USA
Austin S. Baldwin: Department of Psychology, Southern Methodist University, Dallas, TX, USA
Andrea C. Betts: Department of Clinical Sciences, University of Texas Southwestern Medical Center, Dallas, TX, USA
Amy McQueen: Division of General Medical Sciences, School of Medicine, Institute for Public Health, Washington University in St. Louis, St. Louis, MO, USA
Jasmin A. Tiro: Department of Clinical Sciences, University of Texas Southwestern Medical Center, Dallas, TX, USA

Medical Decision Making, 2018, vol. 38, issue 6, 673-682

Abstract: Background. “I don’t know” (DK) responses are common in health behavior research. Yet analytic approaches to managing DK responses may undermine survey validity and researchers’ ability to interpret findings. Objective. Compare the usefulness of a methodological strategy for reducing DK responses to 3 analytic approaches: 1) excluding DKs as missing data, 2) recoding them to the neutral point of the response scale, and 3) recoding DKs with the mean. Methods. We used a 4-group design to compare a methodological strategy, which encourages use of the response scale after an initial DK response, to 3 methods of analytically treating DK responses. We examined 1) whether this methodological strategy reduced the frequency of DK responses, and 2) how the methodological strategy compared to common analytic treatments in terms of factor structure and strength of correlations between measures of constructs. Results. The prompt reduced DK response frequency (55.7% of 164 unprompted participants vs. 19.6% of 102 prompted participants). Factorial invariance analyses suggested equivalence in factor loadings for all constructs across the groups. Compared to excluding DKs, recoding strategies and use of the prompt improved the strength of correlations between constructs, with the prompt resulting in the strongest correlations (.589 for benefits and intentions, .446 for perceived susceptibility and intentions, and .329 for benefits and perceived susceptibility). Limitations. This study was not designed a priori to test methods for addressing DK responses. Our analysis was limited to an interviewer-administered survey, and interviewers did not probe about reasons for DK responses. Conclusion. Findings suggest that use of a prompt to reduce DK responses is preferable to analytic approaches to treating DK responses. Use of such prompts may improve the validity of health behavior survey research.

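For illustration only, the following minimal sketch (not from the article) shows how the three analytic treatments of DK responses described in the abstract might be applied to a single survey item using pandas; the item name, the 1–5 response scale, and the DK code of 99 are assumptions made for this example.

import numpy as np
import pandas as pd

DK = 99  # hypothetical numeric code recorded for an "I don't know" answer
df = pd.DataFrame({"perceived_benefits": [1, 3, DK, 5, 4, DK, 2]})  # 1-5 scale
item = df["perceived_benefits"]

# 1) Exclude DKs as missing data
excluded = item.replace(DK, np.nan)

# 2) Recode DKs to the neutral point of the 1-5 response scale
neutral = item.replace(DK, 3)

# 3) Recode DKs with the mean of the valid (non-DK) responses
mean_imputed = excluded.fillna(excluded.mean())

print(excluded.tolist())
print(neutral.tolist())
print(mean_imputed.tolist())
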
Keywords: DK responses; health behavior; methodological strategy; survey data; validity
Date: 2018

Downloads: https://journals.sagepub.com/doi/10.1177/0272989X18785159 (text/html)


Persistent link: https://EconPapers.repec.org/RePEc:sae:medema:v:38:y:2018:i:6:p:673-682

DOI: 10.1177/0272989X18785159

Handle: RePEc:sae:medema:v:38:y:2018:i:6:p:673-682