
Comparing Traditional and Crowdsourcing Methods for Pretesting Survey Questions

Jennifer Edgar, Joe Murphy and Michael Keating

SAGE Open, 2016, vol. 6, issue 4, 2158244016671770

Abstract: Cognitive interviewing is a common method used to evaluate survey questions. This study compares traditional cognitive interviewing methods with crowdsourcing, or “tapping into the collective intelligence of the public to complete a task.” Crowdsourcing may provide researchers with access to a diverse pool of potential participants in a very timely and cost-efficient way. Exploratory work found that crowdsourcing participants, with self-administered data collection, may be a viable alternative, or addition, to traditional pretesting methods. Using three crowdsourcing designs (TryMyUI, Amazon Mechanical Turk, and Facebook), we compared the participant characteristics, costs, and quantity and quality of data with traditional laboratory-based cognitive interviews. Results suggest that crowdsourcing and self-administered protocols may be a viable way to collect survey pretesting information, as participants were able to complete the tasks and provide useful information; however, complex tasks may require the skills of an interviewer to administer unscripted probes.

Keywords: cognitive interviewing; pretesting; crowdsourcing; survey methodology
Date: 2016
Citations: 6 (tracked in EconPapers)

Downloads: https://journals.sagepub.com/doi/10.1177/2158244016671770 (text/html)

Persistent link: https://EconPapers.repec.org/RePEc:sae:sagope:v:6:y:2016:i:4:p:2158244016671770

DOI: 10.1177/2158244016671770

More articles in SAGE Open
Bibliographic data for this series is maintained by SAGE Publications.

 
Handle: RePEc:sae:sagope:v:6:y:2016:i:4:p:2158244016671770