Pay Rates and Subject Performance in Social Science Experiments Using Crowdsourced Online Samples
David J. Andersen and Richard R. Lau
Journal of Experimental Political Science, 2018, vol. 5, issue 3, 217-229
Abstract:
Mechanical Turk has become an important source of subjects for social science experiments, providing a low-cost alternative to undergraduate convenience samples while avoiding the expense of drawing fully representative samples. However, we know little about how the rates we pay to “Turkers” for participating in social science experiments affect their participation. This study examines subject performance using two experiments – a short survey experiment and a longer dynamic process tracing study of political campaigns – that recruited Turkers at different rates of pay. Looking at demographics and using measures of attention, engagement, and evaluation of the candidates, we find no effects of pay rates on subject recruitment or participation. We conclude by discussing implications and ethical standards of pay.
Date: 2018
Citations: 6 (in EconPapers)
Downloads: https://www.cambridge.org/core/product/identifier/ ... type/journal_article (article abstract page, text/html)
Persistent link: https://EconPapers.repec.org/RePEc:cup:jexpos:v:5:y:2018:i:03:p:217-229_00
More articles in Journal of Experimental Political Science from Cambridge University Press, UPH, Shaftesbury Road, Cambridge CB2 8BS, UK.