Salience Bias in Crowdsourcing Contests
Ho Cheung Brian Lee, Sulin Ba, Xinxin Li and Jan Stallaert
Additional contact information
Ho Cheung Brian Lee: Manning School of Business, University of Massachusetts Lowell, Lowell, Massachusetts 01854
Sulin Ba: School of Business, University of Connecticut, Storrs, Connecticut 06269
Xinxin Li: School of Business, University of Connecticut, Storrs, Connecticut 06269
Jan Stallaert: School of Business, University of Connecticut, Storrs, Connecticut 06269
Information Systems Research, 2018, vol. 29, issue 2, 401-418
Abstract:
Crowdsourcing relies on online platforms to connect a community of users to perform specific tasks. Without appropriate control, however, the behavior of the online community might not align with the platform's designed objective, which can lead to inferior platform performance. This paper investigates how feedback information on a crowdsourcing platform and the systematic biases of crowdsourcing workers can affect crowdsourcing outcomes. Specifically, using archival data from the online crowdsourcing platform Kaggle, combined with survey data from actual Kaggle contest participants, we examine the role of a systematic bias, namely the salience bias, in influencing the performance of crowdsourcing workers, and how the number of crowdsourcing workers moderates the impact of the salience bias on contest outcomes. Our results suggest that the salience bias influences the performance of contestants, including the winners of the contests. Furthermore, the number of participating contestants may attenuate or amplify the impact of the salience bias on contest outcomes, depending on the effort required to complete the tasks. Our results have critical implications for crowdsourcing firms and platform designers. The online appendix is available at https://doi.org/10.1287/isre.2018.0775.
Keywords: behavioral economics; crowdsourcing; open innovation; salience bias; parallel path effect; competition effect
Date: 2018
Citations: View citations in EconPapers (7)
Downloads: https://doi.org/10.1287/isre.2018.0775 (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:inm:orisre:v:29:y:2018:i:2:p:401-418
More articles in Information Systems Research from INFORMS. Contact information at EDIRC.
Bibliographic data for series maintained by Chris Asher.