An Examination of the Dynamics of Crowdsourcing Contests: Role of Feedback Type
Pallab Sanyal and
Shun Ye
Additional contact information
Pallab Sanyal: School of Business, George Mason University, Fairfax, Virginia 22030
Shun Ye: School of Business, George Mason University, Fairfax, Virginia 22030
Information Systems Research, 2024, vol. 35, issue 1, 394-413
Abstract:
As more businesses turn to crowdsourcing platforms for solutions to business problems, determining how to manage these sourcing contests based on their objectives has become critically important. Existing research, both theoretical and empirical, studies the impact of a variety of contest and contestant characteristics on the outcomes of these contests. Aside from these static design parameters, a lever organizations (clients) can use to dynamically steer contests toward desirable goals is the feedback offered to the contestants (solvers) during the contest. Although a handful of recent studies focus on the effects of feedback at a high level (e.g., volume, valence), to the best of our knowledge, none has examined the effects of the information contained in the feedback. Furthermore, the existing studies focus solely on the quality of the submissions and not on other critical contest outcomes, such as the diversity of the submissions, which the creativity and innovation literature finds to be significant. In this study, first, drawing on the psychology literature on feedback intervention theory, we classify client feedback into two types: outcome and process. Second, using data from almost 12,000 design contests, we empirically examine the effects of the two types of feedback on the convergence and diversity of submissions following feedback interventions. We find that process feedback, which provides goal-oriented information to solvers, fosters convergent thinking, leading to submissions that are similar to one another. Although outcome feedback lacks the informative value of process feedback, it encourages divergent thinking, the ability to produce a variety of solutions to a problem. Furthermore, we find that these effects are strengthened when feedback is provided earlier in the contest rather than later. Based on our findings, we offer insights on how practitioners can strategically use the appropriate form of feedback to either generate greater diversity of solutions or achieve efficient convergence to an acceptable solution.
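The abstract refers to measuring the convergence and diversity of submissions but, being a summary, does not define those measures. Purely as an illustrative sketch, and not the paper's actual method, the snippet below shows one common way such a construct could be operationalized: mean pairwise cosine dissimilarity of submission feature vectors, compared before and after a feedback event. The feature representation, the metric, and all function names here are assumptions for illustration only.

```python
# Illustrative sketch only: one possible way to quantify submission
# "diversity" as mean pairwise cosine dissimilarity of feature vectors.
# The feature representation and the metric are assumptions, not the
# paper's actual measures.
import numpy as np

def mean_pairwise_dissimilarity(features: np.ndarray) -> float:
    """Average (1 - cosine similarity) over all distinct pairs of submissions.

    features: array of shape (n_submissions, n_features).
    Returns 0.0 if there are fewer than two submissions.
    """
    n = features.shape[0]
    if n < 2:
        return 0.0
    # Normalize rows so dot products equal cosine similarities.
    norms = np.linalg.norm(features, axis=1, keepdims=True)
    unit = features / np.clip(norms, 1e-12, None)
    sims = unit @ unit.T
    # Average over the strict upper triangle (each pair counted once).
    iu = np.triu_indices(n, k=1)
    return float(np.mean(1.0 - sims[iu]))

# Hypothetical usage: compare diversity before vs. after a feedback event.
rng = np.random.default_rng(0)
pre_feedback = rng.normal(size=(10, 64))    # submissions before feedback
post_feedback = rng.normal(size=(12, 64))   # submissions after feedback
print("diversity before:", round(mean_pairwise_dissimilarity(pre_feedback), 3))
print("diversity after:", round(mean_pairwise_dissimilarity(post_feedback), 3))
```

Under this hypothetical measure, a drop in the value after a feedback event would indicate convergence of the submissions, whereas an increase would indicate greater diversity.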
Keywords: crowdsourcing; design contests; feedback; convergence; diversity
Date: 2024
Downloads: http://dx.doi.org/10.1287/isre.2023.1232 (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:inm:orisre:v:35:y:2024:i:1:p:394-413
More articles in Information Systems Research from INFORMS.
Bibliographic data for series maintained by Chris Asher.