EconPapers
Economics at your fingertips

Crowdsourcing Analysis of Twitter Data on Climate Change: Paid Workers vs. Volunteers

Andrei P. Kirilenko, Travis Desell, Hany Kim and Svetlana Stepchenkova
Additional contact information
Andrei P. Kirilenko: The Department of Tourism, Recreation and Sport Management, University of Florida, P.O. Box 118208, Gainesville, FL 32611-8208, USA
Travis Desell: The Department of Computer Science, University of North Dakota, Streibel Hall, 3950 Campus Road Stop 9015, Grand Forks, ND 58202-9015, USA
Hany Kim: The Department of Business Administration and Tourism and Hospitality Management, Mount Saint Vincent University, 166 Bedford Highway, Halifax, NS B3M 2J6, Canada
Svetlana Stepchenkova: The Department of Tourism, Recreation and Sport Management, University of Florida, P.O. Box 118208, Gainesville, FL 32611-8208, USA

Sustainability, 2017, vol. 9, issue 11, 1-15

Abstract: Web-based crowdsourcing has become an important method of environmental data processing. Two alternatives are widely used today by researchers in various fields: paid data processing mediated by for-profit businesses such as Amazon’s Mechanical Turk, and volunteer data processing conducted by amateur citizen-scientists. While the first option delivers results much faster, it is less clear how it compares with volunteer processing in terms of quality. This study compares volunteer and paid processing of social media data originating from climate change discussions on Twitter. The same sample of Twitter messages discussing climate change was offered for processing to volunteer workers through the Climate Tweet project and to paid workers through the Amazon MTurk platform. We found that paid crowdsourcing required a high-redundancy data processing design to obtain quality comparable with volunteered processing. Among the methods applied to improve data processing accuracy, limiting the geographical locations of the paid workers appeared the most productive. Conversely, we did not find significant geographical differences in the accuracy of data processed by volunteer workers. We suggest that the main driver of the observed pattern is the difference in the paid workers’ familiarity with the research topic.
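The high-redundancy design mentioned in the abstract implies that each tweet is labeled by several workers and the redundant labels are then aggregated into a single answer. A minimal sketch of one common aggregation scheme, majority voting, is shown below; the paper's exact procedure is not specified on this page, and the label names and example data are hypothetical.

```python
from collections import Counter

def majority_label(labels):
    """Return the most common label among redundant annotations of one item.

    With an even number of annotators, ties are broken by the order in which
    labels first appear, so an odd redundancy level is usually preferable.
    """
    if not labels:
        raise ValueError("no labels provided")
    return Counter(labels).most_common(1)[0][0]

# Hypothetical labels assigned to a single tweet by five paid workers:
tweet_labels = ["relevant", "relevant", "irrelevant", "relevant", "irrelevant"]
print(majority_label(tweet_labels))  # -> relevant
```

Higher redundancy (more workers per tweet) raises cost linearly but lets majority voting absorb individual labeling errors, which is consistent with the abstract's finding that paid processing needed such a design to match volunteer quality.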

Keywords: citizen-scientist; climate change; crowdsourcing; MTurk; social networks; Twitter
JEL-codes: O13 Q Q0 Q2 Q3 Q5 Q56
Date: 2017
Citations: 3 (in EconPapers)

Downloads: (external link)
https://www.mdpi.com/2071-1050/9/11/2019/pdf (application/pdf)
https://www.mdpi.com/2071-1050/9/11/2019/ (text/html)


Persistent link: https://EconPapers.repec.org/RePEc:gam:jsusta:v:9:y:2017:i:11:p:2019-:d:117578

Sustainability is currently edited by Ms. Alexandra Wu


 
Page updated 2025-03-24
Handle: RePEc:gam:jsusta:v:9:y:2017:i:11:p:2019-:d:117578