The strategy method conflates confusion with conditional cooperation in public goods games: evidence from large scale replications
Maxwell Burton-Chellew, Victoire D'Amico and Claire Guérin
No 7d5yn, SocArXiv from Center for Open Science
Abstract:
The strategy method is often used in public goods games to measure individuals' willingness to cooperate depending on the level of cooperation by others (conditional cooperation). However, while the strategy method is informative, it risks being suggestive and inducing elevated levels of conditional cooperation that are not motivated by concerns for fairness, especially among uncertain or confused participants. Here we had 845 participants complete the strategy method twice, once with human and once with computerized groupmates. Cooperation with computers cannot rationally be motivated by concerns for fairness. Worryingly, 69% of participants conditionally cooperated with computers, whereas only 7% conditionally cooperated with humans while not cooperating with computers. Overall, 83% of participants cooperated with computers, contributing 89% as much as they did towards humans. Results from games with computers present a serious problem for measuring social behaviors.
Date: 2021-12-04
New Economics Papers: this item is included in nep-exp
Citations: 1 (in EconPapers)
Downloads: https://osf.io/download/61aa7a32c7d9fb057feb38ed/
Persistent link: https://EconPapers.repec.org/RePEc:osf:socarx:7d5yn
DOI: 10.31219/osf.io/7d5yn
Bibliographic data for series maintained by OSF.