The most human bot: Female gendering increases humanness perceptions of bots and acceptance of AI
Sylvie Borau,
Tobias Otterbring,
Sandra Laporte and
Samuel Fosso Wamba
Additional contact information
Sylvie Borau: TBS - Toulouse Business School
Tobias Otterbring: UIA - University of Agder, Institute of Retail Economics
Sandra Laporte: TSE-R - Toulouse School of Economics (Université Toulouse Capitole, EHESS, CNRS, INRAE)
Samuel Fosso Wamba: TBS - Toulouse Business School
Post-Print from HAL
Abstract:
Companies have repeatedly launched Artificial Intelligence (AI) products, such as intelligent chatbots and robots, with female names, voices, and bodies. Previous research posits that people intuitively favor female over male bots, mainly because female bots are judged as warmer and more likely to experience emotions. We present five online studies, four of them preregistered, with a total sample of over 3,000 participants that move beyond this longstanding perception of femininity. Because warmth and experience (but not competence) are seen as fundamental qualities of a full human being yet are lacking in machines, we argue that people prefer female bots because they are perceived as more human than male bots. Using implicit, subtle, and blatant scales of humanness, our results consistently show that women (Studies 1A and 1B), female bots (Studies 2 and 3), and female chatbots (Study 4) are perceived as more human than their male counterparts when compared with non-human entities (animals and machines). Study 4 explicitly investigates the acceptance of gendered algorithms operated by AI chatbots in a health context: the female chatbot is preferred over the male chatbot because it is perceived as more human and more likely to consider users' unique needs. These results highlight the ethical quandary faced by AI designers and policymakers: AI is criticized for turning women into objects, yet injecting women's humanity into AI objects makes those objects seem more human and more acceptable.
Keywords: Algorithm aversion; Artificial intelligence; Gender; Gendered AI; Humanness; Machine ethics; Robot; Stereotypes; Trust; Uniqueness
Date: 2021-07
Citations: 22 (tracked in EconPapers)
Published in Psychology and Marketing, 2021, 38 (7), pp.1052-1068. ⟨10.1002/mar.21480⟩
Persistent link: https://EconPapers.repec.org/RePEc:hal:journl:hal-03648092
DOI: 10.1002/mar.21480
More papers in Post-Print from HAL
Bibliographic data for series maintained by CCSD.