When Humanizing Customer Service Chatbots Might Backfire

Rhonda Hadi
Additional contact information
Rhonda Hadi: Professor of Marketing, Saïd Business School, University of Oxford, United Kingdom

NIM Marketing Intelligence Review, 2019, vol. 11, issue 2, 30-35

Abstract: More and more companies are using chatbots in customer service: instead of interacting with a human employee, customers interact with a machine. Many companies give these chatbots human traits such as names, human-like appearances, a human voice or even character descriptions. Intuitively, such a humanization strategy seems like a good idea. Studies show, however, that the humanization of chatbots is perceived in a nuanced way and can also backfire. Especially in the context of customer complaints, human-like chatbots can intensify the negative reactions of angry customers, because their performance is judged more critically than that of non-humanized chatbot variants. Service managers should therefore consider carefully whether, and in which situations, to deploy humanized service chatbots.

Keywords: Chatbots; Customer Service; Angry Customers; Avatars; AI
Date: 2019

Downloads: (external link)
https://doi.org/10.2478/nimmir-2019-0013 (text/html)

Persistent link: https://EconPapers.repec.org/RePEc:vrs:gfkmir:v:11:y:2019:i:2:p:30-35:n:4

DOI: 10.2478/nimmir-2019-0013

NIM Marketing Intelligence Review is currently edited by Christine Kittinger-Rosanelli

More articles in NIM Marketing Intelligence Review from Sciendo
Bibliographic data for series maintained by Peter Golla.

 
Handle: RePEc:vrs:gfkmir:v:11:y:2019:i:2:p:30-35:n:4