
Pathways linking expectations for AI chatbots to loyalty: A moderated mediation analysis.

Authors :
Yao, Xintong
Xi, Yipeng
Source :
Technology in Society; Sep 2024, Vol. 78
Publication Year :
2024

Abstract

Despite the prevalence of generative AI chatbots such as GPT and Bard, scholarly inquiry into how users' enduring expectations influence their engagement with AI chatbots remains scant. Drawing on expectation violation theory, the present study examines how user expectations, informed by belief in machine heuristics and concerns over human uniqueness, affect chatbot loyalty through a moderated mediation framework. A questionnaire survey of 900 participants in China revealed that belief in machine capabilities bolsters users' perceptions of machine intelligence, which, in turn, enhances user loyalty. Interestingly, when users encounter service failures that challenge their expectations, their perceived intelligence of the chatbot intensifies rather than diminishes. In contrast, expectations shaped by human uniqueness concerns diminish users' perceptions of machine intelligence and, consequently, their loyalty, an effect that remains consistent regardless of whether users encounter AI failure. The study also discusses the theoretical contributions of these findings to the evolution of expectation violation theory within the sphere of human-robot interaction.

• Beliefs in machine heuristics enhance user loyalty both directly and indirectly via the perceived intelligence of chatbots.
• Human uniqueness concerns do not directly affect loyalty but influence it through perceived intelligence.
• Negative expectation violations lead to more positive reactions than when expectations are met.
• Understanding user engagement with AI chatbots requires considering the extent and nature of expectation violations.

[ABSTRACT FROM AUTHOR]
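The moderated mediation framework described in the abstract corresponds to a standard regression-based setup: a mediator model in which the two expectation beliefs predict perceived machine intelligence, moderated by whether an AI service failure was encountered, and an outcome model in which perceived intelligence and the direct paths predict loyalty. The sketch below is a minimal, hypothetical illustration in Python on simulated data; the variable names (machine_heuristic, human_uniqueness, perceived_intelligence, ai_failure, loyalty) mirror the abstract's constructs, but the exact specification, measures, and estimation procedure are assumptions rather than the authors' actual analysis.

# Illustrative sketch only: a generic moderated-mediation specification
# (mediator model with interaction terms, plus an outcome model), NOT the
# study's actual analysis. Data are simulated; coefficients are arbitrary.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 900  # sample size reported in the abstract
df = pd.DataFrame({
    "machine_heuristic": rng.normal(size=n),          # belief in machine heuristics
    "human_uniqueness": rng.normal(size=n),           # human uniqueness concerns
    "ai_failure": rng.integers(0, 2, size=n),         # moderator: encountered a service failure
})
# Simulated mediator and outcome, for demonstration only
df["perceived_intelligence"] = (0.4 * df["machine_heuristic"]
                                - 0.3 * df["human_uniqueness"]
                                + 0.2 * df["machine_heuristic"] * df["ai_failure"]
                                + rng.normal(size=n))
df["loyalty"] = (0.5 * df["perceived_intelligence"]
                 + 0.2 * df["machine_heuristic"]
                 + rng.normal(size=n))

# Mediator model: expectation beliefs -> perceived intelligence, moderated by AI failure
mediator_model = smf.ols(
    "perceived_intelligence ~ machine_heuristic * ai_failure"
    " + human_uniqueness * ai_failure", data=df).fit()
# Outcome model: perceived intelligence (plus direct paths) -> loyalty
outcome_model = smf.ols(
    "loyalty ~ perceived_intelligence + machine_heuristic + human_uniqueness",
    data=df).fit()

print(mediator_model.summary())
print(outcome_model.summary())

In this kind of setup, the indirect effect of each belief on loyalty runs through perceived_intelligence, and the interaction terms with ai_failure capture how an expectation violation conditions that first-stage path; confidence intervals for the conditional indirect effects would typically be obtained by bootstrapping.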

Details

Language :
English
ISSN :
0160-791X
Volume :
78
Database :
Supplemental Index
Journal :
Technology in Society
Publication Type :
Academic Journal
Accession Number :
179274215
Full Text :
https://doi.org/10.1016/j.techsoc.2024.102625