
Does a lack of emotions make chatbots unfit to be psychotherapists?

Authors :
Rahsepar Meadi M
Bernstein JS
Batelaan N
van Balkom AJLM
Metselaar S
Source :
Bioethics. 2024 Jul; Vol. 38 (6), pp. 503-510. Date of Electronic Publication: 2024 May 12.
Publication Year :
2024

Abstract

Mental health chatbots (MHCBs) designed to support individuals in coping with mental health issues are rapidly advancing. Currently, these MHCBs are predominantly used in commercial rather than clinical contexts, but this might change soon. The question is whether this use is ethically desirable. This paper addresses a critical yet understudied concern: assuming that MHCBs cannot have genuine emotions, how this assumption may affect psychotherapy, and consequently the quality of treatment outcomes. We argue that if MHCBs lack emotions, they cannot have genuine (affective) empathy or utilise countertransference. Consequently, this gives reason to worry that MHCBs are (a) more liable to harm and (b) less likely to benefit patients than human therapists. We discuss some responses to this worry and conclude that further empirical research is necessary to determine whether these worries are valid. We conclude that, even if these worries are valid, it does not mean that we should never use MHCBs. By discussing the broader ethical debate on the clinical use of chatbots, we point towards how further research can help us establish ethical boundaries for how we should use mental health chatbots.

(© 2024 The Authors. Bioethics published by John Wiley & Sons Ltd.)

Details

Language :
English
ISSN :
1467-8519
Volume :
38
Issue :
6
Database :
MEDLINE
Journal :
Bioethics
Publication Type :
Academic Journal
Accession number :
38735049
Full Text :
https://doi.org/10.1111/bioe.13299