Humans in XAI: increased reliance in decision-making under uncertainty by using explanation strategies
- Author
- Olesja Lammert, Birte Richter, Christian Schütze, Kirsten Thommes, and Britta Wrede
- Subjects
- human-centered XAI, human-computer interaction, empirical studies in HCI, explanation strategies, explainability, user reliance, Economic theory. Demography, HB1-3840
- Abstract
- Introduction: Although decision support systems (DSS) that rely on artificial intelligence (AI) increasingly provide computer and data scientists with explanations of opaque features of the decision process, especially when it involves uncertainty, limited attention has been paid to making the process transparent to end users.
- Methods: This paper compares four distinct explanation strategies employed by a DSS, represented by the social agent Floka, designed to assist end users in making decisions under uncertainty. In an economic experiment with 742 participants who made lottery choices according to the Holt and Laury paradigm, we contrast two explanation strategies offering accurate information (transparent vs. guided) with two strategies prioritizing human-centered explanations (emotional vs. authoritarian) and a baseline (no explanation).
- Results and discussion: Our findings indicate that a guided explanation strategy results in higher user reliance than a transparent strategy. Furthermore, our results suggest that user reliance is contingent on the chosen explanation strategy and that, in some instances, the absence of an explanation can also lead to increased user reliance.
- Published
- 2024