Trust in hybrid human‐automated decision‐support.
- Author
- Kares, Felix, König, Cornelius J., Bergs, Richard, Protzel, Clea, and Langer, Markus
- Subjects
- *TRUST, *EMPLOYEE selection, *ARTIFICIAL intelligence
- Abstract
Research has examined trust in humans and trust in automated decision support. Although hybrid human‐automation teams reflect a likely realization of decision support in high‐risk tasks such as personnel selection, trust in such teams has thus far received limited attention. In two experiments (N1 = 170, N2 = 154), we compare trust, trustworthiness, and trusting behavior for different types of decision support (automated, human, hybrid) across two assessment contexts (personnel selection, bonus payments). We additionally examined a possible trust violation by presenting one group of participants with a preselection that included predominantly male candidates, reflecting possible unfair bias. Whereas fully automated decisions were trusted less, results suggest that trust in hybrid decision support was similar to trust in human‐only support. Trust violations were not perceived differently based on the type of support. We discuss theoretical implications (e.g., trust in hybrid support) and practical implications (e.g., keeping humans in the loop to prevent negative reactions).
Practitioner points:
(a) What is currently known about the topic of our study: Automated decision support (DS), often fueled by artificial intelligence, can be perceived more negatively in selection tasks than human DS. The task context can modulate trust in automated DS. Depending on the agent (human or system), trust violations can be perceived differently.
(b) What our paper adds to this: In both examined contexts (selection for bonus payments; personnel selection), system DS was trusted less than human DS. Fairness issues in a decision negatively impacted trust but did not lead to different reactions based on the type of agent that produced them. In terms of trust, hybrid DS is perceived on par with human DS and better than system DS.
(c) The implications of our study findings for practitioners: Full automation of HR‐related tasks can be perceived negatively, even if the task may seem suited for automation. Avoid fairness issues in selection, as they negatively impact trust regardless of the agent that produced them. Adding a human to an automated DS (i.e., realizing hybrid DS) can alleviate the negative effects on trust associated with full automation. [ABSTRACT FROM AUTHOR]
- Published
- 2023