Do you trust your autonomous vehicle? A story of modality
Autonomous Vehicles and Trust
S. Avgousti (TU Delft - Electrical Engineering, Mathematics and Computer Science)
Myrthe L. Tielman – Mentor (TU Delft - Interactive Intelligence)
Abstract
As vehicles advance toward full autonomy, SAE Level 5 systems are typically designed and envisioned without driver controls. While this promises convenience and safety, it also raises challenges for user trust and can leave users uncomfortable with having no agency. This study investigates how explanation modality (vocal vs. text) and optional control mechanisms influence trust and intervention behavior in a simulated SAE Level 5 context. Thirty-six participants completed three VR driving scenarios that varied in explanation and control design. Trust and intervention behavior were measured, and open-ended feedback was analyzed thematically. Results showed that vocal explanations increased trust more than text explanations, though the difference was not statistically significant. The presence of control buttons, however, significantly enhanced trust among participants who perceived them positively. These participants also intervened less often, though this effect was not statistically significant. Exploratory analyses revealed that self-reported comfort with automation was associated with higher trust and lower intervention rates. These findings challenge the SAE Level 5 assumption that no user input is needed: even minimal, optional control features can foster trust and reduce unnecessary interventions. The study underscores the value of designing autonomous systems that maintain transparency and user agency, supporting safer and more acceptable human-AI interaction.