Driver response times to auditory, visual, and tactile take-over requests

A simulator study with 101 participants

Conference Paper (2017)
Authors

Sebastiaan M. Petermeijer (Technische Universität München)

Fabian Doubek (Technische Universität München)

J. C. F. de Winter (TU Delft - Biomechatronics & Human-Machine Control)

Research Group
Biomechatronics & Human-Machine Control
Copyright
© 2017 S.M. Petermeijer, Fabian Doubek, J.C.F. de Winter
To reference this document use:
https://doi.org/10.1109/SMC.2017.8122827
Publication Year
2017
Language
English
Pages (from-to)
1505-1510
ISBN (print)
978-1-5386-1645-1
DOI:
https://doi.org/10.1109/SMC.2017.8122827
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Conditionally automated driving systems may soon be available on the market. Even though these systems exempt drivers from the driving task for extended periods of time, drivers are expected to take back control when the automation issues a so-called take-over request. This study investigated the interaction between take-over request modality and type of non-driving task with respect to the driver's reaction time. It was hypothesized that reaction times are higher when the non-driving task and the take-over request use the same modality. For example, auditory take-over requests were expected to be relatively ineffective in situations in which the driver is making a phone call. A total of 101 participants, divided into three groups, performed one of three non-driving tasks, namely reading (i.e., a visual task), calling (an auditory task), or watching a video (a visual/auditory task). Results showed that auditory and tactile take-over requests yielded overall faster reactions than visual take-over requests. The expected interaction between take-over modality and the dominant modality of the non-driving task was not found. As for self-reported usefulness, auditory and tactile take-over requests yielded higher scores than visual ones. In conclusion, it seems that auditory and tactile stimuli are equally effective as take-over requests, regardless of the non-driving task. Further study into the effects of realistic non-driving tasks is needed to identify which non-driving tasks are detrimental to safety in automated driving.

Files

08122827.pdf
(pdf | 0.666 MB)
- Embargo expired on 01-06-2018
License info not available