Eye-based driver state monitor of distraction, drowsiness, and cognitive load for transitions of control in automated driving

Conference Paper (2016)
Author(s)

Christopher D.D. Cabrall (TU Delft - OLD Intelligent Vehicles & Cognitive Robotics)

Nico Janssen

Joel Goncalves (Technische Universität München)

Alberto Morando (Chalmers University of Technology)

Matthew Sassman (Institut Francais des Sciences et Technologies des Transports (IFSTTAR))

Joost de Winter (TU Delft - OLD Intelligent Vehicles & Cognitive Robotics)

Research Group
OLD Intelligent Vehicles & Cognitive Robotics
Copyright
© 2016 C.D.D. Cabrall, Nico Janssen, Joel Goncalves, Alberto Morando, Matthew Sassman, J.C.F. de Winter
DOI related publication
https://doi.org/10.1109/SMC.2016.7844530
Publication Year
2016
Language
English
Pages (from-to)
1981-1982
ISBN (electronic)
978-1-5090-1897-0
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Automated driving vehicles of the future will most likely offer multiple modes and levels of operation and thus involve various transitions of control (ToC) between human and machine. Traditional activation devices (e.g., knobs, switches, buttons, and touchscreens) may be confused by operators with other system controls and are also susceptible to inappropriate use. Non-intrusive eye-tracking measures can assess driver states (i.e., distraction, drowsiness, and cognitive overload) automatically, both to trigger manual-to-automation ToC and to verify driver readiness during automation-to-manual ToC. This brief system description/demonstration paper gives an overview of our integrated driver state monitor, which combines gaze position, gaze variability, and eyelid opening with the external environmental complexity of the driving scene to facilitate ToC in automated driving. As driver-facing and forward-facing cameras become increasingly commonplace and even legally mandated in various automated driving vehicles, our integrated system helps inform relevant future research and development towards improved human-computer interaction and driving safety.
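To make the abstract's fusion of eye-tracking measures concrete, the sketch below shows one way gaze position, gaze variability, eyelid opening, and scene complexity could be combined into a driver-state classification. This is a minimal illustration only: the function, thresholds, and the PERCLOS-style drowsiness heuristic are assumptions for exposition, not the authors' actual monitor.

```python
# Hypothetical sketch of fusing eye-tracking measures into a driver-state
# classification for transitions of control (ToC). All names and thresholds
# are illustrative assumptions, not the paper's implementation.
from dataclasses import dataclass
from statistics import pstdev

@dataclass
class EyeSample:
    gaze_x: float          # horizontal gaze angle (deg), 0 = road centre
    gaze_y: float          # vertical gaze angle (deg)
    eyelid_opening: float  # fraction of full opening, 0.0 (closed) to 1.0

def assess_driver_state(samples, scene_complexity,
                        distraction_deg=20.0, perclos_limit=0.15,
                        variability_floor=1.0):
    """Return 'ready' or the limiting driver state for a window of samples.

    scene_complexity is a normalized 0-1 estimate from the forward camera.
    """
    # Distraction: mean gaze directed far away from the forward road scene.
    mean_x = sum(s.gaze_x for s in samples) / len(samples)
    if abs(mean_x) > distraction_deg:
        return "distracted"
    # Drowsiness: proportion of samples with eyes mostly closed
    # (a PERCLOS-like measure).
    perclos = sum(s.eyelid_opening < 0.2 for s in samples) / len(samples)
    if perclos > perclos_limit:
        return "drowsy"
    # Cognitive overload: low gaze variability (tunnel vision) despite a
    # complex external driving scene.
    variability = pstdev([s.gaze_x for s in samples])
    if scene_complexity > 0.7 and variability < variability_floor:
        return "cognitively overloaded"
    return "ready"
```

A monitor along these lines could trigger a manual-to-automation ToC whenever a non-"ready" state is returned, and require a sustained "ready" verdict before completing an automation-to-manual ToC.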

Files

07844530.pdf
(pdf | 2.12 MB)
- Embargo expired on 09-08-2017
License info not available