Eye-based driver state monitor of distraction, drowsiness, and cognitive load for transitions of control in automated driving
Christopher D.D. Cabrall (TU Delft - Intelligent Vehicles & Cognitive Robotics)
Joel Goncalves (Technische Universität München)
Alberto Morando (Chalmers University of Technology)
Matthew Sassman (Institut Français des Sciences et Technologies des Transports (IFSTTAR))
Joost de Winter (TU Delft - Intelligent Vehicles & Cognitive Robotics)
Abstract
Automated driving vehicles of the future will most likely include multiple modes and levels of operation and thus involve various transitions of control (ToC) between human and machine. Traditional activation devices (e.g., knobs, switches, buttons, and touchscreens) may be confused by operators with other system controls and are susceptible to inappropriate use. Non-intrusive eye-tracking measures can automatically assess driver states (i.e., distraction, drowsiness, and cognitive overload) to trigger manual-to-automation ToC, and can verify driver readiness during automation-to-manual ToC. This brief system description/demonstration paper gives an overview of our integrated driver state monitor, which combines gaze position, gaze variability, and eyelid opening with the external environmental complexity of the driving scene to facilitate ToC in automated driving. As both driver-facing and forward-facing cameras become increasingly commonplace, and even legally mandated, in various automated driving vehicles, our integrated system helps inform future research and development towards improved human-computer interaction and driving safety.
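The abstract describes a monitor that fuses gaze position, gaze variability, eyelid opening, and scene complexity to classify driver state and steer the direction of a transition of control. The sketch below illustrates one way such a fusion could be structured; all thresholds, state labels, and the scene-complexity scale are illustrative assumptions, not the paper's calibrated values or actual pipeline.

```python
from dataclasses import dataclass

@dataclass
class EyeSample:
    gaze_x: float          # horizontal gaze, normalized; 0 = road center (assumed convention)
    gaze_y: float          # vertical gaze, same normalization
    eyelid_opening: float  # 0.0 (fully closed) .. 1.0 (fully open)

def assess_driver_state(samples, scene_complexity):
    """Classify driver state from a window of eye-tracking samples.

    scene_complexity is a hypothetical 0..1 score from the forward-facing camera.
    Thresholds below are placeholders for illustration only.
    """
    n = len(samples)
    # Distraction proxy: fraction of samples with gaze far from the road center.
    off_road = sum(1 for s in samples
                   if abs(s.gaze_x) > 0.3 or abs(s.gaze_y) > 0.3) / n
    # Drowsiness proxy: PERCLOS-like fraction of samples with mostly-closed eyelids.
    perclos = sum(1 for s in samples if s.eyelid_opening < 0.2) / n
    # Cognitive-load proxy: low gaze variability (gaze concentration) while the
    # external scene is complex.
    mean_x = sum(s.gaze_x for s in samples) / n
    var_x = sum((s.gaze_x - mean_x) ** 2 for s in samples) / n

    if perclos > 0.15:
        return "drowsy"
    if off_road > 0.4:
        return "distracted"
    if scene_complexity > 0.7 and var_x < 0.01:
        return "cognitive_overload"
    return "ready"

def transition_decision(state):
    """Map a driver state to a transfer-of-control action: a ready driver may
    receive control (automation-to-manual); any degraded state triggers
    manual-to-automation ToC instead."""
    if state == "ready":
        return "allow_automation_to_manual"
    return "trigger_manual_to_automation"
```

In this sketch, a degraded state blocks the automation-to-manual handover and instead triggers the opposite transition, mirroring the dual role the abstract assigns to the monitor (trigger and readiness check).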