Relating Human Gaze and Manual Control Behavior in Preview Tracking Tasks with Spatial Occlusion

Conference Paper (2018)
Author(s)

Evgeny Rezunenko (Student TU Delft)

Kasper van der El (TU Delft - Control & Simulation)

D. M. Pool (TU Delft - Control & Simulation)

M. M. (René) van Paassen (TU Delft - Control & Simulation)

Max Mulder (TU Delft - Control & Operations)

Research Group
Control & Simulation
Copyright
© 2018 Evgeny Rezunenko, Kasper van der El, D.M. Pool, M.M. van Paassen, Max Mulder
DOI related publication
https://doi.org/10.1109/SMC.2018.00583
Publication Year
2018
Language
English
Pages (from-to)
3430-3435
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

In manual tracking tasks with preview of the target trajectory, humans have been modeled as dual-mode “near” and “far” viewpoint controllers. This paper investigates the physical basis of these two control mechanisms and studies whether the estimated viewpoint positions represent the parts of the previewed trajectory that humans actually use for control. Combined human gaze and control data were obtained in an experiment that compared tracking with full preview (1.5 s), occluded preview, and no preview. System identification is applied to estimate the two look-ahead time parameters of a two-viewpoint preview model. The results show that humans often focus their gaze around the model’s near-viewpoint position, and seldom at the far viewpoint. Gaze measurements may thus augment control data for online identification of preview control behavior, which could improve personalized monitoring or shared-control systems in vehicles.
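To illustrate the dual-mode structure mentioned in the abstract, the sketch below shows one simplified, discrete-time reading of a two-viewpoint preview control law: a near-viewpoint term that corrects the error between the previewed target and the current output, plus a far-viewpoint feedforward term on the upcoming target. The function name, gains, and look-ahead times are illustrative assumptions; the identified model in the paper additionally includes filtering and neuromuscular dynamics that are omitted here.

def two_viewpoint_control(target, output, t_idx, dt,
                          tau_near=0.3, tau_far=1.0,
                          K_near=1.0, K_far=0.8):
    """Illustrative two-viewpoint preview control law (simplified sketch).

    target : sequence of previewed target samples f_t[k]
    output : sequence of controlled-system output samples y[k]
    t_idx  : current sample index
    dt     : sample time in seconds
    Gains and look-ahead times are placeholder values, not identified
    parameters from the paper.
    """
    # Indices of the near and far viewpoints along the previewed trajectory.
    k_near = min(t_idx + int(round(tau_near / dt)), len(target) - 1)
    k_far = min(t_idx + int(round(tau_far / dt)), len(target) - 1)
    # Near viewpoint: closed-loop correction of the error w.r.t. the current output.
    u_near = K_near * (target[k_near] - output[t_idx])
    # Far viewpoint: open-loop (feedforward) response to the upcoming target.
    u_far = K_far * target[k_far]
    return u_near + u_far

# Hypothetical usage with recorded signal samples:
# u = two_viewpoint_control(target=ft_samples, output=y_samples, t_idx=100, dt=0.01)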
