EyeSyn

Psychology-inspired Eye Movement Synthesis for Gaze-based Activity Recognition

Conference Paper (2022)
Author(s)

Guohao Lan (TU Delft - Embedded Systems)

Tim Scargill (Duke University)

Maria Gorlatova (Duke University)

Research Group
Embedded Systems
Copyright
© 2022 G. Lan, Tim Scargill, Maria Gorlatova
DOI related publication
https://doi.org/10.1109/IPSN54338.2022.00026
Publication Year
2022
Language
English
Bibliographical Note
Green Open Access added to TU Delft Institutional Repository 'You share, we take care!' - Taverne project (https://www.openaccess.nl/en/you-share-we-take-care). Otherwise, as indicated in the copyright section: the publisher is the copyright holder of this work, and the author uses Dutch legislation to make this work public.
Pages (from-to)
233-246
ISBN (print)
978-1-6654-9625-4
ISBN (electronic)
978-1-6654-9624-7
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Recent advances in eye tracking have given birth to a new genre of gaze-based context sensing applications, ranging from cognitive load estimation to emotion recognition. To achieve state-of-the-art recognition accuracy, a large-scale, labeled eye movement dataset is needed to train deep learning-based classifiers. However, due to the heterogeneity in human visual behavior, as well as the labor-intensive and privacy-compromising data collection process, datasets for gaze-based activity recognition are scarce and hard to collect. To alleviate the sparse gaze data problem, we present EyeSyn, a novel suite of psychology-inspired generative models that leverages only publicly available images and videos to synthesize a realistic and arbitrarily large eye movement dataset. Taking gaze-based museum activity recognition as a case study, our evaluation demonstrates that EyeSyn can not only replicate the distinct patterns in the actual gaze signals that are captured by an eye tracking device, but also simulate the signal diversity that results from different measurement setups and subject heterogeneity. Moreover, in the few-shot learning scenario, EyeSyn can be readily incorporated with either transfer learning or meta-learning to achieve 90% accuracy, without the need for a large-scale dataset for training.
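
To illustrate the transfer-learning path the abstract mentions, the following is a minimal PyTorch sketch, not the authors' released code: a classifier is pretrained on a large synthetic gaze dataset, then fine-tuned on a handful of real labeled samples. The GazeActivityNet architecture, the (x, y) sequence format, the four-class setup, and the random stand-in tensors are all illustrative assumptions; in practice the synthetic sequences would come from an EyeSyn-style generator.

    import torch
    import torch.nn as nn

    # Hypothetical gaze-based activity classifier: a small 1D CNN over
    # sequences of (x, y) gaze coordinates. Layer sizes and the number of
    # activity classes are illustrative, not taken from the paper.
    class GazeActivityNet(nn.Module):
        def __init__(self, num_classes=4):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(2, 32, kernel_size=5, padding=2),  # 2 channels: x, y
                nn.ReLU(),
                nn.MaxPool1d(2),
                nn.Conv1d(32, 64, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),  # pool over time -> fixed-size feature
            )
            self.classifier = nn.Linear(64, num_classes)

        def forward(self, x):  # x: (batch, 2, seq_len)
            return self.classifier(self.features(x).squeeze(-1))

    def train(model, xs, ys, epochs, lr):
        # Optimize only parameters that are not frozen.
        params = [p for p in model.parameters() if p.requires_grad]
        opt = torch.optim.Adam(params, lr=lr)
        loss_fn = nn.CrossEntropyLoss()
        for _ in range(epochs):
            opt.zero_grad()
            loss = loss_fn(model(xs), ys)
            loss.backward()
            opt.step()

    # Stage 1: pretrain on an arbitrarily large synthetic dataset
    # (random tensors stand in for synthesized gaze sequences here).
    model = GazeActivityNet()
    syn_x, syn_y = torch.randn(256, 2, 500), torch.randint(0, 4, (256,))
    train(model, syn_x, syn_y, epochs=20, lr=1e-3)

    # Stage 2: few-shot fine-tuning on a few real labeled recordings,
    # freezing the feature extractor and updating only the classifier head.
    for p in model.features.parameters():
        p.requires_grad = False
    real_x, real_y = torch.randn(8, 2, 500), torch.randint(0, 4, (8,))
    train(model, real_x, real_y, epochs=50, lr=1e-3)

Freezing the convolutional features and adapting only the linear head is one common few-shot transfer recipe; the paper's meta-learning variant would replace Stage 2 with an episodic adaptation procedure instead.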

Files

EyeSyn_Psychology_inspired_Eye... (pdf)
(pdf | 1.28 MB)
- Embargo expired on 01-07-2023
License info not available