Demo Abstract: Catch My Eye

Gaze-Based Activity Recognition in an Augmented Reality Art Gallery

Conference Paper (2022)
Author(s)

Tim Scargill (Duke University)

Guohao Lan (TU Delft - Embedded Systems)

Maria Gorlatova (Duke University)

Research Group
Embedded Systems
Copyright
© 2022 Tim Scargill, G. Lan, Maria Gorlatova
DOI related publication
https://doi.org/10.1109/IPSN54338.2022.00052
Publication Year
2022
Language
English
Pages (from-to)
503-504
ISBN (print)
978-1-6654-9625-4
ISBN (electronic)
978-1-6654-9624-7
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

The personalization of augmented reality (AR) experiences based on environmental and user context is key to unlocking their full potential. The recent addition of eye tracking to AR headsets provides a convenient method for detecting user context, but complex analysis of raw gaze data is required to detect where a user's attention and thoughts truly lie. In this demo we present Catch My Eye, the first system to incorporate deep neural network (DNN)-based activity recognition from user gaze into a realistic mobile AR app. We develop an edge computing-based architecture to offload context computation from resource-constrained AR devices, and present a working example of content adaptation based on user context, for the scenario of a virtual art gallery. Our demo shows that user activities can be accurately recognized and employed with sufficiently low latency for practical AR applications.

Files

Demo_Abstract_Catch_My_Eye_Gaz... (pdf, 0.416 MB)
- Embargo expired in 01-07-2023
License info not available