3ET: Efficient Event-based Eye Tracking using a Change-Based ConvLSTM Network

Conference Paper (2023)
Author(s)

Qinyu Chen (University of Zurich, ETH Zürich)

Zuowen Wang (University of Zurich, ETH Zürich)

Shih-Chii Liu (University of Zurich, ETH Zürich)

Chang Gao (TU Delft - Electronics)

Research Group
Electronics
DOI
https://doi.org/10.1109/BioCAS58349.2023.10389062
Publication Year
2023
Language
English
ISBN (print)
979-8-3503-0027-7
ISBN (electronic)
979-8-3503-0026-0
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

This paper presents a sparse Change-Based Convolutional Long Short-Term Memory (CB-ConvLSTM) model for event-based eye tracking, a key capability for next-generation wearable healthcare technology such as AR/VR headsets. We leverage the benefits of retina-inspired event cameras, namely their low-latency response and sparse output event stream, over traditional frame-based cameras. Our CB-ConvLSTM architecture efficiently extracts spatio-temporal features for pupil tracking from the event stream, outperforming conventional CNN structures. By utilizing a delta-encoded recurrent path that enhances activation sparsity, CB-ConvLSTM reduces arithmetic operations by approximately 4.7× without losing accuracy when tested on a v2e-generated event dataset of labeled pupils. This increase in efficiency makes it ideal for real-time eye tracking on resource-constrained devices. The project code and dataset are openly available at https://github.com/qinche106/cb-convlstm-eyetracking.
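The core idea behind the delta-encoded recurrent path is that between consecutive time steps most activations change little, so only changes exceeding a threshold need to be propagated and multiplied. The sketch below illustrates this principle in NumPy; the function name, threshold value, and reconstruction scheme are illustrative assumptions, not the authors' implementation (see the linked repository for the actual code).

```python
import numpy as np

def delta_encode(x_prev, x_curr, threshold=0.05):
    """Illustrative sketch of delta encoding a recurrent activation.

    Only element-wise changes whose magnitude exceeds `threshold` are
    kept; sub-threshold changes are zeroed, yielding a sparse delta
    tensor so downstream matrix multiplies can skip zero entries.
    Returns the sparse delta and the reconstructed activation.
    """
    delta = x_curr - x_prev
    mask = np.abs(delta) >= threshold       # which elements changed enough
    sparse_delta = np.where(mask, delta, 0.0)
    x_recon = x_prev + sparse_delta          # state seen by the next step
    return sparse_delta, x_recon

# Example: slowly varying activations produce a mostly-zero delta
x_prev = np.zeros(4)
x_curr = np.array([0.01, 0.2, -0.03, -0.5])
d, r = delta_encode(x_prev, x_curr, threshold=0.05)
print(d)  # small changes (0.01, -0.03) are suppressed to zero
```

The multiply savings come from the sparsity of `d`: a hardware or software kernel can skip the zeroed entries entirely, which is the source of the roughly 4.7× reduction in arithmetic operations reported in the abstract.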

Files

3ET_Efficient_Event-based_Eye_... (pdf)
(pdf | 2.71 MB)
- Embargo expired on 18-07-2024
License info not available