Annotation Tool for Precise Emotion Ground Truth Label Acquisition while Watching 360° VR Videos

Conference Paper (2020)
Author(s)

Tong Xue (Beijing Institute of Technology)

Abdallah El Ali (Centrum Wiskunde & Informatica (CWI))

Gangyi Ding (Beijing Institute of Technology)

Pablo Cesar (Centrum Wiskunde & Informatica (CWI), TU Delft - Multimedia Computing)

Multimedia Computing
Copyright
© 2020 Tong Xue, Abdallah El Ali, Gangyi Ding, Pablo Cesar
DOI related publication
https://doi.org/10.1109/AIVR50618.2020.00076
Publication Year
2020
Language
English
Pages (from-to)
371-372
ISBN (electronic)
9781728174631
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

We demonstrate an HMD-based annotation tool for collecting precise emotion ground truth labels while users are watching 360° videos in Virtual Reality (VR). Our tool uses an HTC VIVE Pro Eye HMD for displaying 360° videos, a Joy-Con controller for inputting emotion annotations, and an Empatica E4 wristband for capturing physiological signals. Timestamps of these devices are synchronized via an NTP server. Following dimensional emotion models, users can report their emotion in terms of valence and arousal as they watch a video in VR. Annotation feedback is provided through two peripheral visualization techniques: HaloLight and DotSize. Our annotation tool provides a starting point for researchers to design momentary and continuous self-reports in virtual environments to enable fine-grained emotion recognition.
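The abstract outlines an architecture in which the HMD, the Joy-Con annotation input, and the E4 wristband all log data against a shared NTP clock. As a rough illustration only (not the authors' implementation, which the paper describes), the Python sketch below shows how continuous valence-arousal ratings could be timestamped against an NTP-corrected local clock; the controller-reading stub, output file name, and sampling rate are placeholders.

```python
# Minimal sketch: log continuous valence-arousal annotations with
# NTP-corrected timestamps. Assumes a generic two-axis controller;
# all names and rates here are illustrative, not from the paper.
import csv
import time

import ntplib  # pip install ntplib


def ntp_offset(server="pool.ntp.org"):
    """Return the local-clock offset (in seconds) reported by an NTP server."""
    response = ntplib.NTPClient().request(server, version=3)
    return response.offset


def read_controller_axes():
    """Placeholder for polling the annotation controller.

    A real setup would read the Joy-Con stick axes and map them to
    valence and arousal in [-1, 1]; here a neutral rating is returned
    so the sketch runs without hardware.
    """
    return 0.0, 0.0


def log_annotations(path="annotations.csv", rate_hz=10, duration_s=5):
    offset = ntp_offset()
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["ntp_timestamp", "valence", "arousal"])
        for _ in range(int(rate_hz * duration_s)):
            valence, arousal = read_controller_axes()
            # Shift the local clock by the NTP offset so annotation rows can be
            # aligned with HMD and E4 streams synced to the same server.
            writer.writerow([time.time() + offset, valence, arousal])
            time.sleep(1.0 / rate_hz)


if __name__ == "__main__":
    log_annotations()
```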

Files

AIVR2020_RCEA_360VR.pdf
(pdf | 0.346 MB)
License info not available