Designing real-time, continuous emotion annotation techniques for 360° VR videos

Conference Paper (2020)
Author(s)

Tong Xue (Beijing Institute of Technology)

Surjya Ghosh (Centrum Wiskunde & Informatica (CWI))

Gangyi Ding (Beijing Institute of Technology)

Abdallah El Ali (Centrum Wiskunde & Informatica (CWI))

Pablo Cesar (Centrum Wiskunde & Informatica (CWI), TU Delft - Multimedia Computing)

DOI
https://doi.org/10.1145/3334480.3382895
Publication Year
2020
Language
English
ISBN (electronic)
9781450368193

Abstract

With the increasing availability of head-mounted displays (HMDs) that show immersive 360° VR content, it is important to understand to what extent these immersive experiences can evoke emotions. Typically, to collect emotion ground truth labels, users rate videos through post-experience self-reports that are discrete in nature. However, post-stimuli self-reports are temporally imprecise, especially after watching 360° videos. In this work, we design six continuous emotion annotation techniques for the Oculus Rift HMD aimed at minimizing workload and distraction. Based on a co-design session with six experts, we contribute HaloLight and DotSize, two continuous annotation methods deemed unobtrusive and easy to understand. We discuss the next challenges for evaluating the usability of these techniques and the reliability of continuous annotations.
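To illustrate what "continuous" annotation means in contrast to discrete post-stimulus ratings, the sketch below logs timestamped valence/arousal samples as a viewer adjusts them during playback, then resamples the sparse events into a regular trace. This is a minimal illustration only: the class name, the [-1, 1] rating scale, and the zero-order-hold resampling are assumptions for exposition, not the paper's HaloLight or DotSize implementations.

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class ContinuousAnnotationLog:
    """Collects (time, valence, arousal) samples during video playback.

    Illustrative sketch: valence and arousal are assumed to lie on a
    [-1, 1] scale (an assumption, not taken from the paper).
    """
    samples: List[Tuple[float, float, float]] = field(default_factory=list)

    def record(self, t: float, valence: float, arousal: float) -> None:
        # Clamp inputs to the assumed [-1, 1] annotation range.
        def clamp(x: float) -> float:
            return max(-1.0, min(1.0, x))
        self.samples.append((t, clamp(valence), clamp(arousal)))

    def resample(self, step: float) -> List[Tuple[float, float, float]]:
        """Zero-order-hold resampling to a fixed time step, so sparse
        controller events become a continuous trace for analysis."""
        if not self.samples:
            return []
        out: List[Tuple[float, float, float]] = []
        i = 0
        t = self.samples[0][0]
        t_end = self.samples[-1][0]
        while t <= t_end:
            # Advance to the most recent sample at or before time t.
            while i + 1 < len(self.samples) and self.samples[i + 1][0] <= t:
                i += 1
            _, v, a = self.samples[i]
            out.append((t, v, a))
            t += step
        return out
```

A trace like this can then be aligned with the video timeline, which is what makes continuous annotation temporally precise where a single post-experience rating is not.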

Metadata only record. There are no files for this record.