CEAP-360VR

A Continuous Physiological and Behavioral Emotion Annotation Dataset for 360 VR Videos

Journal Article (2021)
Author(s)

Tong Xue (Beijing Institute of Technology, Centrum Wiskunde & Informatica (CWI))

Abdallah El Ali (Centrum Wiskunde & Informatica (CWI))

T. Zhang (TU Delft - Multimedia Computing, Centrum Wiskunde & Informatica (CWI))

Gangyi Ding (Beijing Institute of Technology)

Pablo Cesar (TU Delft - Multimedia Computing, Centrum Wiskunde & Informatica (CWI))

Multimedia Computing
Copyright
© 2021 Tong Xue, Abdallah El Ali, T. Zhang, Gangyi Ding, Pablo Cesar
DOI related publication
https://doi.org/10.1109/TMM.2021.3124080
Publication Year
2021
Language
English
Bibliographical Note
Green Open Access added to TU Delft Institutional Repository 'You share, we take care!' – Taverne project https://www.openaccess.nl/en/you-share-we-take-care Otherwise as indicated in the copyright section: the publisher is the copyright holder of this work and the author uses the Dutch legislation to make this work public.
Volume number
25
Pages (from-to)
243-255
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Watching 360 videos using Virtual Reality (VR) head-mounted displays (HMDs) provides interactive and immersive experiences, where videos can evoke different emotions. Existing emotion self-report techniques within VR, however, are either retrospective or interrupt the immersive experience. To address this, we introduce the Continuous Physiological and Behavioral Emotion Annotation Dataset for 360 Videos (CEAP-360VR). We conducted a controlled study (N=32) where participants used a Vive Pro Eye HMD to watch eight validated affective 360 video clips and annotated their valence and arousal (V-A) continuously. We collected (a) behavioral signals (head and eye movements; pupillometry), (b) physiological responses (heart rate, skin temperature, electrodermal activity), (c) momentary emotion self-reports, (d) within-VR discrete emotion ratings, and (e) motion sickness, presence, and workload measures. We show the consistency of continuous annotation trajectories and verify their mean V-A annotations. We find high consistency between viewed 360 video regions across subjects, with higher consistency for eye than head movements. We furthermore run baseline classification experiments, where Random Forest classifiers with 2s segments show good accuracies for subject-independent models: 66.80% (V) and 64.26% (A) for binary classification; 49.92% (V) and 52.20% (A) for 3-class classification. Our open dataset allows further experiments with continuous emotion self-reports collected in 360 VR environments, which can enable automatic assessment of immersive Quality of Experience (QoE) and momentary affective states.
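The subject-independent baseline described above can be approximated with a minimal sketch: windowed features from 2 s signal segments, a Random Forest classifier, and leave-one-subject-out cross-validation so that no subject appears in both training and test folds. The data shapes, feature choices, and random labels below are illustrative assumptions, not the authors' actual pipeline or results.

```python
# Minimal sketch of a subject-independent Random Forest baseline on
# 2 s signal segments (illustrative; not the authors' exact pipeline).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)

# Toy data: 8 subjects x 40 segments, 12 features per 2 s segment
# (e.g. mean/std of heart rate, EDA, skin temperature, pupil diameter).
n_subjects, n_segments, n_features = 8, 40, 12
X = rng.normal(size=(n_subjects * n_segments, n_features))
y = rng.integers(0, 2, size=n_subjects * n_segments)   # binary valence labels (random here)
groups = np.repeat(np.arange(n_subjects), n_segments)  # subject ID per segment

# Leave-one-subject-out CV approximates "subject-independent" evaluation:
# each fold tests on one held-out subject never seen during training.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=LeaveOneGroupOut(), groups=groups)
print(f"mean accuracy over {len(scores)} held-out subjects: {scores.mean():.3f}")
```

With random labels the sketch should hover near chance (about 0.5); the point is the evaluation protocol, in which grouping by subject prevents segments from the same person from leaking between train and test folds.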

Files

CEAP_360VR_A_Continuous_Physio... (pdf)
(pdf | 6.71 MB)
- Embargo expired in 01-07-2023
License info not available