Emotion Recognition in Virtual Reality

Creation and validation of a VR-based multi-modal emotion recognition dataset

Abstract

Emotion recognition in Virtual Reality (VR) has the potential to offer benefits across sectors such as mental healthcare, education, marketing, and entertainment. Although emotion recognition itself is a mature field, the sub-field of VR-based emotion recognition is still in its early stages of development. A limiting factor in the progress of this field is the lack of sufficient data for the research and development of advanced deep learning models. In addition, the equipment currently used to measure emotion-related signals is expensive and impractical for general use. This thesis aims to support progress in the field by creating a VR-based emotion recognition dataset using VR equipment only, addressing the shortage of data available for research and development while reducing the reliance on expensive and impractical measurement equipment.

Creating a good-quality dataset required addressing several key aspects. First, the stimuli used to evoke emotions had to be carefully selected so that genuine emotional responses were elicited and recorded in the dataset. Next, an efficient data collection system had to be built so that the collection process ran smoothly and consistently. A proper labeling procedure then had to be designed to annotate the data as accurately as possible. Finally, the compiled dataset was validated by showing that the chosen stimuli were effective in evoking the intended emotions, which was verified through analysis of pupil response data, one of the recorded modalities.
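
As a rough illustration of this kind of stimulus validation (a minimal sketch, not the exact procedure used in the thesis), the snippet below computes baseline-corrected pupil dilation per trial and compares two assumed stimulus conditions (e.g. high- vs. low-arousal clips) with a paired t-test. The function names, the 120-sample baseline window, and the two-condition setup are illustrative assumptions.

```python
import numpy as np
from scipy import stats


def baseline_corrected_dilation(pupil_trace, baseline_samples=120):
    """Mean pupil dilation during the stimulus window, relative to the
    pre-stimulus baseline (baseline length is an assumed example value)."""
    baseline = np.mean(pupil_trace[:baseline_samples])
    return np.mean(pupil_trace[baseline_samples:]) - baseline


def compare_conditions(traces_cond_a, traces_cond_b):
    """Paired t-test on per-participant mean dilation for two stimulus
    conditions, e.g. high- vs. low-arousal stimuli (hypothetical split)."""
    a = np.array([baseline_corrected_dilation(t) for t in traces_cond_a])
    b = np.array([baseline_corrected_dilation(t) for t in traces_cond_b])
    return stats.ttest_rel(a, b)


if __name__ == "__main__":
    # Synthetic traces standing in for recorded pupil-diameter signals.
    rng = np.random.default_rng(0)
    high_arousal = [rng.normal(3.2, 0.1, 600) for _ in range(20)]
    low_arousal = [rng.normal(3.0, 0.1, 600) for _ in range(20)]
    print(compare_conditions(high_arousal, low_arousal))
```

A significant difference in baseline-corrected dilation between conditions would support the claim that the selected stimuli elicit distinguishable physiological responses.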