Authored


CorrNet

Fine-grained emotion recognition for video watching using wearable physiological sensors

Recognizing user emotions while they watch short-form videos anytime and anywhere is essential for facilitating video content customization and personalization. However, most works either classify a single emotion per video stimulus, or are restricted to static, desktop environments ...

CEAP-360VR

A Continuous Physiological and Behavioral Emotion Annotation Dataset for 360° VR Videos

Watching 360° videos using Virtual Reality (VR) head-mounted displays (HMDs) provides interactive and immersive experiences, where videos can evoke different emotions. Existing emotion self-report techniques within VR, however, are either retrospective or interrupt the immersive experience ...

On Fine-grained Temporal Emotion Recognition in Video

How to Trade off Recognition Accuracy with Annotation Complexity?

Fine-grained emotion recognition is the process of automatically identifying the emotions of users at a fine granularity level, typically in time intervals of 0.5 s to 4 s according to the expected duration of emotions. Previous work mainly focused on developing algorithms to recognize ...
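The 0.5 s to 4 s granularity mentioned above translates directly into how a continuously sampled signal is cut into segments that each receive their own emotion label. The sketch below only illustrates that segmentation step; the 4 Hz sampling rate and 2 s window are assumptions for the example and do not come from the paper.

```python
import numpy as np

def segment_signal(signal, fs, win_s=2.0):
    """Split a 1-D physiological signal into non-overlapping windows.

    signal : 1-D array sampled at fs Hz
    fs     : sampling rate in Hz (assumed, e.g. 4 Hz for a wearable sensor)
    win_s  : window length in seconds; 0.5-4 s matches the expected
             duration of an emotion episode mentioned above
    """
    win = int(round(win_s * fs))
    n_segments = len(signal) // win
    # drop the trailing partial window and reshape into (segments, samples)
    return signal[: n_segments * win].reshape(n_segments, win)

# toy example: 60 s of a 4 Hz skin-conductance trace -> 30 two-second segments,
# each of which would be paired with its own fine-grained emotion label
sc = np.random.rand(60 * 4)
segments = segment_signal(sc, fs=4, win_s=2.0)
print(segments.shape)  # (30, 8)
```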

RCEA

Real-time, Continuous Emotion Annotation for Collecting Precise Mobile Video Ground Truth Labels

Collecting accurate and precise emotion ground truth labels for mobile video watching is essential for ensuring meaningful predictions. However, video-based emotion annotation techniques either rely on post-stimulus discrete self-reports, or allow real-time, continuous emotion annotation ...

CorrFeat

Correlation-based feature extraction algorithm using skin conductance and pupil diameter for emotion recognition

To recognize emotions using less obtrusive wearable sensors, we present a novel emotion recognition method that uses only pupil diameter (PD) and skin conductance (SC). Psychological studies show that these two signals are related to the attention level of humans exposed to visual ...
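The abstract names correlation between PD and SC as the basis for feature extraction. The sketch below is not the CorrFeat algorithm itself; it only illustrates the general idea with per-window statistics plus a Pearson correlation between the two signals, and the window length, sampling rate, and units are assumptions for the example.

```python
import numpy as np

def corr_features(pd_win, sc_win):
    """Illustrative correlation-based features for one time window.

    pd_win, sc_win : 1-D arrays of pupil diameter and skin conductance,
                     already resampled to the same length (assumption).
    Returns per-signal statistics plus their Pearson correlation.
    """
    feats = []
    for x in (pd_win, sc_win):
        feats.extend([x.mean(), x.std(), x.max() - x.min()])
    # cross-signal correlation: a simple stand-in for correlation-based
    # feature extraction between the two modalities
    feats.append(np.corrcoef(pd_win, sc_win)[0, 1])
    return np.array(feats)

# toy usage: one 2 s window with both signals resampled to 60 Hz
rng = np.random.default_rng(0)
pd_win = rng.normal(3.5, 0.2, 120)   # pupil diameter in mm
sc_win = rng.normal(0.8, 0.05, 120)  # skin conductance in microsiemens
print(corr_features(pd_win, sc_win))
```

In practice such features would be computed per segment (as in the segmentation sketch above) and passed to a classifier.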

Instead of predicting just one emotion for one activity (e.g., video watching), fine-grained emotion recognition enables more temporally precise recognition. Previous works on fine-grained emotion recognition require segment-by-segment, fine-grained emotion labels to train the recognition ...

Fine-grained emotion recognition can model the temporal dynamics of emotions, which is more precise than predicting one emotion retrospectively for an activity (e.g., video clip watching). Previous works require large amounts of continuously annotated data to train an accurate recognition ...