Investigating Affective Responses toward In-Video Pedestrian Crossing Actions using Camera and Physiological Sensors

Conference Paper (2022)
Author(s)

Shruti Rao (Centrum Wiskunde & Informatica (CWI))

Surjya Ghosh (Birla Institute of Technology and Science, Pilani)

Gerard Pons Rodriguez (Centrum Wiskunde & Informatica (CWI))

Thomas Röggla (Centrum Wiskunde & Informatica (CWI))

Abdallah El Ali (Centrum Wiskunde & Informatica (CWI))

Pablo Cesar (Centrum Wiskunde & Informatica (CWI), TU Delft - Multimedia Computing)

Multimedia Computing
Copyright
© 2022 Shruti Rao, Surjya Ghosh, Gerard Pons Rodriguez, Thomas Röggla, Abdallah El Ali, Pablo Cesar
DOI related publication
https://doi.org/10.1145/3543174.3546842
Publication Year
2022
Language
English
Pages (from-to)
226-235
ISBN (print)
978-1-4503-9415-4
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Automatically inferring drivers' emotions during driver-pedestrian interactions to improve road safety remains a challenge for designing in-vehicle, empathic interfaces. To that end, we carried out a lab-based study using a combination of camera and physiological sensors. We collected participants' (N=21) real-time affective responses (emotion self-reports, heart rate, pupil diameter, skin conductance, and facial temperature) toward non-verbal pedestrian crossing videos from the Joint Attention for Autonomous Driving (JAAD) dataset. Our findings reveal that positive, non-verbal pedestrian crossing actions in the videos elicit higher valence ratings from participants, while non-positive actions elicit higher arousal. Different pedestrian crossing actions in the videos also significantly influence participants' physiological signals (heart rate, pupil diameter, skin conductance) and facial temperature. Our findings provide a first step toward enabling in-car empathic interfaces that draw on behavioural and physiological sensing to infer driver emotions in situ during non-verbal pedestrian interactions.