Repository hosted by TU Delft Library


Effects of a robotic storyteller's moody gestures on storytelling perception

Publication files not online.

Authors: Xu, J. · Broekens, J. · Hindriks, K. · Neerincx, M.A.
Publisher: Institute of Electrical and Electronics Engineers Inc.
Source: 2015 International Conference on Affective Computing and Intelligent Interaction, ACII 2015, 449-455
Identifier: 535456
DOI: 10.1109/ACII.2015.7344609
ISBN: 9781479999538
Article number: 7344609
Keywords: Body Language · Human Robot Interaction · Mood Expression · Social Robots · Storytelling · Human Computer Interaction · Intelligent Computing · Parameterization · Robotics · Robots · Behavior Model · Induction Process · Task Execution · Human & Operational Modelling · PCS - Perceptual and Cognitive Systems · ELSS - Earth, Life and Social Sciences


A parameterized behavior model was developed for robots to show mood during task execution. In this study, we applied the model to the coverbal gestures of a robotic storyteller. The study investigated whether parameterized mood expression can 1) show mood that changes over time; 2) reinforce affect communication when other modalities are present; 3) influence the mood induction process of the story; and 4) improve listeners' ratings of the storytelling experience and of the robotic storyteller. We modulated the gestures to express a mood either congruent or incongruent with the story mood. The results show that it is feasible to use parameterized coverbal gestures to express a mood evolving over time, and that participants can distinguish whether the mood expressed by the gestures is congruent or incongruent with the story mood. In terms of effects on participants, we found that mood-modulated gestures (a) influence participants' mood and (b) influence participants' ratings of the storytelling experience and of the robotic storyteller. © 2015 IEEE.
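To illustrate the idea of a parameterized behavior model, the sketch below shows one plausible way mood could modulate gesture parameters over the course of a story. The parameter names (amplitude, speed), the valence range, and the linear scaling are assumptions for illustration only; they are not taken from the paper's actual model.

```python
# Hypothetical sketch: mood-modulated coverbal gestures.
# Assumption: mood is a single valence value in [-1, 1], and positive
# mood yields larger, faster gestures via a linear scaling factor.

def modulate_gesture(base_amplitude: float, base_speed: float,
                     valence: float) -> dict:
    """Scale base gesture parameters by a mood valence in [-1, 1]."""
    valence = max(-1.0, min(1.0, valence))   # clamp to the valid range
    factor = 1.0 + 0.5 * valence             # illustrative linear mapping
    return {
        "amplitude": base_amplitude * factor,
        "speed": base_speed * factor,
    }

# A story can then drive the valence over time, e.g. per sentence,
# so the expressed mood evolves with the narrative:
story_valence = [0.2, 0.8, -0.5]  # hypothetical evolving story mood
gestures = [modulate_gesture(1.0, 1.0, v) for v in story_valence]
```

With this structure, a congruent condition would feed the story's own valence into the modulation, while an incongruent condition would feed in an opposing valence.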