The aim of our work is to design bodily mood expressions for humanoid robots in interactive settings that users can recognize and that have (positive) effects on the people who interact with the robots. To this end, we develop a parameterized behavior model for humanoid robots to express mood through body language. Different settings of the parameters, which control the spatial extent and motion dynamics of a behavior, result in different behavior appearances expressing different moods. In this study, we applied the behavior model to the gestures of an imitation game performed by the NAO robot to display either a positive or a negative mood. We address the question of whether robot mood, displayed simultaneously with the execution of functional behaviors in a task, can (a) be recognized by participants and (b) produce contagion effects. Mood contagion is an automatic mechanism that induces a congruent mood state through the observation of another person’s emotional expression. In addition, we varied task difficulty to investigate how task load mediates these effects. Our results show that participants are able to differentiate between positive and negative robot mood and to recognize the behavioral cues (the parameters) we manipulated. Moreover, self-reported mood matched the mood expressed by the robot in the easy task condition. Additional evidence for mood contagion is provided by the fact that we were able to replicate a known effect of negative mood on task performance: participants in the negative mood condition performed better on difficult tasks than those in the positive mood condition, even though participants’ self-reported mood did not match that of the robot. © 2015, The Author(s).
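
The core idea of the parameterized behavior model can be illustrated with a minimal sketch: a single mood value modulates the spatial extent and motion dynamics of an otherwise unchanged functional gesture. The function name, keyframe representation, and scaling factors below are hypothetical illustrations, not the authors' actual model or parameter values.

```python
def modulate_gesture(keyframes, mood):
    """Scale a gesture's spatial extent and speed by a mood value in [-1, 1].

    keyframes: list of (joint_angles, duration) pairs, where joint_angles
    is a list of floats (radians) and duration is in seconds.
    Positive mood yields larger, faster motion; negative mood yields
    smaller, slower motion. The 0.3/0.4 gains are illustrative only.
    """
    amplitude = 1.0 + 0.3 * mood   # spatial-extent parameter
    speed = 1.0 + 0.4 * mood       # motion-dynamics parameter
    return [
        ([a * amplitude for a in angles], duration / speed)
        for angles, duration in keyframes
    ]

# A two-keyframe gesture: joint targets (radians) and segment durations (s).
wave = [([0.2, 0.5], 0.8), ([0.6, 0.1], 0.8)]
positive = modulate_gesture(wave, 1.0)   # wider, quicker variant
negative = modulate_gesture(wave, -1.0)  # narrower, slower variant
```

Because the mood parameters only rescale an existing gesture, the robot can express mood while still performing its functional task behavior, which is the premise of the study.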