Lilobot: A Cognitive Conversational Agent to Train Counsellors at Children’s Helplines
Design and Initial Evaluation
S.A. Grundmann (Student TU Delft)
Mohammed Al Owayyed (TU Delft - Interactive Intelligence, King Saud University)
Merijn Bruijnes (Universiteit Utrecht)
Ellen Vroonhof (Stichting de Kindertelefoon)
Willem Paul Brinkman (TU Delft - Interactive Intelligence)
Abstract
To equip new counsellors at a Dutch child helpline with the needed counselling skills, the helpline uses role-playing, a form of learning through simulation in which one counsellor-in-training portrays a child seeking help and the other portrays a counsellor. However, this process is time-intensive and logistically challenging, issues that a conversational agent could help address. In this paper, we propose an initial design for a computer agent that acts as a child help-seeker in a role-play setting. Our agent, Lilobot, is based on a Belief-Desire-Intention (BDI) model that simulates the reasoning process of a child who is being bullied at school. Through interaction with Lilobot, counsellors-in-training can practise the Five Phase Model, a conversation strategy that underpins the helpline’s counselling principle of keeping conversations child-centred. We compared a training session with Lilobot to a text-based training session, inviting experienced counsellors from the Dutch child helpline to participate in both, and conducted pre- and post-measurement comparisons for each. Contrary to our expectations, the results show a decrease in counselling self-efficacy at post-measurement, particularly in the Lilobot condition. Still, the counsellors’ qualitative feedback indicated that, with further development and refinement, Lilobot could serve as a useful supplementary tool for training new helpline counsellors. Our work also highlights three future research directions for training simulators in this domain: integrating emotions into the model, providing guided feedback to the counsellor, and incorporating Large Language Models (LLMs) into the conversations.
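For readers unfamiliar with the BDI pattern the abstract refers to, the sketch below illustrates the general belief-desire-intention loop in Python. It is a hypothetical illustration under our own assumptions, not Lilobot's actual implementation: the class, method names, and the "feels heard" rule are invented for this example.

```python
from dataclasses import dataclass, field

@dataclass
class BDIAgent:
    """Minimal BDI loop: beliefs drive desire selection, desires become intentions."""
    beliefs: dict = field(default_factory=dict)
    desires: list = field(default_factory=list)
    intentions: list = field(default_factory=list)

    def perceive(self, utterance_features: dict) -> None:
        # Update beliefs from features of the counsellor's utterance.
        self.beliefs.update(utterance_features)

    def deliberate(self) -> None:
        # Adopt as intentions the desires whose conditions hold given current beliefs.
        self.intentions = [d for d in self.desires if d["condition"](self.beliefs)]

    def act(self) -> str:
        # Pursue the first active intention; otherwise give a guarded, minimal reply.
        if self.intentions:
            return self.intentions[0]["action"](self.beliefs)
        return "..."

# Invented example rule: the child only shares details once it feels heard.
child = BDIAgent(
    beliefs={"feels_heard": False},
    desires=[{
        "condition": lambda b: b.get("feels_heard", False),
        "action": lambda b: "Some kids at school keep picking on me.",
    }],
)
child.perceive({"feels_heard": True})  # counsellor showed empathy
child.deliberate()
print(child.act())  # -> "Some kids at school keep picking on me."
```

The point of the pattern in this setting is that the simulated child's openness is a function of its beliefs about the conversation, so a counsellor-in-training only elicits disclosure by applying child-centred counselling behaviour rather than by asking the right scripted question.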