The Role of Emotion in Self-Explanations by Cognitive Agents

Conference Paper (2018)
Author(s)

Frank Kaptein (TU Delft - Interactive Intelligence)

D.J. Broekens (TU Delft - Interactive Intelligence)

K. Hindriks (TU Delft - Interactive Intelligence)

M.A. Neerincx (TNO, TU Delft - Interactive Intelligence)

Research Group
Interactive Intelligence
DOI
https://doi.org/10.1109/ACIIW.2017.8272595
Publication Year
2018
Language
English
Volume number
2018-January
Pages (from-to)
88-93
ISBN (electronic)
978-1-5386-0680-3

Abstract

Artificial Intelligence (AI) systems, including intelligent agents, are becoming increasingly complex. Explainable AI (XAI) is the capability of these systems to explain their behaviour in a manner understandable to humans. Cognitive agents, a type of intelligent agent, typically explain their actions with their beliefs and desires. However, humans also take their own and others' emotions into account in their explanations, and humans explain their emotions. We refer to using emotions in XAI as Emotion-aware eXplainable Artificial Intelligence (EXAI). Although EXAI should also include awareness of others' emotions, in this work we focus on how the simulation of emotions in cognitive agents can help them self-explain their behaviour. We argue that emotions simulated on the basis of cognitive appraisal theory enable (1) the explanation of these emotions, (2) their use as a heuristic to identify important beliefs and desires for the explanation, and (3) the use of emotion words in the explanations themselves.
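A minimal sketch of points (2) and (3) above: appraisal-derived emotion intensity is used as a heuristic to pick which belief and desire to include in a self-explanation, and the emotion word itself appears in the explanation. All names and formulas here (`Desire`, `Belief`, `appraise`, and the hope/fear intensities) are illustrative assumptions, not the model from the paper.

```python
from dataclasses import dataclass

@dataclass
class Desire:
    name: str
    utility: float       # how desirable achieving it is (0..1); assumed scale

@dataclass
class Belief:
    statement: str
    likelihood: float    # believed probability the desire is achieved (0..1)

def appraise(desire: Desire, belief: Belief) -> tuple[str, float]:
    # Toy appraisal rule (an assumption, not the paper's): hope scales
    # with utility * likelihood, fear with utility * (1 - likelihood).
    hope = desire.utility * belief.likelihood
    fear = desire.utility * (1.0 - belief.likelihood)
    return ("hope", hope) if hope >= fear else ("fear", fear)

def self_explain(pairs: list[tuple[Desire, Belief]]) -> str:
    # Heuristic (2): select the emotionally most intense desire/belief
    # pair; (3): phrase the explanation using the emotion word.
    emotion, intensity, desire, belief = max(
        ((*appraise(d, b), d, b) for d, b in pairs),
        key=lambda t: t[1],
    )
    return (f"I feel {emotion} (intensity {intensity:.2f}) because "
            f"I want to {desire.name} and I believe {belief.statement}.")

pairs = [
    (Desire("finish the task", 0.9), Belief("the plan is on schedule", 0.8)),
    (Desire("avoid errors", 0.6), Belief("the sensor data is reliable", 0.5)),
]
print(self_explain(pairs))
```

The point of the sketch is only the selection mechanism: out of many candidate beliefs and desires, the one with the strongest appraised emotion is surfaced in the explanation, so the emotion doubles as a relevance filter.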

Metadata only record. There are no files for this record.