Human-centered XAI

Developing design patterns for explanations of clinical decision support systems

Journal Article (2021)
Author(s)

Tjeerd Schoonderwoerd (TNO)

Wiard Jorritsma (TNO)

Mark Neerincx (TU Delft - Interactive Intelligence, TNO)

Karel van den Bosch (TNO)

Research Group
Interactive Intelligence
Copyright
© 2021 Tjeerd A.J. Schoonderwoerd, Wiard Jorritsma, M.A. Neerincx, Karel Van Den Bosch
DOI related publication
https://doi.org/10.1016/j.ijhcs.2021.102684
Publication Year
2021
Language
English
Volume number
154
Pages (from-to)
1-25
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Much of the research on eXplainable Artificial Intelligence (XAI) has centered on providing transparency of machine learning models. More recently, the focus on human-centered approaches to XAI has increased. Yet, there is a lack of practical methods and examples for integrating human factors into the development of AI-generated explanations that people can take up and use to improve their performance. This paper presents a case study of a human-centered design approach for AI-generated explanations. The approach consists of three components: Domain analysis to define the concept and context of explanations, Requirements elicitation and assessment to derive the use cases and explanation requirements, and Multi-modal interaction design and evaluation to create a library of design patterns for explanations. In the case study, we apply this DoReMi approach to design explanations for a Clinical Decision Support System (CDSS) for child health. In the requirements elicitation and assessment, a user study with experienced paediatricians uncovered which explanations the CDSS should provide. In the interaction design and evaluation, a second user study tested the resulting interaction design patterns. The case study yielded a first set of user requirements and design patterns for an explainable decision support system in medical diagnosis, showing how to involve expert end users in the development process and how to develop reasonably generic solutions to common design problems in XAI.