
The design and validation of an intuitive confidence measure

Publication files are not available online.

Author: Waa, J.S. van der · Diggelen, J. van · Neerincx, M.A.
Type: article
Date: 2018
Publisher: CEUR-WS
Source: Said, A., Komatsu, T., 2018 Joint ACM IUI Workshops, ACMIUI-WS 2018, 11 March 2018, Vol. 2068
Series: CEUR Workshop Proceedings
Identifier: 788264
DOI: 10.1145/1235
Keywords: Certainty · Confidence · Experiment · Explainability · ICM · Instance based · Lazy learning · Machine learning · Measure · User · Validation · Experiments · Learning systems · Numerical methods · User interfaces · Intelligent systems

Abstract

Explainable AI is becoming increasingly important as the use of intelligent systems becomes more widespread in high-risk domains. In these domains it is important that the user knows to what degree the system’s decisions can be trusted. To facilitate this, we present the Intuitive Confidence Measure (ICM): a lazy learning meta-model that predicts how likely a given decision is to be correct. ICM is intended to be easy to understand, which we validated in an experiment. We compared ICM with two other methods of computing confidence measures: the numerical output of the model and an actively learned meta-model. The validation was performed using a smart assistant for maritime professionals. Results show that ICM is easier to understand, but that each user is unique in their desire for explanations. This user study with domain experts shows what users need in their explanations and that personalization is crucial. © 2018
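
The paper itself defines ICM; purely as a rough illustration of the instance-based ("lazy learning") idea the abstract describes, the sketch below estimates confidence as the local accuracy of a model among the k nearest previously seen instances. The function name knn_confidence, the Euclidean distance, the value of k, and the synthetic data are illustrative assumptions, not the authors' actual ICM.

# Minimal, hypothetical sketch of an instance-based ("lazy learning")
# confidence estimate, in the spirit of what the abstract describes.
# It is not the paper's ICM: the distance metric, the neighbourhood size k,
# and the reference data below are illustrative assumptions only.
import numpy as np

def knn_confidence(x_new, X_ref, correct_ref, k=10):
    """Estimate how likely the underlying model's decision for x_new is correct.

    X_ref       : (n, d) array of previously seen instances
    correct_ref : (n,) array of 0/1 flags, 1 where the model's decision was correct
    Returns the fraction of the k nearest reference instances on which the
    model was correct, used here as a simple, intuitive confidence score.
    """
    dists = np.linalg.norm(X_ref - x_new, axis=1)   # Euclidean distance (assumption)
    nearest = np.argsort(dists)[:k]                 # indices of the k closest instances
    return float(np.mean(correct_ref[nearest]))     # local accuracy as confidence

# Usage sketch with synthetic data:
rng = np.random.default_rng(0)
X_ref = rng.normal(size=(200, 5))
correct_ref = rng.integers(0, 2, size=200)
print(knn_confidence(rng.normal(size=5), X_ref, correct_ref, k=15))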