Presenting XAI-generated Explanations Of Cricket Shots
G. Vitner (TU Delft - Electrical Engineering, Mathematics and Computer Science)
Ujwal Gadiraju – Mentor (TU Delft - Web Information Systems)
D. Zhan – Mentor (TU Delft - Web Information Systems)
Mark Neerincx – Graduation committee member (TU Delft - Interactive Intelligence)
Abstract
Explainable Artificial Intelligence (XAI) has the potential to enhance user understanding of and trust in AI systems, especially in domains where interpretability is crucial, such as cricket training. This study investigates the impact of different explanation formats on user experience in a cricket-specific context. Two prototypes were developed, each offering four explanation formats: textual, visual, rule-based, and mixed. The second prototype added interactive features to examine their influence on user experience and explanation effectiveness. A small-scale user study evaluated the explanations on satisfaction and trust. Results show that rule-based explanations were rated significantly lower on satisfaction than the other formats. Furthermore, the interactive features led to a significant increase in user trust, though they did not improve satisfaction. These findings highlight the importance of selecting appropriate explanation formats and the potential of interactive features to enhance trust in AI-generated explanations in a cricket-specific context.