Explainable Artificial Intelligence (XAI) has the potential to enhance user understanding of and trust in AI systems, especially in domains where interpretability is crucial, such as cricket training. This study investigates the impact of different explanation formats on user experience in a cricket-specific context. Two prototypes were developed, each offering four explanation formats: textual, visual, rule-based, and mixed. The second prototype added interactive features to examine their influence on user experience and explanation effectiveness. A small-scale user study evaluated the explanations on satisfaction and trust. Results show that rule-based explanations were rated significantly lower on satisfaction than the other three formats. Furthermore, the interactive features produced a significant increase in user trust, though they did not improve satisfaction. These findings highlight the importance of selecting appropriate explanation formats and the potential of interactivity to enhance trust in AI-generated explanations in a cricket-specific context.