Recent advancements in pose estimation, activity classification, and explainable artificial intelligence (XAI) have opened new opportunities in sports analytics. However, their combined application within the domain of cricket remains largely unexplored. This paper investigates the integration of XAI methods to interpret black-box models trained to classify cricket shot techniques using pose keypoints extracted from video data. I implement and compare several techniques, including SHAP, Grouped SHAP, feature importance, permutation importance, LIME, Grad-CAM, and Accumulated Local Effects (ALE) plots, assessing each on its ability to generate meaningful and interpretable explanations. Through a structured experimental pipeline involving MediaPipe-based keypoint extraction, random forest (RF) and convolutional neural network (CNN) classifiers, and multiple explanation methods, I evaluate which techniques most effectively highlight the body keypoints critical to accurate shot classification. The findings indicate that SHAP outperforms the other methods due to its ability to generate both local and global explanations, along with intuitive visualizations. This work contributes to the development of transparent sports AI systems and lays the foundation for future applications of interpretable machine learning in athletic coaching and skill assessment.
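As a rough illustration of the pipeline summarized above (MediaPipe keypoint extraction, a random forest shot classifier, and SHAP attributions), the sketch below shows one possible minimal implementation. The helper name extract_keypoints, the synthetic feature matrix, and the four-class label set are illustrative assumptions, not the paper's actual dataset or code.

```python
# Minimal sketch: pose keypoints -> random forest -> SHAP attributions.
# Synthetic data stands in for keypoints extracted from labelled shot videos.
import cv2
import numpy as np
import mediapipe as mp
import shap
from sklearn.ensemble import RandomForestClassifier

mp_pose = mp.solutions.pose

def extract_keypoints(image_bgr):
    """Flatten the 33 MediaPipe pose landmarks of one frame into an (x, y, visibility) vector."""
    with mp_pose.Pose(static_image_mode=True) as pose:
        result = pose.process(cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB))
    if result.pose_landmarks is None:
        return None  # no person detected in this frame
    return np.array([[lm.x, lm.y, lm.visibility]
                     for lm in result.pose_landmarks.landmark]).ravel()

# Placeholder feature matrix and labels (assumed prepared from video frames upstream).
rng = np.random.default_rng(0)
X = rng.random((200, 33 * 3))        # 33 landmarks x (x, y, visibility)
y = rng.integers(0, 4, size=200)     # four hypothetical shot classes

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

explainer = shap.TreeExplainer(rf)        # SHAP values for tree ensembles
shap_values = explainer.shap_values(X)    # per-class attribution for every keypoint feature
print(np.shape(shap_values))              # output layout varies with the installed shap version
```

In practice the per-feature SHAP attributions would be aggregated per body landmark (and visualized with SHAP's summary plots) to identify the keypoints that drive each shot classification, which is the comparison the paper carries out across explanation methods.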