Exploring the effect of information amount in explanations on different gaming expertise levels

Abstract

Explainable AI (XAI) has attracted growing research attention as a means of improving human interaction with AI systems. In the context of human-agent teamwork (HAT), giving an agent the ability to explain itself helps build shared team knowledge and beliefs, thereby improving overall teamwork. Among humans with varied backgrounds and characteristics, expert video gamers have been found to have superior perceptual and cognitive abilities. This study examines how the amount of information in explanations affects four factors, namely subjective workload, team performance, trust, and explanation satisfaction, across different expertise levels in human-agent teamwork. To investigate this research question, we designed a simulated search and rescue task with two types of explanations: one containing less detailed information and the other presenting more detailed information. After conducting the experiment with 42 participants, we first divided all participants into three expertise levels based on their self-reported gaming frequency and their score on a mock task in the tutorial. We then statistically analyzed the effects of information amount and expertise level on subjective workload, team performance, trust, explanation satisfaction, and activity level. In conclusion, we did not find evidence that adapting the amount of information in explanations to gaming expertise level improves the user experience during simulated search and rescue tasks. However, subjective workload was found to have a negative effect on explanation satisfaction. Future studies may investigate whether expert gamers require explanations with very detailed information in HAT.