Adaptive virtual reality based on eye-gaze behavior
Abstract
Virtual cognitions are simulated inner thoughts presented to the user as a voice-over. Previous research has shown that virtual cognitions can increase users' self-efficacy and knowledge. Having the VR system adapt to the user, for instance by using eye-gaze tracking to adjust the scenario based on what the user is looking at, can improve its efficacy. Previous research also identified the ownership and plausibility of the virtual cognitions as important measures. The research described in this thesis incorporates a social aspect into a gaze-adaptive VR system with virtual cognitions. A system was designed that places the user in a social VR scenario in which a dialogue between three virtual characters unfolds; the user watches and listens from the perspective of one of these characters. Eye gaze is measured using VR goggles with built-in eye-tracking cameras. An experiment was carried out over consecutive days, with a new social scenario introduced each day. The scenario of the day was shown twice: once gaze-adaptive and once non-adaptive. After each of the two viewings, participants filled in a questionnaire measuring ownership and plausibility. The results did not show a significant difference between the gaze-adaptive and non-adaptive scenarios in ownership or plausibility.