Immersive Virtual Environments for Upper-Limb Robotic Rehabilitation
Salvatore L. Cucinella (Erasmus MC, TU Delft - Human-Robot Interaction)
Job L.A. Mulder (Student TU Delft)
J.C.F. Winter (TU Delft - Human-Robot Interaction)
L. Marchal (Erasmus MC, TU Delft - Human-Robot Interaction)
Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.
Abstract
Neuroscientific evidence suggests that personalized, task-specific, high-intensity training is essential for maximizing recovery after acquired brain injury. Robotic devices combined with immersive virtual reality (VR) games, visualized through head-mounted displays (HMDs), can support such intensive training in naturalistic virtual environments whose audio-visual stimuli are tailored to individual needs. However, the impact of these auditory and visual demands on cognitive load remains an open question. To address this, we conducted an experiment with 22 healthy participants to explore how varying levels of visual, auditory, and cognitive demands affect users’ cognitive load and performance during a shopping task in immersive VR. We found that mental demand had the greatest impact, increasing cognitive load and hampering task performance. Visual demands, although they altered gaze behavior, did not significantly affect cognitive load or performance. Auditory demands had small effects on cognitive load.
Files
File under embargo until 27-08-2025