Exploring the dynamics of user experience and interaction in XR-enhanced robotic surgery
A systematic review
Yaning Li (Xi’an Jiaotong University, Politecnico di Milano)
Meng Li (Xi’an Jiaotong University)
Shucheng Zheng (Politecnico di Milano, Xi’an Jiaotong University)
Luxi Yang (Xi’an Jiaotong University, Politecnico di Milano)
Lanqing Peng (Xi’an Jiaotong University)
Chiyang Fu (Xi’an Jiaotong University)
Yuexi Chen (Politecnico di Milano, Xi’an Jiaotong University)
Chenxi Wang (Xi’an Jiaotong University, Politecnico di Milano)
D.J. van Eijk (TU Delft - Human Factors)
More Authors (External organisation)
Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.
Abstract
Robotic surgery, also known as robot-assisted surgery (RAS), has evolved rapidly over the last decade. RAS systems are developed to assist surgeons in performing complex minimally invasive surgeries and require augmented interfaces for the precise execution of these image-guided procedures. Extended Reality (XR) technologies, which augment real-world perception by integrating digital content, have shown promise in enhancing RAS efficacy across various studies. Despite multiple reviews of the technological and medical aspects, the crucial elements of human-robot interaction (HRI) and user experience (UX) remain underexplored. This review fills that gap by elucidating HRI dynamics within XR-aided RAS systems, emphasizing their impact on UX and overall surgical outcomes. By synthesizing the existing literature, this systematic review identifies challenges and opportunities, paving the way for improved XR-enhanced robotic surgery and, ultimately, better patient care and surgical performance.