Touchless interaction has recently gained considerable attention from both researchers and industry, and a variety of domains are interested in adopting this technology in their solutions. Medical visualization has a particular interest in it because of the sterile conditions in operating rooms. Exploration and detailed inspection of scanned objects are among the most common interactions performed by professionals, and these operations become more challenging when combined with touchless input. Context-aware methods exist that facilitate navigation, but they are designed for meshes rather than volume renderings. Hence the research question: can these methods be extended to volume renderings, and how well do they perform with touchless interaction metaphors? In this work, we present such a metaphor and the underlying VolCam algorithm. The metaphor allows users to perform exploration and inspection tasks on medical volume data using a touchless input device, the Leap Motion controller. VolCam, an extension of the ShellCam algorithm, automatically maps user input to distinct camera movements based on the current scene view by sampling the visible part of the volume. Interactive frame rates are achieved by performing the computations on the GPU. No pre-processing or specialized data structures are required, which makes the technique directly applicable to a wide range of volume datasets.
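The core idea summarized above, deriving a context-aware camera pivot by sampling the visible part of the volume, can be illustrated with a minimal CPU-side sketch. This is not the paper's GPU implementation of VolCam; the depth-buffer back-projection, the centroid pivot, and the `orbit` helper below are illustrative assumptions about how such a context-aware mapping could work.

```python
import numpy as np

def estimate_pivot(depth, inv_view_proj):
    """Back-project visible depth samples to world space and average them.

    `depth` is an HxW array of normalized depths in [0, 1]; background
    pixels hold 1.0. The centroid of the visible surface serves as a
    context-aware rotation pivot, in the spirit of ShellCam-style navigation.
    """
    h, w = depth.shape
    ys, xs = np.nonzero(depth < 1.0)          # visible (non-background) pixels
    if len(xs) == 0:
        return np.zeros(3)                    # nothing visible: fall back to origin
    # Normalized device coordinates in [-1, 1]
    ndc = np.stack([
        2.0 * xs / (w - 1) - 1.0,
        1.0 - 2.0 * ys / (h - 1),
        2.0 * depth[ys, xs] - 1.0,
        np.ones(len(xs)),
    ], axis=1)
    world = ndc @ inv_view_proj.T             # back-project to homogeneous world space
    world = world[:, :3] / world[:, 3:4]      # perspective divide
    return world.mean(axis=0)

def orbit(camera_pos, pivot, yaw, pitch):
    """Rotate the camera position around `pivot` by yaw/pitch angles (radians),
    e.g. angles derived from hand-motion deltas reported by the input device."""
    def rot_y(a):
        c, s = np.cos(a), np.sin(a)
        return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])
    def rot_x(a):
        c, s = np.cos(a), np.sin(a)
        return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])
    offset = camera_pos - pivot
    return pivot + rot_y(yaw) @ rot_x(pitch) @ offset
```

In a real pipeline this sampling would run on the GPU over the rendered volume each frame, so the pivot continuously adapts to what is currently visible rather than to a fixed object center.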