Augmented reality for laparoscopic surgery

Abstract

The transition from open surgery to laparoscopic procedures, motivated by improved surgical outcomes and lower overall costs, nevertheless limits the amount of information the surgeon has access to during surgery. To compensate for the reduced visual feedback and the loss of palpation, which are not possible through the laparoscope camera alone, research aims to generate additional intra-operative navigational aids that support the surgeon throughout complex procedures. The work presented in this thesis project aims to build an image-guided surgical system based on image registration between a 3D patient volume, such as a CT scan, and the perspective of a laparoscope camera. To generate this laparoscopic augmented reality, three main steps were implemented and tested: calibration of the camera, optical tracking of the laparoscope camera, and registration of the 3D CT volume to the 2D pose of the camera image. Camera calibration followed Zhengyou Zhang's mathematical calibration model, adapted to an ellipsoidal calibration plate. Optical tracking of the laparoscope camera made use of Philips' L2C optical tracking system, which is composed of a set of four cameras attached to the detector head of an Allura Clarity C-arm system; the same C-arm was used to generate an intra-operative CT scan of the volume. The registration then used pose estimation to match the 3D positions of a set of optical markers placed on the patient's skin to the 2D positions of the same markers in the camera's perspective.
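As a minimal sketch of the registration idea described above, matching known 3D marker positions to their 2D image projections can be posed as estimating a 3×4 camera projection matrix from point correspondences. The Direct Linear Transform (DLT) shown below is one standard formulation of such pose estimation, not necessarily the exact algorithm used in this thesis; the synthetic camera and marker values are purely illustrative:

```python
import numpy as np

def dlt_projection_matrix(X, x):
    """Estimate a 3x4 projection matrix P such that, for each
    correspondence, the 2D point x is the projection of the 3D
    point X (up to scale): [u, v, 1]^T ~ P [X, Y, Z, 1]^T."""
    A = []
    for (Xw, Yw, Zw), (u, v) in zip(X, x):
        Xh = [Xw, Yw, Zw, 1.0]
        # Each correspondence yields two linear equations in the
        # 12 entries of P.
        A.append([*Xh, 0, 0, 0, 0, *(-u * c for c in Xh)])
        A.append([0, 0, 0, 0, *Xh, *(-v * c for c in Xh)])
    # The solution is the right singular vector with the smallest
    # singular value of the stacked system.
    _, _, Vt = np.linalg.svd(np.asarray(A))
    P = Vt[-1].reshape(3, 4)
    return P / P[-1, -1]

# Synthetic check: project random "markers" with a known camera,
# then recover the projection matrix from the correspondences.
rng = np.random.default_rng(0)
K = np.array([[800.0, 0.0, 320.0],   # illustrative intrinsics
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.array([[0.1], [-0.2], [2.0]])
P_true = K @ np.hstack([R, t])

X = rng.uniform(-1, 1, (8, 3))          # 8 marker positions
Xh = np.hstack([X, np.ones((8, 1))])
proj = (P_true @ Xh.T).T
x = proj[:, :2] / proj[:, 2:3]          # 2D image observations

P_est = dlt_projection_matrix(X, x)
reproj = (P_est @ Xh.T).T
x_est = reproj[:, :2] / reproj[:, 2:3]
err = np.max(np.abs(x_est - x))         # reprojection error (pixels)
```

In the noise-free synthetic case the reprojection error is negligible; with real marker detections, the estimated matrix would typically be refined further (e.g. by nonlinear reprojection-error minimization) before decomposing it into camera pose.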