AI-Powered Tumor Detection, Segmentation & 3D Visualization on Intraoperative Ultrasound in Robotic Colorectal Cancer Surgery

Master's Thesis (2025)
Author(s)

M.S. Wieringa (TU Delft - Mechanical Engineering)

Contributor(s)

Freija Geldof – Mentor (Nederlands Kanker Instituut - Antoni van Leeuwenhoek ziekenhuis)

Jifke Veenland – Mentor (Erasmus MC)

Faculty
Mechanical Engineering
Publication Year
2025
Language
English
Graduation Date
11-09-2025
Awarding Institution
Delft University of Technology
Programme
Technical Medicine
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Introduction
Surgical resection for colorectal cancer (CRC) requires achieving an adequate distal resection margin (DRM). This goal is challenging in robot-assisted surgery due to the lack of tactile feedback and a limited field of view. Intraoperative ultrasound (ioUS) provides real-time imaging but can be difficult to interpret. This thesis developed and evaluated an AI-powered pipeline for automatic tumor localization on ioUS to support surgeons in DRM assessment. A parallel study assessed the feasibility of 3D tumor reconstruction from mechanically tracked ex vivo sweeps.
Methods
A dataset of over 16,000 US frames from 31 CRC patients was collected and annotated, both in vivo during robotic surgery and ex vivo from resected specimens. Two deep learning architectures were trained and compared: a task-specific segmentation model (YOLO11-seg) and an adapted foundation model (PLM). Performance was evaluated using metrics for detection (sensitivity, specificity) and localization (Dice/IoU). For the 3D feasibility study, a Haply Inverse3 device was used to track the US probe during an ex vivo scan, allowing for volumetric reconstruction.
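For reference, the localization metrics named above (Dice and IoU) follow their standard definitions on binary segmentation masks; the thesis's exact evaluation code is not given here, so the following is a minimal illustrative sketch using NumPy:

```python
import numpy as np

def dice_and_iou(pred: np.ndarray, gt: np.ndarray) -> tuple[float, float]:
    """Standard overlap metrics between a predicted and a ground-truth binary mask.

    Dice = 2|A∩B| / (|A|+|B|),  IoU = |A∩B| / |A∪B|.
    """
    pred, gt = pred.astype(bool), gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    total = pred.sum() + gt.sum()
    if total == 0:
        # Both masks empty (tumor-free frame): define perfect agreement.
        return 1.0, 1.0
    return 2.0 * intersection / total, intersection / union

# Toy 4x4 example: 3 predicted pixels, 3 ground-truth pixels, 2 overlapping.
pred = np.zeros((4, 4), dtype=np.uint8)
gt = np.zeros((4, 4), dtype=np.uint8)
pred[0, 0:3] = 1
gt[0, 1:4] = 1
dice, iou = dice_and_iou(pred, gt)
# dice = 2*2/6 ≈ 0.667, iou = 2/4 = 0.5
```

Note that Dice and IoU are monotonically related (Dice = 2·IoU / (1 + IoU)), which is why the YOLO11-seg and PLM results below are comparable despite being reported on different scales.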
Results
The final YOLO11-seg model achieved a Dice score of 0.75, with a sensitivity of 0.74, a specificity of 0.93, and proved highly responsive with a median detection latency of only 1.0 frame (100 ms) over all recorded validation sweeps. The PLM model achieved an IoU of 0.58, a sensitivity of 0.75, and a specificity of 0.98. A key finding was that the primary performance limitation, a slight deficit in sensitivity, was systematically concentrated in a small subset of visually challenging outlier cases. The 3D reconstruction workflow was successfully validated, with a reconstructed tumor length comparable to pathological results.
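The reconstruction step above relies on knowing each frame's pose from the mechanical tracker. The thesis does not specify the reconstruction algorithm used, but a minimal nearest-voxel insertion sketch, assuming calibrated 4×4 rigid probe poses (image plane to world, in mm) and a known pixel spacing, looks like this:

```python
import numpy as np

def insert_frames(frames, poses, spacing_mm, voxel_mm, grid_shape):
    """Nearest-voxel insertion of tracked 2D US frames into a 3D volume.

    frames: list of (H, W) grayscale images
    poses:  list of 4x4 rigid transforms mapping image-plane mm -> world mm
            (hypothetical calibration; the thesis's tracking pipeline may differ)
    """
    vol = np.zeros(grid_shape, dtype=np.float32)
    cnt = np.zeros(grid_shape, dtype=np.float32)
    for img, T in zip(frames, poses):
        h, w = img.shape
        ys, xs = np.mgrid[0:h, 0:w]
        # Pixel -> mm coordinates in the image plane (z = 0), homogeneous.
        pts = np.stack([xs * spacing_mm, ys * spacing_mm,
                        np.zeros_like(xs, dtype=float),
                        np.ones_like(xs, dtype=float)], axis=-1).reshape(-1, 4)
        world = (T @ pts.T).T[:, :3]
        idx = np.round(world / voxel_mm).astype(int)
        valid = np.all((idx >= 0) & (idx < np.array(grid_shape)), axis=1)
        idx, vals = idx[valid], img.reshape(-1).astype(np.float32)[valid]
        # Accumulate intensities and hit counts, then average overlapping frames.
        np.add.at(vol, tuple(idx.T), vals)
        np.add.at(cnt, tuple(idx.T), 1.0)
    return np.divide(vol, cnt, out=np.zeros_like(vol), where=cnt > 0)
```

Averaging overlapping samples (rather than overwriting) is a common choice for freehand-US compounding; a production pipeline would typically add hole filling between sparsely spaced frames.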
Conclusion
This thesis demonstrated that real-time, AI-powered assistance for tumor localization in robotic CRC surgery is a feasible objective. The developed models provide a reliable “second opinion” by achieving a functional balance: a sensitivity of 0.75 is sufficient for confirmatory use, while high specificity ensures alerts are trustworthy. With a Dice score approaching the lower bound of reported human expert agreement, this work establishes a practical 2D guidance tool. To our knowledge, this is the first study to collect intraoperative US data for this purpose and the first to develop an AI-based system for real-time DRM assessment in robotic CRC surgery. The path forward requires a focus on increasing dataset diversity to further improve sensitivity and refining the 3D tracking hardware to enable future applications.

Files

License info not available

File under embargo until 07-09-2026