Enhance Real-time Surgery Planning and Navigation with Mixed Reality Technology

Nov 21, 2025

We are proud to announce that our research group has secured the MBIE Smart Ideas Grant for the project: "Enhance Real-time Surgery Planning and Navigation with Mixed Reality Technology". This research initiative, running from 2024 to 2027, aims to revolutionize breast cancer surgery by addressing the critical challenge of soft tissue deformation.

The Challenge: Navigating Deformable Tissue

In current surgical practice, surgeons face significant difficulties locating tumors in soft tissue such as the breast, because the tissue shape changes drastically between pre-operative imaging (e.g., MRI) and the operating table. Surgeons often rely on mental approximations or invasive physical markers, which can lead to re-operations and increased patient anxiety.

Our Solution: Markerless Tracking & Point Cloud Reconstruction

Our funded solution integrates Artificial Intelligence (AI) with Mixed Reality (MR) to create a system that "sees" and understands tissue changes in real time. Using depth cameras, we are developing a markerless tracking system capable of generating high-fidelity 3D point clouds of the surgical site.
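The core geometric step behind depth-camera point cloud generation is back-projecting each depth pixel into 3D using the camera's intrinsics. The sketch below is illustrative only: it assumes a simple pinhole camera model with made-up intrinsic parameters, not the project's actual sensors or calibration.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (in meters) into a 3D point cloud
    using pinhole intrinsics (fx, fy, cx, cy). Illustrative sketch --
    the real pipeline's camera model and calibration are project-specific."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop pixels with no depth reading

# Toy example: a flat 4x4 depth map 0.5 m in front of the camera
depth = np.full((4, 4), 0.5)
cloud = depth_to_point_cloud(depth, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
print(cloud.shape)  # (16, 3)
```

In practice a library such as Open3D performs this conversion (with RGB fusion, filtering, and registration on top), but the underlying math is the back-projection shown here.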

Technical Milestone: RGB Segmentation & Point Cloud Generation

We have achieved a key technical milestone in establishing our real-time pipeline: robust RGB segmentation and high-fidelity point cloud generation.

  1. RGB Segmentation & AI Detection: Our deep learning models now perform precise RGB segmentation, identifying and bounding specific anatomical features (such as the nipple) in the RGB images with high confidence.

  2. High-Fidelity Point Cloud Generation: We have successfully reconstructed dense 3D point clouds of breast phantoms. This digital reconstruction captures the geometry of soft tissue in 3D space, serving as the foundation for our "digital twin" technology.
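The two milestone steps combine naturally: the RGB segmentation mask selects which depth pixels contribute to the point cloud of a given anatomical feature. The sketch below shows this fusion under an assumed pinhole camera model with hypothetical intrinsics; it is not the project's actual code.

```python
import numpy as np

def segmented_point_cloud(depth, mask, fx, fy, cx, cy):
    """Keep only the 3D points belonging to a segmented feature
    (e.g. the nipple): look up depth at the masked pixels and
    back-project them. Assumes pinhole intrinsics (fx, fy, cx, cy)."""
    v, u = np.nonzero(mask)      # pixel coordinates inside the mask
    z = depth[v, u]
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1)
    return pts[z > 0]            # drop masked pixels with no depth

# Toy example: a 2x2 segmented patch on a 6x6 depth map 0.4 m away
depth = np.full((6, 6), 0.4)
mask = np.zeros((6, 6), dtype=bool)
mask[2:4, 2:4] = True
region = segmented_point_cloud(depth, mask, 400.0, 400.0, 3.0, 3.0)
print(region.shape)  # (4, 3)
```

The resulting per-feature point cloud is the kind of geometric primitive a digital twin can track as the tissue deforms.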

Future Impact

By superimposing tumor positions directly onto the patient's body via head-mounted displays, this system aims to improve surgical precision and reduce the need for re-operations. We estimate this technology could save the New Zealand healthcare system approximately $86.6 million annually while significantly improving patient outcomes.