Augmented reality-based digital twin to control an MRI-compatible robot

Finished: 2024-01-31

MSc assignment

In breast cancer screening, suspicious lesions may be found that need to be biopsied to check for malignancy. In the current gold standard this is done manually under MR image guidance, which requires moving the patient table out of the MRI scanner due to accessibility constraints. When doing so, deformations of the soft tissue due to, e.g., breathing, muscle contractions and needle-tissue interactions significantly limit the targeting precision. Multiple biopsy samples may need to be taken, resulting in significant tissue damage, and sometimes the lesion is missed altogether, requiring an additional biopsy or potentially yielding a false-negative result.

Within the RaM group at the University of Twente, an MR-compatible robotic platform has been developed to enable in-bore breast biopsies under MR image guidance. The Sunram 7 is the latest prototype with five degrees of freedom, actuated by MR-safe pneumatic stepper motors developed at RaM.

During a robotic biopsy procedure, the physician who operates the robot is in the MR control room, which makes it difficult to observe the pose of the physical robot with respect to the patient. The current approach to controlling the robot is to acquire pre-operative images, position the robot, and then reacquire images in a back-and-forth manner, which impedes effective interaction between the physician and the robot and thus limits the potential of this technology.

Augmented reality (AR) technology offers new possibilities to visualize and control surgical procedures. Examples can already be found in the clinic, where pre-operative images are fused with the surgical scene during the procedure for guidance. In the current setting, AR could combine anatomical tracking with MRI and robotic manipulation by projecting a virtual representation of both the patient and the robot during MR-guided biopsy procedures.
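One way such a digital twin could be kept consistent across coordinate frames is sketched below. This is a minimal, hypothetical Python example, not part of the actual Sunram 7 software: the transform values, frame names and numpy-based structure are assumptions. It maps a lesion position from MRI scanner coordinates into the AR world frame via a rigid homogeneous transform obtained from a registration step.

```python
import numpy as np

def make_rigid_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical registration result: pose of the MRI scanner frame
# expressed in the AR headset's world frame (values are placeholders).
T_world_mri = make_rigid_transform(np.eye(3), np.array([0.40, -0.10, 0.75]))

def mri_to_ar(point_mri, T=T_world_mri):
    """Transform a 3D point from MRI coordinates (metres) to AR world coordinates."""
    p = np.append(point_mri, 1.0)   # homogeneous coordinates
    return (T @ p)[:3]

# Example: a lesion segmented at (x, y, z) in the scanner frame.
lesion_mri = np.array([0.012, -0.034, 0.101])
print(mri_to_ar(lesion_mri))        # where to render the lesion in the AR scene
```

The same transform could be applied to the robot model and the patient anatomy so that all elements of the digital twin are rendered in one shared frame.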

In addition, AR may be linked with the robot control panel to translate control commands into intuitive AR cues such as arrows or a planned trajectory.
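A minimal sketch of how such a cue could be derived, assuming the registered positions of the needle tip and the target are already available as 3D points in a common frame; the function and variable names are illustrative and not part of any existing control software:

```python
import numpy as np

def needle_cue(tip, target):
    """Compute an AR guidance cue: the unit direction from needle tip to target
    and the remaining insertion distance (both in the shared AR/robot frame)."""
    vector = np.asarray(target, dtype=float) - np.asarray(tip, dtype=float)
    distance = np.linalg.norm(vector)
    if distance < 1e-9:                 # tip already at the target
        return np.zeros(3), 0.0
    return vector / distance, distance

# Example: current needle tip and lesion target, in metres.
direction, remaining = needle_cue([0.10, 0.02, 0.05], [0.13, 0.02, 0.08])
print(direction, remaining)  # draw an arrow along `direction`, scaled by `remaining`
```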

Therefore, the goal of this MSc assignment is to develop an intuitive AR-based navigation framework for MRI-compatible robots in breast biopsy. The platform should visualize the target lesion, the surrounding anatomy and the robotic system.

This project will be co-supervised by Dr. Vincent Groenhuis (RaM) and Dr. Wyger Brink (Magnetic Detection & Imaging, TechMed Centre).