Magnetic Resonance Imaging (MRI)-guided breast biopsies, integral to breast cancer diagnosis, require multiple preparatory and control scans and are susceptible to human error. Robotic interventions offer the potential to accelerate these procedures and enhance precision. Given the involvement of human subjects, operators must retain the capacity to intervene at any point. Teleoperating these robotic systems may prove non-intuitive, requiring operators to familiarize themselves with the underlying technology. Practical and intuitive interfaces are therefore essential to keep operators in the loop. The emergent technologies of Mixed Reality (MR) and Head-Mounted Displays (HMDs) present an immersive solution for human-robot interaction and teleoperation.
This study introduces a comprehensive Augmented Reality (AR) application integrated with MRI capabilities. Leveraging hand gestures, voice commands, and intuitive interfaces, operators can effectively engage with the robot to perform the biopsy. Features such as auto-targeting and path planning provide crucial support, ensuring a high degree of accuracy and a high success rate. Initial experiments within an MRI environment demonstrate the efficacy of these features, yielding accuracy comparable to the state of the art. Subsequent efforts are required to refine the accuracy of breast-to-robot position calibration from MRI images. Additionally, integrating real-time MRI scan feedback on needle position, as opposed to the current AR-based feedback, requires further investigation.