Minimally Invasive Surgery (MIS) has become more prominent in recent years because it requires smaller incisions than open surgery, which reduces healing time and the associated pain. The procedure usually requires an assistant to move and control the endoscope while the surgeon performs the operation. A head-motion-controlled endoscope system would allow the surgeon to move and control the endoscopic camera directly.
In the previously developed system it was observed that the motor could rotate while the image output remained unchanged, revealing nonlinearities in the system. The goal of this project is to compensate for these nonlinearities in the motor using a sensor fusion approach. The system consists of an Inertial Measurement Unit (IMU), which measures the surgeon's head motions as input, and a Microsoft HoloLens mixed-reality headset, which provides the visual output; a standard display monitor serves as an additional visual output. The rotation and translation data of both the image and the motor are fused to develop the sensor fusion algorithm, and the fused data are then validated on the system. The complete interface of the system is built in the Robot Operating System (ROS) environment.
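The fusion step described above could, in its simplest form, be sketched as a complementary filter that blends the motor-reported rotation with the image-derived rotation. This is a minimal illustrative sketch only; the function name, the scalar-angle representation, and the weighting constant `alpha` are assumptions and do not come from the actual system, which fuses full rotation and translation data within ROS.

```python
# Minimal sketch (assumed, not the system's actual algorithm):
# a complementary filter blending two rotation estimates.
def fuse_rotation(motor_angle_deg, image_angle_deg, alpha=0.8):
    """Weighted blend of the motor-encoder angle and the
    image-derived angle, both in degrees.

    alpha close to 1.0 trusts the motor reading more; a lower
    alpha leans on the image-based estimate, which can reveal
    cases where the motor rotates but the image does not follow.
    """
    return alpha * motor_angle_deg + (1.0 - alpha) * image_angle_deg

# Example: motor reports 10.0 deg while the image-based
# estimate reports only 8.0 deg (a nonlinearity symptom).
fused = fuse_rotation(10.0, 8.0)
```

In a ROS implementation, the two estimates would typically arrive as messages on separate topics and the fused value would be republished for the control loop; the scalar blend above only illustrates the idea.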