HoloMotion: Control Virtual Prosthetics via Biological Signals

MSc assignment

Introduction:
Mixed-reality (MR) devices offer unique opportunities for enhancing the user experience in fields such as prosthetics, rehabilitation, and human augmentation. By integrating MR technologies with biological signal measurements, a novel approach to human-machine interaction can be developed. This thesis project aims to create an MR system in which a user—either an amputee or an able-bodied individual—controls a virtual 3D arm displayed through a holographic device (e.g., AR glasses), driven by their biological signals recorded via surface electromyography (sEMG) or other sensors. This virtual arm will not only act as a realistic prosthesis but could also be adapted to function as an additional body part, such as a sixth finger or a tail, enabling advanced human augmentation.

Objectives:
The goal of this MSc assignment is to design a deep-learning algorithm that allows the user to drive a virtual 3D body part (arm, finger, etc.) in a holographic device via their biological signals. Multiple sensors should be used to record muscle activity and capture the user's motor intention. The student will use these signals to control the virtual body part via the proposed deep-learning algorithm.
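As a concrete illustration of the kind of model this assignment asks for, the sketch below maps a short window of multi-channel sEMG to the joint angles of the virtual body part using a small 1D convolutional network in PyTorch. The channel count, window length, and number of joints are assumptions chosen only for illustration; the actual sensor setup, control targets, and network architecture are to be defined during the project.

```python
# Minimal sketch (not a definitive design): a regressor that maps a window of
# multi-channel sEMG to joint angles of the virtual body part. The channel
# count (8), window length (200 samples), and number of joints (5) are
# illustrative assumptions, not project requirements.
import torch
import torch.nn as nn


class EMGToJointAngles(nn.Module):
    """1D-CNN regressor: (batch, channels, samples) -> (batch, n_joints)."""

    def __init__(self, n_channels: int = 8, n_joints: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # collapse the time axis
        )
        self.head = nn.Linear(64, n_joints)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).squeeze(-1))


if __name__ == "__main__":
    model = EMGToJointAngles()
    window = torch.randn(1, 8, 200)   # one 200-sample, 8-channel sEMG window
    print(model(window).shape)        # torch.Size([1, 5]) predicted joint angles
```

In a real-time setting, such a model would be evaluated on each incoming signal window, and the predicted joint angles would be streamed to the holographic rendering of the virtual body part.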

Research Questions:
The research questions guiding this project include:

  1. How can the virtual body part be controlled in the holographic device?
  2. How can the acquired biological signals be pre-processed, integrated, and fed into the MR system? (A pre-processing sketch follows this list.)
  3. How can a feasible, real-time deep-learning algorithm be designed to interpret the collected biological signals for control?
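Regarding question 2, the sketch below shows one common pre-processing chain for raw sEMG (band-pass filtering, power-line notch filtering, rectification, and windowing), implemented with NumPy/SciPy. The sampling rate, filter settings, and window length are illustrative assumptions drawn from typical sEMG practice, not fixed project parameters.

```python
# Minimal pre-processing sketch, assuming raw sEMG sampled at 2 kHz; the
# cut-off frequencies, 50 Hz notch, and 200 ms windows are common choices in
# the sEMG literature, not project requirements.
import numpy as np
from scipy.signal import butter, iirnotch, filtfilt


def preprocess_emg(raw: np.ndarray, fs: float = 2000.0) -> np.ndarray:
    """Band-pass, notch-filter, and rectify raw sEMG (channels x samples)."""
    # 20-450 Hz band-pass keeps the typical sEMG frequency band
    b_bp, a_bp = butter(4, [20, 450], btype="bandpass", fs=fs)
    x = filtfilt(b_bp, a_bp, raw, axis=-1)
    # 50 Hz notch suppresses power-line interference
    b_n, a_n = iirnotch(50.0, Q=30.0, fs=fs)
    x = filtfilt(b_n, a_n, x, axis=-1)
    return np.abs(x)  # full-wave rectification


def window_emg(x: np.ndarray, fs: float = 2000.0,
               win_ms: float = 200.0, step_ms: float = 50.0) -> np.ndarray:
    """Slice (channels x samples) into overlapping windows for the model."""
    win, step = int(fs * win_ms / 1000), int(fs * step_ms / 1000)
    starts = range(0, x.shape[-1] - win + 1, step)
    return np.stack([x[..., s:s + win] for s in starts])  # (n_windows, channels, win)


if __name__ == "__main__":
    raw = np.random.randn(8, 4000)           # 2 s of simulated 8-channel sEMG
    windows = window_emg(preprocess_emg(raw))
    print(windows.shape)                     # (37, 8, 400)
```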