Upper body teleoperation of a humanoid robot (avatar)

As part of the ANA Avatar XPRIZE competition, the i-Botics group is developing a telerobotic system to operate the humanoid robot EVE and give the human operator the feeling of being present in a different environment. Telerobotic systems enable controlling robots remotely, transferring the human operator's physical and mental capabilities to a geographically different location. This graduation project aims to enable controlling the humanoid robot's upper body posture over distance and in real time. The project also includes research into how upper body posture communicates information in social contexts.

After background research and analysis of suitable motion capture hardware, the robot's motion capabilities, different motion mapping strategies, and ways of conveying human-like and social behaviour in telerobotics, concepts were developed and realized. Two motion mapping approaches were developed and assessed: a data-driven approach using an Adaptive Neuro-Fuzzy Inference System (ANFIS), and direct angle mapping, which applies the rotational position of the human chest to the hip joints of the robot. An Xsens suit was used for motion capture. The literature research showed that secondary actions, such as breathing, are a valuable addition for conveying human-like behaviour in telerobotics; a breathing animation was therefore created and combined with the motion mapping algorithm.
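The direct angle mapping idea can be illustrated with a short sketch: measured chest orientation angles are applied one-to-one to the robot's hip joints, clamped to the joint limits. The limit values and axis names below are illustrative assumptions, not EVE's actual specification:

```python
import math

# Hypothetical hip joint limits in radians; the real values would come
# from the robot's specification, not from this sketch.
HIP_LIMITS = {
    "pitch": (-0.5, 0.8),
    "roll": (-0.4, 0.4),
    "yaw": (-1.0, 1.0),
}

def clamp(value, lo, hi):
    """Restrict a joint target to the robot's mechanical range."""
    return max(lo, min(hi, value))

def map_chest_to_hip(chest_angles):
    """Directly map the measured chest orientation (radians) onto
    hip joint targets, clamped to the joint limits."""
    return {axis: clamp(chest_angles[axis], *HIP_LIMITS[axis])
            for axis in HIP_LIMITS}

# Example: the operator leans forward 30 degrees and twists slightly.
targets = map_chest_to_hip({"pitch": math.radians(30),
                            "roll": 0.0,
                            "yaw": math.radians(10)})
```

The appeal of this approach, as the comparison below reflects, is its low complexity: no training data is needed, only the kinematic correspondence between chest and hip axes.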

The performance of the two motion mapping algorithms is compared based on simulation results and plots of human orientation versus produced robot orientation. The direct angle mapping approach is less complex and also performs slightly better than the ANFIS approach, although there is no visually perceptible difference between the two. The breathing animation is tested in integration with the motion mapping algorithm and performs decently, but has the pitfall of blocking real-time motion mapping.
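One way to avoid the blocking pitfall would be to treat breathing as an additive offset evaluated on every control tick, rather than as a separate animation loop that takes over the joint. The sketch below illustrates this idea; the period and amplitude values are assumptions for illustration, not the values used in the project:

```python
import math

BREATH_PERIOD = 4.0      # seconds per breath cycle (assumed value)
BREATH_AMPLITUDE = 0.03  # radians of pitch oscillation (assumed value)

def breathing_offset(t):
    """Small sinusoidal pitch offset emulating a breathing motion."""
    return BREATH_AMPLITUDE * math.sin(2.0 * math.pi * t / BREATH_PERIOD)

def hip_pitch_target(mapped_pitch, t):
    """Combine the motion-mapped pitch with the breathing overlay at
    each control tick, so breathing never blocks real-time mapping."""
    return mapped_pitch + breathing_offset(t)
```

Because the overlay is just an addition inside the regular control loop, the operator's motion and the breathing motion are superimposed instead of one blocking the other.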

As future work, it is suggested in particular to conduct user tests evaluating the effectiveness of the breathing animation and to integrate the developed motion mapping into the existing system of the i-Botics group.

To join the presentation via Microsoft Teams, click here.