IRIS: Creating an Augmented Reality Interface for Human-Robot Communication in Urban Search and Rescue

BSc assignment

Augmented Reality (AR) is increasingly being adopted in Search and Rescue (S&R) applications, as it enables real-time information to be displayed as an overlay on the user's field of view. It is often combined with robotics, since both ground and aerial robots can provide additional information about a scene. Current research explores various ways of applying this, such as sharing information between a remote team controlling the robots and an on-site team (Nalamothu et al., 2024), sharing information between multiple users (Xu et al., 2024), and recording data and offering additional visualisation options for ground operators (Luksas et al., 2022). However, existing approaches predominantly rely on remote operators. This research aims to develop a low-infrastructure system that transmits real-time data from the robot to the human operator while both are in the field.

This project focuses on robot-user communication using AR. The goal is to create an AR application that displays information collected by the robot, such as a real-time camera feed, to the user; in turn, the user can control the robot through this application. The robot in question will be a ground vehicle.

To achieve this, a RELbot from the Robotics and Mechatronics (RaM) group at the University of Twente will be used as the robot platform, along with AR glasses, most likely the Microsoft HoloLens or HoloLens 2, which can be provided by ITC. Initial research suggests that development is possible in either Unity or Unreal Engine, and that testing can be done both on an emulator and on a physical device. Communication between the robot and the AR application will be implemented using the Robot Operating System 2 (ROS 2), which the RELbot already runs.