Towards a modular sensor-independent active SLAM framework

Simultaneous Localization and Mapping (SLAM) is the process by which a mobile robot estimates its pose and builds a map of its environment at the same time. Over the years, substantial research has gone into improving the computational efficiency and robustness of SLAM algorithms. This has led to SLAM implementations tailored to specific algorithms and sensor setups, with limited reusability and modularity. The effort required to create a new or modified SLAM implementation (for instance, with a different sensor setup, algorithm, or operating environment) is therefore often large, and implementation-specific optimizations frequently result in sub-optimal realizations for other setups.

This work aims to contribute to the widespread practical adoption of SLAM by investigating the interfaces between sensors and SLAM algorithms in order to derive a generalized sensor back-end interface. This is done by analyzing the types of idiothetic (self-motion) and allothetic (environment-referenced) sensors relevant to SLAM, as well as how their measurements are used by both filter-based and graph-based SLAM algorithms. The resulting interface provides insight into how sensors and SLAM algorithms influence the modularity of a SLAM system, and it is used to develop a proof-of-concept framework that demonstrates modularity in both sensor inclusion and algorithm choice.
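The shape of such a generalized sensor back-end interface can be illustrated with a minimal sketch. All class and method names below are hypothetical illustrations, not the interface actually presented in this work: sensors of either category (idiothetic, e.g. wheel odometry; allothetic, e.g. a range finder) emit timestamped measurements through a common abstract interface, and any back-end (filter-based or graph-based) consumes them through a single `process` method, so sensors and algorithms can be swapped independently.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass, field
from typing import List, Sequence, Tuple


@dataclass
class Measurement:
    """Generic timestamped measurement passed across the interface."""
    timestamp: float
    data: Tuple[float, ...]


class Sensor(ABC):
    """Common interface every sensor implements, regardless of type."""
    @abstractmethod
    def read(self) -> Measurement: ...


class IdiotheticSensor(Sensor):
    """Self-motion sensors (odometry, IMU): measure the robot's own movement."""


class AllotheticSensor(Sensor):
    """Environment-referenced sensors (lidar, camera): observe landmarks."""


class WheelOdometry(IdiotheticSensor):
    """Toy odometry source reporting a fixed (forward, turn) increment."""
    def __init__(self) -> None:
        self._t = 0.0

    def read(self) -> Measurement:
        self._t += 1.0
        return Measurement(self._t, (0.1, 0.0))


class RangeFinder(AllotheticSensor):
    """Toy range sensor reporting a fixed distance to a landmark."""
    def __init__(self) -> None:
        self._t = 0.0

    def read(self) -> Measurement:
        self._t += 1.0
        return Measurement(self._t, (2.5,))


class SlamBackend(ABC):
    """Algorithm-side half of the interface: consumes measurements."""
    @abstractmethod
    def process(self, m: Measurement) -> None: ...


@dataclass
class GraphBackend(SlamBackend):
    """Stand-in for a graph-based back-end: stores measurements as factors."""
    factors: List[Measurement] = field(default_factory=list)

    def process(self, m: Measurement) -> None:
        self.factors.append(m)


def run(sensors: Sequence[Sensor], backend: SlamBackend, steps: int = 3) -> SlamBackend:
    """Drive any sensor set into any back-end through the shared interface."""
    for _ in range(steps):
        for s in sensors:
            backend.process(s.read())
    return backend
```

In this sketch, adding a sensor means subclassing `Sensor`, and swapping the algorithm means supplying a different `SlamBackend`; neither side needs to know the other's concrete type, which is the kind of decoupling the generalized interface is meant to provide.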

Simulations are used to demonstrate both the sensor and the algorithm modularity.

BlueJeans videoconference join information:

Meeting URL

Meeting ID
644 077 018

Want to dial in from a phone?
Dial one of the following numbers:
+31.20.808.2256 (Netherlands (Amsterdam))
(see all numbers -

Enter the meeting ID and passcode followed by #