Abstract
While dense visual SLAM methods are capable of estimating dense reconstructions of the environment, they suffer from a lack of robustness in their tracking step, especially when the optimisation is poorly initialised. Sparse visual SLAM systems have attained high levels of accuracy and robustness through the inclusion of inertial measurements in a tightly-coupled fusion. Inspired by this performance, we propose the first tightly-coupled dense RGB-D-inertial SLAM system. Our system runs in real time on a GPU. It jointly optimises for the camera pose, velocity, IMU biases and gravity direction while building up a globally consistent, fully dense surfel-based 3D reconstruction of the environment. Through a series of experiments on both synthetic and real-world datasets, we show that our dense visual-inertial SLAM system is more robust to fast motions and to periods of low texture and low geometric variation than a related RGB-D-only SLAM system.
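To give a rough sense of the joint optimisation described in the abstract, the sketch below is a minimal illustration, not the authors' implementation: it stacks a toy point-to-point alignment term (standing in for the dense photometric/geometric residuals) with a simplified single-interval IMU propagation term, and minimises both jointly over the camera pose, velocity, IMU biases and gravity direction. The state layout, weighting and residual forms are assumptions made for illustration only.

```python
# Illustrative sketch of a tightly-coupled dense visual-inertial tracking
# cost (assumed structure, not the paper's exact formulation).
import numpy as np
from scipy.optimize import least_squares

def skew(w):
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def exp_so3(w):
    """Axis-angle vector -> rotation matrix (Rodrigues' formula)."""
    theta = np.linalg.norm(w)
    if theta < 1e-9:
        return np.eye(3) + skew(w)
    K = skew(w / theta)
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * K @ K

def unpack(x):
    R = exp_so3(x[0:3])                              # current orientation
    t, v = x[3:6], x[6:9]                            # position, velocity
    bg, ba = x[9:12], x[12:15]                       # gyro / accel biases
    g = 9.81 * x[15:18] / np.linalg.norm(x[15:18])   # gravity direction (fixed magnitude)
    return R, t, v, bg, ba, g

def residuals(x, map_pts, cur_pts, prev, imu, dt, w_imu=10.0):
    R, t, v, bg, ba, g = unpack(x)
    # Toy stand-in for the dense alignment term: transformed points of the
    # current frame should land on their associated map points.
    r_dense = (cur_pts @ R.T + t - map_pts).ravel()
    # Simplified IMU term: propagate the previous state with bias-corrected
    # measurements over one interval and penalise disagreement with the
    # optimised pose and velocity.
    R0, t0, v0 = prev
    a_w = R0 @ (imu["acc"] - ba) + g                 # world-frame acceleration
    R_pred = R0 @ exp_so3((imu["gyr"] - bg) * dt)
    v_pred = v0 + a_w * dt
    t_pred = t0 + v0 * dt + 0.5 * a_w * dt**2
    r_rot = (R_pred.T @ R - np.eye(3))[np.triu_indices(3, 1)]  # small-angle proxy
    r_imu = np.concatenate([r_rot, v - v_pred, t - t_pred])
    return np.concatenate([r_dense, w_imu * r_imu])

# Tiny synthetic usage: the previous frame is at rest at the origin, the IMU
# reading is stationary, and the current frame's points already lie on the
# map, so the optimiser pulls a perturbed initial pose back towards identity.
rng = np.random.default_rng(0)
map_pts = rng.normal(size=(50, 3))
cur_pts = map_pts.copy()
prev = (np.eye(3), np.zeros(3), np.zeros(3))                   # R0, t0, v0
imu = {"gyr": np.zeros(3), "acc": np.array([0.0, 0.0, 9.81])}  # stationary reading
x0 = np.zeros(18)
x0[3:6] = [0.1, -0.1, 0.05]      # perturbed initial translation
x0[15:18] = [0.0, 0.0, -1.0]     # initial guess: gravity points down
sol = least_squares(residuals, x0, args=(map_pts, cur_pts, prev, imu, 0.05))
print(np.round(sol.x[3:6], 4))   # recovered translation, close to zero
```

In the actual system the dense term comes from the surfel map rather than explicit point correspondences, and the IMU term would use proper preintegration over many measurements; this sketch only shows how the visual and inertial residuals enter a single joint least-squares problem.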
Original language | English |
---|---|
Title of host publication | IROS 2017 - IEEE/RSJ International Conference on Intelligent Robots and Systems |
Publisher | IEEE |
Pages | 6741-6748 |
Number of pages | 8 |
Volume | 2017-September |
ISBN (Electronic) | 9781538626825 |
DOIs | |
Publication status | Published - 13 Dec 2017 |
Event | 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2017 - Vancouver, Canada, 24 Sept 2017 → 28 Sept 2017 |
Conference
Conference | 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2017 |
---|---|
Country/Territory | Canada |
City | Vancouver |
Period | 24/09/17 → 28/09/17 |
Funding
The authors are with the Dyson Robotics Laboratory, Imperial College London, UK. Corresponding author: Tristan Laidlow, [email protected]. Research presented in this paper has been supported by Dyson Technology Ltd.
ASJC Scopus subject areas
- Control and Systems Engineering
- Software
- Computer Vision and Pattern Recognition
- Computer Science Applications