Dense RGB-D-inertial SLAM with map deformations

Tristan Laidlow, Michael Bloesch, Wenbin Li, Stefan Leutenegger

Research output: Chapter in a published conference proceeding

47 Citations (SciVal)


While dense visual SLAM methods are capable of estimating dense reconstructions of the environment, they suffer from a lack of robustness in their tracking step, especially when the optimisation is poorly initialised. Sparse visual SLAM systems have attained high levels of accuracy and robustness through the inclusion of inertial measurements in a tightly-coupled fusion. Inspired by this performance, we propose the first tightly-coupled dense RGB-D-inertial SLAM system. Our system has real-time capability while running on a GPU. It jointly optimises for the camera pose, velocity, IMU biases and gravity direction while building up a globally consistent, fully dense surfel-based 3D reconstruction of the environment. Through a series of experiments on both synthetic and real-world datasets, we show that our dense visual-inertial SLAM system is more robust to fast motions and periods of low texture and low geometric variation than a related RGB-D-only SLAM system.
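The key idea in the abstract is tight coupling: rather than tracking with dense image alignment alone and fusing inertial data afterwards, the dense geometric and photometric residuals and an IMU residual are minimised in a single joint objective. The following toy sketch illustrates that cost structure only; the function and weight names are hypothetical and not from the paper, and the real system optimises over pose, velocity, IMU biases and gravity direction on a GPU.

```python
def combined_cost(geom_residuals, photo_residuals, imu_residuals,
                  w_geom=1.0, w_photo=0.1, w_imu=1.0):
    """Toy weighted sum of squared residuals, mirroring the idea of
    fusing dense alignment terms with an inertial term in one objective.

    geom_residuals  -- dense geometric (ICP-style depth) residuals
    photo_residuals -- dense photometric (intensity) residuals
    imu_residuals   -- IMU preintegration residuals between keyframes
    Weights are illustrative placeholders, not values from the paper.
    """
    cost = w_geom * sum(r * r for r in geom_residuals)
    cost += w_photo * sum(r * r for r in photo_residuals)
    cost += w_imu * sum(r * r for r in imu_residuals)
    return cost

# A state that poorly explains the images but agrees with the IMU still
# incurs the dense terms, and vice versa: the minimiser must satisfy both.
example = combined_cost([1.0], [0.0], [2.0])  # 1.0 + 0.0 + 4.0 = 5.0
```

In a loosely-coupled design, each sensor would be processed in its own estimator and only the outputs merged; here a single minimisation trades the terms off against each other, which is what gives the robustness to fast motion and low-texture scenes described above.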

Original language: English
Title of host publication: IROS 2017 - IEEE/RSJ International Conference on Intelligent Robots and Systems
Number of pages: 8
ISBN (Electronic): 9781538626825
Publication status: Published - 13 Dec 2017
Event: 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2017 - Vancouver, Canada
Duration: 24 Sept 2017 - 28 Sept 2017


Conference: 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2017

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Computer Vision and Pattern Recognition
  • Computer Science Applications

