Dense RGB-D-inertial SLAM with map deformations

Tristan Laidlow, Michael Bloesch, Wenbin Li, Stefan Leutenegger

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution

19 Citations (Scopus)

Abstract

While dense visual SLAM methods are capable of estimating dense reconstructions of the environment, they suffer from a lack of robustness in their tracking step, especially when the optimisation is poorly initialised. Sparse visual SLAM systems have attained high levels of accuracy and robustness through the inclusion of inertial measurements in a tightly-coupled fusion. Inspired by this performance, we propose the first tightly-coupled dense RGB-D-inertial SLAM system. Our system has real-time capability while running on a GPU. It jointly optimises for the camera pose, velocity, IMU biases and gravity direction while building up a globally consistent, fully dense surfel-based 3D reconstruction of the environment. Through a series of experiments on both synthetic and real world datasets, we show that our dense visual-inertial SLAM system is more robust to fast motions and periods of low texture and low geometric variation than a related RGB-D-only SLAM system.
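
The abstract describes, at a high level, a joint optimisation over camera pose, velocity, IMU biases and gravity direction driven by dense RGB-D data. As a rough illustration only (not the authors' implementation), the sketch below shows the general shape of such a tightly-coupled tracking cost: dense geometric and photometric residuals against the map combined with an IMU term in a single objective. All names here (State, joint_energy, the weights) are assumptions made for this sketch, and the IMU term is a simplified stand-in for a full preintegration factor.

# Minimal sketch of a tightly-coupled dense RGB-D-inertial tracking cost.
# Assumed/illustrative names throughout; not the paper's implementation.
import numpy as np
from dataclasses import dataclass, field

@dataclass
class State:
    """Variables jointly optimised during tracking."""
    pose: np.ndarray = field(default_factory=lambda: np.eye(4))        # camera-to-world transform
    velocity: np.ndarray = field(default_factory=lambda: np.zeros(3))
    bias_gyro: np.ndarray = field(default_factory=lambda: np.zeros(3))
    bias_accel: np.ndarray = field(default_factory=lambda: np.zeros(3))
    gravity: np.ndarray = field(default_factory=lambda: np.array([0.0, 0.0, -9.81]))

def geometric_residuals(state, model_points, model_normals, live_points):
    """Point-to-plane (ICP-style) residuals between the live depth points,
    transformed into the world frame with the current pose, and the model."""
    R, t = state.pose[:3, :3], state.pose[:3, 3]
    live_world = live_points @ R.T + t
    return np.einsum('ij,ij->i', model_normals, live_world - model_points)

def photometric_residuals(model_intensity, live_intensity):
    """Per-pixel intensity differences between rendered model and live image."""
    return (live_intensity - model_intensity).ravel()

def imu_residual(state, pred_pose, pred_velocity, w_pose=1.0, w_vel=1.0):
    """Deviation of the optimised state from the IMU-propagated prediction;
    a simplified stand-in for a full IMU preintegration factor."""
    r_pos = state.pose[:3, 3] - pred_pose[:3, 3]
    r_vel = state.velocity - pred_velocity
    return np.concatenate([w_pose * r_pos, w_vel * r_vel])

def joint_energy(state, model_points, model_normals, live_points,
                 model_intensity, live_intensity, pred_pose, pred_velocity,
                 w_geo=1.0, w_rgb=0.1, w_imu=1.0):
    """Scalar cost whose minimiser gives the tightly-coupled tracking estimate."""
    e_geo = np.sum(geometric_residuals(state, model_points, model_normals, live_points) ** 2)
    e_rgb = np.sum(photometric_residuals(model_intensity, live_intensity) ** 2)
    e_imu = np.sum(imu_residual(state, pred_pose, pred_velocity) ** 2)
    return w_geo * e_geo + w_rgb * e_rgb + w_imu * e_imu

The point of the sketch is only that depth, image and inertial terms enter one objective rather than being fused loosely; the actual weighting, robust norms and state parameterisation would follow the paper.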

Original language: English
Title of host publication: IROS 2017 - IEEE/RSJ International Conference on Intelligent Robots and Systems
Publisher: IEEE
Pages: 6741-6748
Number of pages: 8
Volume: 2017-September
ISBN (Electronic): 9781538626825
DOIs: 10.1109/IROS.2017.8206591
Publication status: Published - 13 Dec 2017
Event: 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2017 - Vancouver, Canada
Duration: 24 Sep 2017 - 28 Sep 2017

Conference

Conference: 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2017
Country: Canada
City: Vancouver
Period: 24/09/17 - 28/09/17

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Computer Vision and Pattern Recognition
  • Computer Science Applications

Cite this

Laidlow, T., Bloesch, M., Li, W., & Leutenegger, S. (2017). Dense RGB-D-inertial SLAM with map deformations. In IROS 2017 - IEEE/RSJ International Conference on Intelligent Robots and Systems (Vol. 2017-September, pp. 6741-6748). [8206591] IEEE. https://doi.org/10.1109/IROS.2017.8206591

@inproceedings{3c5793a708b9413ba0afbdc85bf96449,
title = "Dense RGB-D-inertial SLAM with map deformations",
abstract = "While dense visual SLAM methods are capable of estimating dense reconstructions of the environment, they suffer from a lack of robustness in their tracking step, especially when the optimisation is poorly initialised. Sparse visual SLAM systems have attained high levels of accuracy and robustness through the inclusion of inertial measurements in a tightly-coupled fusion. Inspired by this performance, we propose the first tightly-coupled dense RGB-D-inertial SLAM system. Our system has real-time capability while running on a GPU. It jointly optimises for the camera pose, velocity, IMU biases and gravity direction while building up a globally consistent, fully dense surfel-based 3D reconstruction of the environment. Through a series of experiments on both synthetic and real world datasets, we show that our dense visual-inertial SLAM system is more robust to fast motions and periods of low texture and low geometric variation than a related RGB-D-only SLAM system.",
author = "Tristan Laidlow and Michael Bloesch and Wenbin Li and Stefan Leutenegger",
year = "2017",
month = "12",
day = "13",
doi = "10.1109/IROS.2017.8206591",
language = "English",
volume = "2017-September",
pages = "6741--6748",
booktitle = "IROS 2017 - IEEE/RSJ International Conference on Intelligent Robots and Systems",
publisher = "IEEE",
address = "USA United States",

}

TY - GEN

T1 - Dense RGB-D-inertial SLAM with map deformations

AU - Laidlow, Tristan

AU - Bloesch, Michael

AU - Li, Wenbin

AU - Leutenegger, Stefan

PY - 2017/12/13

Y1 - 2017/12/13

N2 - While dense visual SLAM methods are capable of estimating dense reconstructions of the environment, they suffer from a lack of robustness in their tracking step, especially when the optimisation is poorly initialised. Sparse visual SLAM systems have attained high levels of accuracy and robustness through the inclusion of inertial measurements in a tightly-coupled fusion. Inspired by this performance, we propose the first tightly-coupled dense RGB-D-inertial SLAM system. Our system has real-time capability while running on a GPU. It jointly optimises for the camera pose, velocity, IMU biases and gravity direction while building up a globally consistent, fully dense surfel-based 3D reconstruction of the environment. Through a series of experiments on both synthetic and real world datasets, we show that our dense visual-inertial SLAM system is more robust to fast motions and periods of low texture and low geometric variation than a related RGB-D-only SLAM system.

AB - While dense visual SLAM methods are capable of estimating dense reconstructions of the environment, they suffer from a lack of robustness in their tracking step, especially when the optimisation is poorly initialised. Sparse visual SLAM systems have attained high levels of accuracy and robustness through the inclusion of inertial measurements in a tightly-coupled fusion. Inspired by this performance, we propose the first tightly-coupled dense RGB-D-inertial SLAM system. Our system has real-time capability while running on a GPU. It jointly optimises for the camera pose, velocity, IMU biases and gravity direction while building up a globally consistent, fully dense surfel-based 3D reconstruction of the environment. Through a series of experiments on both synthetic and real world datasets, we show that our dense visual-inertial SLAM system is more robust to fast motions and periods of low texture and low geometric variation than a related RGB-D-only SLAM system.

UR - http://www.scopus.com/inward/record.url?scp=85041943947&partnerID=8YFLogxK

U2 - 10.1109/IROS.2017.8206591

DO - 10.1109/IROS.2017.8206591

M3 - Conference contribution

VL - 2017-September

SP - 6741

EP - 6748

BT - IROS 2017 - IEEE/RSJ International Conference on Intelligent Robots and Systems

PB - IEEE

ER -