InteriorNet: Mega-scale Multi-sensor Photo-realistic Indoor Scenes Dataset

Wenbin Li, Sajad Saeedi, John McCormac, Ronald Clark, Dimos Tzoumanikas, Qing Ye, Yuzhong Huang, Rui Tang, Stefan Leutenegger

Research output: Contribution to conference › Paper


Abstract

Datasets have gained an enormous amount of popularity in the computer vision community, from training and evaluation of Deep Learning-based methods to benchmarking Simultaneous Localization and Mapping (SLAM). Without a doubt, synthetic imagery bears vast potential due to its scalability: large amounts of data can be obtained without tedious manual ground-truth annotation or measurement. Here, we present a dataset with the aim of providing a higher degree of photo-realism, larger scale, and more variability, as well as serving a wider range of purposes, than existing datasets. Our dataset leverages the availability of millions of professional interior designs and millions of production-level furniture and object assets -- all coming with fine geometric detail and high-resolution textures. We render high-resolution, high frame-rate video sequences following realistic trajectories, supporting various camera types and providing inertial measurements. Together with the release of the dataset, we will make the executable program of our interactive simulator software, as well as our renderer, available at https://interiornetdataset.github.io. To showcase the usability and uniqueness of our dataset, we show benchmarking results of both sparse and dense SLAM algorithms.
Original language: English
Publication status: Published - 6 Sep 2018
Event: 29th British Machine Vision Conference 2018 - Northumbria University, Newcastle
Duration: 3 Sep 2018 - 6 Sep 2018
http://bmvc2018.org/

Conference

Conference: 29th British Machine Vision Conference 2018
Abbreviated title: BMVC 2018
City: Newcastle
Period: 3/09/18 - 6/09/18
Internet address: http://bmvc2018.org/

Cite this

Li, W., Saeedi, S., McCormac, J., Clark, R., Tzoumanikas, D., Ye, Q., ... Leutenegger, S. (2018). InteriorNet: Mega-scale Multi-sensor Photo-realistic Indoor Scenes Dataset. Paper presented at 29th British Machine Vision Conference 2018, Newcastle.

InteriorNet: Mega-scale Multi-sensor Photo-realistic Indoor Scenes Dataset. / Li, Wenbin; Saeedi, Sajad; McCormac, John; Clark, Ronald; Tzoumanikas, Dimos; Ye, Qing; Huang, Yuzhong; Tang, Rui; Leutenegger, Stefan.

2018. Paper presented at 29th British Machine Vision Conference 2018, Newcastle.

Research output: Contribution to conference › Paper

Li, W, Saeedi, S, McCormac, J, Clark, R, Tzoumanikas, D, Ye, Q, Huang, Y, Tang, R & Leutenegger, S 2018, 'InteriorNet: Mega-scale Multi-sensor Photo-realistic Indoor Scenes Dataset', Paper presented at 29th British Machine Vision Conference 2018, Newcastle, 3/09/18 - 6/09/18.
Li W, Saeedi S, McCormac J, Clark R, Tzoumanikas D, Ye Q et al. InteriorNet: Mega-scale Multi-sensor Photo-realistic Indoor Scenes Dataset. 2018. Paper presented at 29th British Machine Vision Conference 2018, Newcastle.
Li, Wenbin ; Saeedi, Sajad ; McCormac, John ; Clark, Ronald ; Tzoumanikas, Dimos ; Ye, Qing ; Huang, Yuzhong ; Tang, Rui ; Leutenegger, Stefan. / InteriorNet: Mega-scale Multi-sensor Photo-realistic Indoor Scenes Dataset. Paper presented at 29th British Machine Vision Conference 2018, Newcastle.
@conference{05c869ea074b4bc3b4a4ccf4c2692f83,
title = "InteriorNet: Mega-scale Multi-sensor Photo-realistic Indoor Scenes Dataset",
abstract = "Datasets have gained an enormous amount of popularity in the computer vision community, from training and evaluation of Deep Learning-based methods to benchmarking Simultaneous Localization and Mapping (SLAM). Without a doubt, synthetic imagery bears vast potential due to its scalability: large amounts of data can be obtained without tedious manual ground-truth annotation or measurement. Here, we present a dataset with the aim of providing a higher degree of photo-realism, larger scale, and more variability, as well as serving a wider range of purposes, than existing datasets. Our dataset leverages the availability of millions of professional interior designs and millions of production-level furniture and object assets -- all coming with fine geometric detail and high-resolution textures. We render high-resolution, high frame-rate video sequences following realistic trajectories, supporting various camera types and providing inertial measurements. Together with the release of the dataset, we will make the executable program of our interactive simulator software, as well as our renderer, available at https://interiornetdataset.github.io. To showcase the usability and uniqueness of our dataset, we show benchmarking results of both sparse and dense SLAM algorithms.",
author = "Wenbin Li and Sajad Saeedi and John McCormac and Ronald Clark and Dimos Tzoumanikas and Qing Ye and Yuzhong Huang and Rui Tang and Stefan Leutenegger",
year = "2018",
month = "9",
day = "6",
language = "English",
note = "29th British Machine Vision Conference 2018, BMVC 2018 ; Conference date: 03-09-2018 Through 06-09-2018",
url = "http://bmvc2018.org/",

}

TY - CONF

T1 - InteriorNet: Mega-scale Multi-sensor Photo-realistic Indoor Scenes Dataset

AU - Li, Wenbin

AU - Saeedi, Sajad

AU - McCormac, John

AU - Clark, Ronald

AU - Tzoumanikas, Dimos

AU - Ye, Qing

AU - Huang, Yuzhong

AU - Tang, Rui

AU - Leutenegger, Stefan

PY - 2018/9/6

Y1 - 2018/9/6

N2 - Datasets have gained an enormous amount of popularity in the computer vision community, from training and evaluation of Deep Learning-based methods to benchmarking Simultaneous Localization and Mapping (SLAM). Without a doubt, synthetic imagery bears vast potential due to its scalability: large amounts of data can be obtained without tedious manual ground-truth annotation or measurement. Here, we present a dataset with the aim of providing a higher degree of photo-realism, larger scale, and more variability, as well as serving a wider range of purposes, than existing datasets. Our dataset leverages the availability of millions of professional interior designs and millions of production-level furniture and object assets -- all coming with fine geometric detail and high-resolution textures. We render high-resolution, high frame-rate video sequences following realistic trajectories, supporting various camera types and providing inertial measurements. Together with the release of the dataset, we will make the executable program of our interactive simulator software, as well as our renderer, available at https://interiornetdataset.github.io. To showcase the usability and uniqueness of our dataset, we show benchmarking results of both sparse and dense SLAM algorithms.

AB - Datasets have gained an enormous amount of popularity in the computer vision community, from training and evaluation of Deep Learning-based methods to benchmarking Simultaneous Localization and Mapping (SLAM). Without a doubt, synthetic imagery bears vast potential due to its scalability: large amounts of data can be obtained without tedious manual ground-truth annotation or measurement. Here, we present a dataset with the aim of providing a higher degree of photo-realism, larger scale, and more variability, as well as serving a wider range of purposes, than existing datasets. Our dataset leverages the availability of millions of professional interior designs and millions of production-level furniture and object assets -- all coming with fine geometric detail and high-resolution textures. We render high-resolution, high frame-rate video sequences following realistic trajectories, supporting various camera types and providing inertial measurements. Together with the release of the dataset, we will make the executable program of our interactive simulator software, as well as our renderer, available at https://interiornetdataset.github.io. To showcase the usability and uniqueness of our dataset, we show benchmarking results of both sparse and dense SLAM algorithms.

M3 - Paper

ER -