Live User-Guided Intrinsic Video for Static Scenes

Abhimitra Meka, Gereon Fox, Michael Zollhöfer, Christian Richardt, Christian Theobalt

Research output: Contribution to journal › Article

4 Citations (Scopus)
70 Downloads (Pure)

Abstract

We present a novel real-time approach for user-guided intrinsic decomposition of static scenes captured by an RGB-D sensor. In the first step, we acquire a three-dimensional representation of the scene using a dense volumetric reconstruction framework. The obtained reconstruction serves as a proxy to densely fuse reflectance estimates and to store user-provided constraints in three-dimensional space. User constraints, in the form of constant-shading and constant-reflectance strokes, can be placed directly on the real-world geometry using an intuitive touch-based interaction metaphor, or using interactive mouse strokes. Fusing the decomposition results and constraints in three-dimensional space allows for robust propagation of this information to novel views by re-projection. We leverage this information to improve the decomposition quality of existing intrinsic video decomposition techniques by further constraining the ill-posed decomposition problem. Beyond improved decomposition quality, we demonstrate a variety of live augmented reality applications, such as recoloring of objects, relighting of scenes, and editing of material appearance.
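The decomposition at the heart of the method factors each input image I into a reflectance (albedo) layer R and a shading layer S, I = R · S, which is ill-posed from a single frame; the approach therefore accumulates reflectance evidence and user constraints on the 3D reconstruction rather than in image space. The sketch below illustrates the two mechanisms the abstract describes: a running weighted average for fusing per-frame reflectance estimates into a voxel grid, and pinhole re-projection for propagating 3D-anchored user strokes into a novel view. All names, the grid resolution, and the camera intrinsics are illustrative assumptions, not details taken from the paper.

import numpy as np

# Minimal sketch, assuming a regular voxel grid and a pinhole camera.
# Grid size, intrinsics, and all function names are hypothetical.

GRID = 64
reflectance = np.zeros((GRID, GRID, GRID, 3))  # running reflectance mean per voxel
weight = np.zeros((GRID, GRID, GRID))          # accumulated fusion weight per voxel

def fuse_reflectance(voxel, estimate, w=1.0):
    """Fold one per-frame RGB reflectance estimate into the voxel's
    running weighted average (the accumulation scheme volumetric
    depth-fusion frameworks use for TSDF values)."""
    i, j, k = voxel
    total = weight[i, j, k] + w
    reflectance[i, j, k] = (weight[i, j, k] * reflectance[i, j, k]
                            + w * np.asarray(estimate)) / total
    weight[i, j, k] = total

def project(point_world, pose_w2c, K):
    """Pinhole projection of a world-space point into a camera view;
    returns pixel coordinates and camera-space depth."""
    p = pose_w2c[:3, :3] @ point_world + pose_w2c[:3, 3]
    uvw = K @ p
    return uvw[:2] / uvw[2], p[2]

# Fuse a hypothetical reflectance estimate observed at one voxel:
fuse_reflectance((10, 20, 30), np.array([0.8, 0.4, 0.2]))

# A user stroke placed on the reconstructed geometry is stored as 3D
# points; propagating it to a new frame is plain re-projection:
K = np.array([[525.0, 0.0, 319.5],   # typical 640x480 RGB-D intrinsics
              [0.0, 525.0, 239.5],
              [0.0, 0.0, 1.0]])
pose = np.eye(4)                      # novel-view world-to-camera pose
(u, v), depth = project(np.array([0.1, 0.2, 1.5]), pose, K)
# ...pixel (u, v) would then be tagged as a constant-reflectance
# constraint in the per-frame decomposition solve (I = R * S).

A full system would additionally test each re-projected constraint against the fused depth to handle occlusion; the sketch omits such checks for brevity.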
Original language: English
Journal: IEEE Transactions on Visualization and Computer Graphics
DOI: 10.1109/TVCG.2017.2734425
Publication status: Published - 11 Aug 2017
Event: IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Nantes, France
Duration: 9–13 Oct 2017
Conference number: 16
https://ismar2017.sciencesconf.org/
Project page: https://gvv.mpi-inf.mpg.de/projects/InteractiveIntrinsicAR/

Keywords

  • intrinsic video decomposition
  • reflectance fusion
  • user-guided shading refinement

Cite this

Live User-Guided Intrinsic Video for Static Scenes. / Meka, Abhimitra; Fox, Gereon; Zollhöfer, Michael; Richardt, Christian; Theobalt, Christian.

In: IEEE Transactions on Visualization and Computer Graphics, 11.08.2017.

Research output: Contribution to journal › Article

@article{2c95a67b5e204564bf92ab3e70d25939,
  title     = "Live User-Guided Intrinsic Video for Static Scenes",
  author    = "Abhimitra Meka and Gereon Fox and Michael Zollh{\"o}fer and Christian Richardt and Christian Theobalt",
  journal   = "IEEE Transactions on Visualization and Computer Graphics",
  year      = "2017",
  month     = "8",
  day       = "11",
  doi       = "10.1109/TVCG.2017.2734425",
  issn      = "1077-2626",
  publisher = "IEEE",
  language  = "English",
  keywords  = "intrinsic video decomposition, reflectance fusion, user-guided shading refinement",
}
