Live User-Guided Intrinsic Video for Static Scenes

Abhimitra Meka, Gereon Fox, Michael Zollhöfer, Christian Richardt, Christian Theobalt

Research output: Contribution to journal › Article › peer-review


Abstract

We present a novel real-time approach for user-guided intrinsic decomposition of static scenes captured by an RGB-D sensor. In the first step, we acquire a three-dimensional representation of the scene using a dense volumetric reconstruction framework. The obtained reconstruction serves as a proxy to densely fuse reflectance estimates and to store user-provided constraints in three-dimensional space. User constraints, in the form of constant-shading and constant-reflectance strokes, can be placed directly on the real-world geometry using an intuitive touch-based interaction metaphor, or via interactive mouse strokes. Fusing the decomposition results and constraints in three-dimensional space allows for robust propagation of this information to novel views by re-projection. We leverage this information to improve the decomposition quality of existing intrinsic video decomposition techniques by further constraining the ill-posed decomposition problem. In addition to improved decomposition quality, we demonstrate a variety of live augmented-reality applications, such as recoloring of objects, relighting of scenes, and editing of material appearance.
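The abstract describes a pipeline with three building blocks: per-frame intrinsic decomposition (I = R · S), fusion of reflectance estimates into the volumetric proxy, and re-projection of 3D-anchored constraints into novel views. The NumPy sketch below illustrates these ideas under strong simplifications; every function name here, the chromaticity-based decomposition baseline, and the running-average fusion rule are illustrative assumptions for exposition, not the authors' actual real-time solver.

```python
import numpy as np


def decompose_intrinsic(image, eps=1e-4):
    """Split an RGB image into reflectance R and shading S with I = R * S.

    Placeholder decomposition: shading is approximated by image intensity
    and reflectance by chromaticity. This is a common ill-posed baseline;
    the paper's solver additionally exploits user strokes and fused
    reflectance estimates to constrain the problem.
    """
    intensity = image.mean(axis=2, keepdims=True)  # crude shading proxy, (H, W, 1)
    shading = np.clip(intensity, eps, None)        # avoid division by zero
    reflectance = image / shading                  # I = R * S  =>  R = I / S
    return reflectance, shading


def fuse_reflectance(voxel_albedo, voxel_weight, new_albedo, new_weight=1.0):
    """Fuse a new per-voxel reflectance estimate by running weighted average,
    analogous to TSDF-style fusion of depth measurements."""
    total = voxel_weight + new_weight
    fused = (voxel_albedo * voxel_weight + new_albedo * new_weight) / total
    return fused, total


def reproject_constraints(points_world, K, pose_world_to_cam):
    """Project 3D stroke samples into a novel view.

    points_world:      (N, 3) stroke samples on the reconstructed surface.
    K:                 (3, 3) camera intrinsics.
    pose_world_to_cam: (4, 4) rigid transform of the novel view.
    Returns (N, 2) pixel coordinates of the re-projected constraints.
    """
    homog = np.hstack([points_world, np.ones((len(points_world), 1))])
    cam = (pose_world_to_cam @ homog.T).T[:, :3]   # world -> camera space
    pix = (K @ cam.T).T                            # camera -> image plane
    return pix[:, :2] / pix[:, 2:3]                # perspective divide
```

Because both the fused reflectance and the strokes live on the 3D reconstruction rather than in any single frame, a constraint drawn once remains valid for every subsequent view; re-projection alone transfers it, which is what makes the live, user-in-the-loop refinement described in the abstract possible.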
Original language: English
Journal: IEEE Transactions on Visualization and Computer Graphics
DOIs
Publication status: Published - 11 Aug 2017
Event: IEEE International Symposium on Mixed and Augmented Reality - Nantes, France
Duration: 9 Oct 2017 – 13 Oct 2017
Conference number: 16
https://ismar2017.sciencesconf.org/

Keywords

  • intrinsic video decomposition
  • reflectance fusion
  • user-guided shading refinement
