Dense Wide-Baseline Scene Flow From Two Handheld Video Cameras

Christian Richardt, Hyeongwoo Kim, Levi Valgaerts, Christian Theobalt

Research output: Chapter or section in a book/report/conference proceeding



We propose a new technique for computing dense scene flow from two handheld videos with wide camera baselines and different photometric properties due to different sensors or camera settings such as exposure and white balance. Our technique improves on existing methods in two ways: (1) it supports independently moving cameras, and (2) it computes dense scene flow for wide-baseline scenarios. We achieve this by combining state-of-the-art wide-baseline correspondence finding with a variational scene flow formulation. First, we compute dense, wide-baseline correspondences using DAISY descriptors for matching between cameras and over time. We then detect and replace occluded pixels in the correspondence fields using a novel edge-preserving Laplacian correspondence completion technique. We finally refine the computed correspondence fields in a variational scene flow formulation. We show dense scene flow results computed from challenging datasets with independently moving, handheld cameras of varying camera settings.
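The occlusion-handling step above fills holes in the correspondence field by an edge-preserving Laplacian completion: occluded pixels are reconstructed from their neighbours, with smoothing damped across strong image edges. The sketch below illustrates this idea with a simple edge-weighted Jacobi solve over the occluded region; the function name, parameters, and the Jacobi iteration are our assumptions for illustration, not the paper's actual solver.

```python
import numpy as np

def laplacian_completion(flow, occluded, image, sigma=0.1, iters=500):
    """Fill occluded entries of a correspondence (flow) field.

    Solves a weighted Laplace equation by Jacobi iteration: each occluded
    pixel becomes the edge-weighted average of its 4-neighbours, so filled
    values do not bleed across strong image edges. Illustrative sketch only
    (uses wrap-around boundaries via np.roll for brevity).

    flow:     H x W x 2 correspondence field
    occluded: H x W boolean mask of pixels to fill
    image:    H x W intensity image used for edge weights
    """
    flow = flow.astype(float).copy()
    flow[occluded] = 0.0  # initialise unknowns
    H, W = occluded.shape
    for _ in range(iters):
        acc = np.zeros_like(flow)
        wsum = np.zeros((H, W))
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            src = np.roll(flow, (dy, dx), axis=(0, 1))
            # edge-preserving weight: small where image intensity jumps
            diff = image - np.roll(image, (dy, dx), axis=(0, 1))
            w = np.exp(-(diff ** 2) / (2.0 * sigma ** 2))
            acc += w[..., None] * src
            wsum += w
        filled = acc / wsum[..., None]
        # only occluded pixels are updated; known correspondences are fixed
        flow[occluded] = filled[occluded]
    return flow
```

On a constant image the weights are uniform and the fill reduces to harmonic interpolation of the surrounding correspondences; near image edges the exponential weights suppress averaging across the edge.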
Original language: English
Title of host publication: Proceedings of the 2016 Fourth International Conference on 3D Vision
Subtitle of host publication: 25–28 October 2016, Stanford, California, USA
Place of Publication: Los Alamitos, CA, USA
Number of pages: 10
ISBN (Print): 978-1-5090-5407-7
Publication status: Published - 25 Oct 2016
Event: International Conference on 3D Vision - Stanford University, Palo Alto, United States
Duration: 25 Oct 2016 – 28 Oct 2016


Conference: International Conference on 3D Vision
Abbreviated title: 3DV
Country/Territory: United States
City: Palo Alto


