Depth Augmented Omnidirectional Stereo for 6-DoF VR Photography

Tobias Bertel, Moritz Mühlhausen, Moritz Kappel, Paul M. Bittner, Christian Richardt, Marcus Magnor

Research output: Contribution to conference › Poster › peer-review



We present an end-to-end pipeline that enables head-motion parallax for omnidirectional stereo (ODS) panoramas. Given an ODS panorama containing a left-eye and a right-eye view, our method estimates dense horizontal disparity fields between the stereo image pair. From these, we compute a depth augmented stereo panorama (DASP) by explicitly reconstructing the scene geometry from the viewing circle corresponding to the ODS representation. The generated DASP representation supports motion parallax within the ODS viewing circle, and our approach operates directly on existing ODS panoramas. Experiments on multiple real-world ODS panoramas demonstrate the robustness and versatility of our approach.
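The geometry-reconstruction step can be illustrated with a simplified model (an assumption for illustration, not the paper's exact formulation): in ODS, left- and right-eye rays are tangent to a viewing circle of radius r (half the interpupillary distance), so a scene point whose angular disparity between the two panoramas is d lies at a distance of roughly r / sin(d/2) from the circle's centre. A minimal per-pixel sketch:

```python
import math

def ods_disparity_to_depth(disparity_rad, radius=0.032):
    """Triangulate depth (metres) from ODS angular disparity (radians).

    Simplified model: both eye rays are tangent to a viewing circle of
    radius `radius` (default 3.2 cm, half a typical 6.4 cm IPD), giving
    depth = radius / sin(disparity / 2). Applied per pixel, this turns a
    dense horizontal disparity field into a depth panorama.
    """
    # Clamp near-zero disparity (points at infinity) to avoid division by zero.
    d = max(disparity_rad, 1e-6)
    return radius / math.sin(d / 2.0)
```

Applying this function to every pixel of the estimated disparity field yields the depth layer of the DASP; the exact triangulation used in the paper may differ.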

This project has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 665992.
Original language: English
Number of pages: 2
Publication status: Published - 22 Mar 2020
Event: 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (online), Atlanta, United States
Duration: 22 Mar 2020 - 26 Mar 2020


Conference: 2020 IEEE Conference on Virtual Reality and 3D User Interfaces
Abbreviated title: IEEE VR
Country/Territory: United States


