MegaParallax: 360° Panoramas with Motion Parallax

Tobias Bertel, Christian Richardt

Research output: Contribution to conference › Poster › peer-review

Abstract

Capturing 360° panoramas has become straightforward now that this functionality is implemented on every phone. However, it remains difficult to capture immersive 360° panoramas with motion parallax, which provide different views for different viewpoints. Alternatives such as omnidirectional stereo panoramas provide different views for each eye (binocular disparity), but do not support motion parallax, while Casual 3D Photography [Hedman et al. 2017] reconstructs textured 3D geometry that provides motion parallax but suffers from reconstruction artefacts. We propose a new image-based approach for capturing and rendering high-quality 360° panoramas with motion parallax. We use novel-view synthesis with flow-based blending to turn a standard monoscopic video into an enriched 360° panoramic experience that can be explored in real time. Our approach makes it possible for casual consumers to capture and view high-quality 360° panoramas with motion parallax.
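The poster itself contains no code, but the core idea of flow-based blending can be sketched compactly. The snippet below is a hypothetical illustration, not the authors' implementation: it synthesises an in-between view from two neighbouring frames of a capture video by warping each frame along dense optical flow and linearly blending the warped results. OpenCV's Farnebäck flow stands in for the paper's flow fields, the function name flow_based_blend and the viewpoint parameter alpha are assumptions, and the backward-warping step approximates the flow at the novel view with the flow at the destination pixel grid.

```python
import cv2
import numpy as np

def flow_based_blend(left, right, alpha):
    """Hypothetical sketch of flow-based blending between two
    neighbouring frames.

    left, right: neighbouring video frames (H x W x 3, uint8)
    alpha: position of the novel viewpoint in [0, 1],
           0 = at `left`, 1 = at `right`
    """
    gray_l = cv2.cvtColor(left, cv2.COLOR_BGR2GRAY)
    gray_r = cv2.cvtColor(right, cv2.COLOR_BGR2GRAY)

    # Dense optical flow in both directions (Farnebäck as a stand-in
    # for whatever flow estimator a real pipeline would use).
    flow_lr = cv2.calcOpticalFlowFarneback(gray_l, gray_r, None,
                                           0.5, 3, 15, 3, 5, 1.2, 0)
    flow_rl = cv2.calcOpticalFlowFarneback(gray_r, gray_l, None,
                                           0.5, 3, 15, 3, 5, 1.2, 0)

    h, w = gray_l.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))

    # Backward warping: sample each source frame part-way along the
    # flow towards the novel viewpoint. Using the flow at the target
    # grid is an approximation; it ignores occlusions.
    map_lx = (grid_x - alpha * flow_lr[..., 0]).astype(np.float32)
    map_ly = (grid_y - alpha * flow_lr[..., 1]).astype(np.float32)
    map_rx = (grid_x - (1.0 - alpha) * flow_rl[..., 0]).astype(np.float32)
    map_ry = (grid_y - (1.0 - alpha) * flow_rl[..., 1]).astype(np.float32)

    warped_l = cv2.remap(left, map_lx, map_ly, cv2.INTER_LINEAR)
    warped_r = cv2.remap(right, map_rx, map_ry, cv2.INTER_LINEAR)

    # Linear blend weighted by proximity to each source frame.
    return cv2.addWeighted(warped_l, 1.0 - alpha, warped_r, alpha, 0.0)

# Example use (file names are placeholders):
# novel = flow_based_blend(cv2.imread("frame_000.png"),
#                          cv2.imread("frame_001.png"), 0.5)
```

In the full method the blending weights would depend on the desired viewing ray relative to the capture circle rather than a single scalar alpha; the two-frame linear blend above is the simplest instance of the idea.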


This project has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 665992.
Original language: English
DOIs: https://doi.org/10.1145/3230744.3230793
Publication status: Published - 12 Aug 2018
Event: SIGGRAPH 2018 - Vancouver, Canada
Duration: 12 Aug 2018 – 16 Aug 2018
https://s2018.siggraph.org/

Conference

Conference: SIGGRAPH 2018
Country/Territory: Canada
City: Vancouver
Period: 12/08/18 – 16/08/18
Internet address: https://s2018.siggraph.org/

Bibliographical note

SIGGRAPH ’18 Posters, August 12–16, 2018, Vancouver, BC, Canada
© 2018 Copyright held by the owner/author(s).
ACM ISBN 978-1-4503-5817-0/18/08.
https://doi.org/10.1145/3230744.3230793

Keywords

  • image-based rendering
  • novel-view synthesis
  • plenoptic modeling
