MegaParallax: 360° Panoramas with Motion Parallax

Research output: Contribution to conference › Poster

Abstract

Capturing 360° panoramas has become straightforward now that this functionality is implemented on every phone. However, it remains difficult to capture immersive 360° panoramas with motion parallax, which provide different views for different viewpoints. Alternatives such as omnidirectional stereo panoramas provide different views for each eye (binocular disparity), but do not support motion parallax, while Casual 3D Photography [Hedman et al. 2017] reconstructs textured 3D geometry that provides motion parallax but suffers from reconstruction artefacts. We propose a new image-based approach for capturing and rendering high-quality 360° panoramas with motion parallax. We use novel-view synthesis with flow-based blending to turn a standard monoscopic video into an enriched 360° panoramic experience that can be explored in real time. Our approach makes it possible for casual consumers to capture and view high-quality 360° panoramas with motion parallax.
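
The core idea of flow-based blending can be illustrated with a minimal sketch. This is an assumption of how such blending could look, not the authors' implementation: given two neighbouring captured frames, precomputed optical flow between them, and a blending weight describing where the desired viewpoint lies between the two capture positions, both frames are warped towards the novel view and cross-faded. The function names, the NumPy-based nearest-neighbour warping, and the simple linear cross-fade are illustrative choices.

# Minimal flow-based blending sketch (illustrative, not the authors' code).
# Assumes: left/right are (H, W, 3) float images from neighbouring capture
# positions, flow_lr/flow_rl are precomputed (H, W, 2) optical-flow fields
# between them, and alpha in [0, 1] places the novel viewpoint between the
# left (0) and right (1) camera positions.
import numpy as np

def warp_towards(image, flow_to_other, t):
    # Backward-warp `image` a fraction t of the way towards the other view,
    # using the approximation I_t(y) ~= image(y - t * flow_to_other(y)).
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    sx = np.clip(xs - t * flow_to_other[..., 0], 0, w - 1)
    sy = np.clip(ys - t * flow_to_other[..., 1], 0, h - 1)
    # Nearest-neighbour sampling keeps the sketch dependency-free.
    return image[sy.round().astype(int), sx.round().astype(int)]

def blend_views(left, right, flow_lr, flow_rl, alpha):
    # Warp both neighbouring frames towards the in-between viewpoint and
    # cross-fade them; here with a simple scalar linear weight.
    warped_left = warp_towards(left, flow_lr, alpha)
    warped_right = warp_towards(right, flow_rl, 1.0 - alpha)
    return (1.0 - alpha) * warped_left + alpha * warped_right

In a full system, the blending weight would likely vary per pixel with the desired viewing ray relative to the two camera positions; the single scalar alpha above is a simplification for illustration.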

Conference

Conference: SIGGRAPH 2018
Country: Canada
City: Vancouver
Period: 12/08/18 – 16/08/18
Internet address: https://s2018.siggraph.org/

Keywords

  • image-based rendering
  • novel-view synthesis
  • plenoptic modeling

Cite this

Bertel, Tobias; Richardt, Christian. MegaParallax: 360° Panoramas with Motion Parallax. 2018. Poster session presented at SIGGRAPH 2018, Vancouver, Canada.

@conference{8c3f780ae68944139c1e395bfb4979a4,
title = "MegaParallax: 360° Panoramas with Motion Parallax",
abstract = "Capturing 360° panoramas has become straightforward now that this functionality is implemented on every phone. However, it remains difficult to capture immersive 360° panoramas with motion parallax, which provide different views for different viewpoints. Alternatives such as omnidirectional stereo panoramas provide different views for each eye (binocular disparity), but do not support motion parallax, while Casual 3D Photography [Hedman et al. 2017] reconstructs textured 3D geometry that provides motion parallax but suffers from reconstruction artefacts. We propose a new image-based approach for capturing and rendering high-quality 360° panoramas with motion parallax. We use novel-view synthesis with flow-based blending to turn a standard monoscopic video into an enriched 360° panoramic experience that can be explored in real time. Our approach makes it possible for casual consumers to capture and view high-quality 360° panoramas with motion parallax.",
keywords = "image-based rendering, novel-view synthesis, plenoptic modeling",
author = "Tobias Bertel and Christian Richardt",
year = "2018",
month = "8",
day = "12",
doi = "10.1145/3230744.3230793",
language = "English",
note = "SIGGRAPH 2018 ; Conference date: 12-08-2018 Through 16-08-2018",
url = "https://s2018.siggraph.org/",

}

TY - CONF
T1 - MegaParallax: 360° Panoramas with Motion Parallax
AU - Bertel, Tobias
AU - Richardt, Christian
PY - 2018/8/12
Y1 - 2018/8/12
N2 - Capturing 360° panoramas has become straightforward now that this functionality is implemented on every phone. However, it remains difficult to capture immersive 360° panoramas with motion parallax, which provide different views for different viewpoints. Alternatives such as omnidirectional stereo panoramas provide different views for each eye (binocular disparity), but do not support motion parallax, while Casual 3D Photography [Hedman et al. 2017] reconstructs textured 3D geometry that provides motion parallax but suffers from reconstruction artefacts. We propose a new image-based approach for capturing and rendering high-quality 360° panoramas with motion parallax. We use novel-view synthesis with flow-based blending to turn a standard monoscopic video into an enriched 360° panoramic experience that can be explored in real time. Our approach makes it possible for casual consumers to capture and view high-quality 360° panoramas with motion parallax.
AB - Capturing 360° panoramas has become straightforward now that this functionality is implemented on every phone. However, it remains difficult to capture immersive 360° panoramas with motion parallax, which provide different views for different viewpoints. Alternatives such as omnidirectional stereo panoramas provide different views for each eye (binocular disparity), but do not support motion parallax, while Casual 3D Photography [Hedman et al. 2017] reconstructs textured 3D geometry that provides motion parallax but suffers from reconstruction artefacts. We propose a new image-based approach for capturing and rendering high-quality 360° panoramas with motion parallax. We use novel-view synthesis with flow-based blending to turn a standard monoscopic video into an enriched 360° panoramic experience that can be explored in real time. Our approach makes it possible for casual consumers to capture and view high-quality 360° panoramas with motion parallax.
KW - image-based rendering
KW - novel-view synthesis
KW - plenoptic modeling
UR - https://richardt.name/publications/megaparallax/
U2 - 10.1145/3230744.3230793
DO - 10.1145/3230744.3230793
M3 - Poster
ER -