Image-Based Scene Representations for Head-Motion Parallax in 360° Panoramas

Research output: Chapter in Book/Report/Conference proceeding › Chapter


Abstract

Creation and delivery of “RealVR” experiences essentially consist of four main steps: capture, processing, representation and rendering. In this chapter, we present, compare, and discuss two recent end-to-end approaches, Parallax360 by Luo et al. [9] and MegaParallax by Bertel et al. [3]. Both propose complete pipelines for RealVR content generation and novel-view synthesis with head-motion parallax for 360° environments.

Parallax360 uses a robotic arm to capture thousands of input views on the surface of a sphere. Based on precomputed disparity motion fields and pairwise optical flow, novel viewpoints are synthesized on the fly using flow-based blending of the nearest two to three input views, which provides compelling head-motion parallax.

MegaParallax proposes a pipeline for RealVR content generation and rendering that emphasizes casual, hand-held capturing. The approach introduces view-dependent flow-based blending to enable novel-view synthesis with head-motion parallax within a viewing area determined by the field of view of the input cameras and the capturing radius.
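As a rough illustration of the flow-based blending idea shared by both approaches, the sketch below (a minimal, hypothetical example using NumPy, not either paper's actual implementation) warps two neighboring input views partway toward an intermediate viewpoint along a precomputed flow field and blends the results with position-dependent weights:

```python
import numpy as np

def warp(image, flow, alpha):
    """Backward-warp `image` by a fraction `alpha` of the flow field
    (nearest-neighbour sampling; flow[y, x] = (dy, dx))."""
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    sy = np.clip(np.round(ys + alpha * flow[..., 0]).astype(int), 0, h - 1)
    sx = np.clip(np.round(xs + alpha * flow[..., 1]).astype(int), 0, w - 1)
    return image[sy, sx]

def blend_novel_view(img_a, flow_ab, img_b, flow_ba, t):
    """Synthesize a view at fractional position t in [0, 1] between
    views A and B: warp each input partway along its flow toward the
    novel viewpoint, then linearly blend the two warped images."""
    warped_a = warp(img_a, flow_ab, t)        # A moved toward B by t
    warped_b = warp(img_b, flow_ba, 1.0 - t)  # B moved toward A by 1 - t
    return (1.0 - t) * warped_a + t * warped_b
```

The actual methods differ in how the flows are obtained (dense disparity motion fields in Parallax360, pairwise optical flow with view-dependent weighting in MegaParallax) and use subpixel-accurate warping, but the core blend has this shape.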

We describe both methods, discuss their similarities and differences at corresponding steps of the RealVR pipeline, and show selected results. The chapter ends by discussing advantages and disadvantages, and by outlining the most important limitations and directions for future work.
Original language: English
Title of host publication: Real VR – Immersive Digital Reality
Subtitle of host publication: How to Import the Real World into Head-Mounted Immersive Displays
Editors: Marcus Magnor, Alexander Sorkine-Hornung
Publisher: Springer International Publishing
Chapter: 5
Pages: 109-131
ISBN (Electronic): 978-3-030-41816-8
ISBN (Print): 978-3-030-41815-1
DOI: 10.1007/978-3-030-41816-8_5
Publication status: Published - 3 Mar 2020

Publication series

Name: Lecture Notes in Computer Science
Volume: 11900

Cite this

Bertel, T., Xu, F., & Richardt, C. (2020). Image-Based Scene Representations for Head-Motion Parallax in 360° Panoramas. In M. Magnor, & A. Sorkine-Hornung (Eds.), Real VR – Immersive Digital Reality: How to Import the Real World into Head-Mounted Immersive Displays (pp. 109-131). (Lecture Notes in Computer Science; Vol. 11900). Springer International Publishing. https://doi.org/10.1007/978-3-030-41816-8_5