Parallax360: Stereoscopic 360° Scene Representation for Head-Motion Parallax

Bicheng Luo, Feng Xu, Christian Richardt, Jun-Hai Yong

Research output: Contribution to journal › Article

  • 2 Citations

Abstract

We propose a novel 360° scene representation for converting real scenes into stereoscopic 3D virtual reality content with head-motion parallax. Our image-based scene representation enables efficient synthesis of novel views with six degrees-of-freedom (6-DoF) by fusing motion fields at two scales: (1) disparity motion fields carry implicit depth information and are robustly estimated from multiple laterally displaced auxiliary viewpoints, and (2) pairwise motion fields enable real-time flow-based blending, which improves the visual fidelity of results by minimizing ghosting and view transition artifacts. Based on our scene representation, we present an end-to-end system that captures real scenes with a robotic camera arm, processes the recorded data, and finally renders the scene in a head-mounted display in real time (more than 40 Hz). Our approach is the first to support head-motion parallax when viewing real 360° scenes. We demonstrate compelling results that illustrate the enhanced visual experience – and hence sense of immersion – achieved with our approach compared to widely-used stereoscopic panoramas.
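The flow-based blending step described in the abstract can be illustrated with a short sketch. The following NumPy code is not part of the published record: the function names, the backward-warping approximation, and the linear-motion assumption are illustrative choices made here, not the authors' implementation. It synthesizes an in-between view from two neighbouring captured images and their pairwise motion field by warping both images toward the intermediate viewpoint and linearly blending them, which is the general idea behind reducing ghosting relative to a plain cross-fade.

    # Minimal sketch of flow-based blending between two nearby captured views.
    # Hypothetical names; nearest-neighbour sampling keeps it dependency-free,
    # whereas a real-time renderer would use bilinear filtering on the GPU.
    import numpy as np

    def warp_backward(image, flow):
        """Sample `image` at each pixel displaced by `flow` (H x W x 2, (dx, dy))."""
        h, w = image.shape[:2]
        ys, xs = np.mgrid[0:h, 0:w]
        src_x = np.clip(np.round(xs + flow[..., 0]).astype(int), 0, w - 1)
        src_y = np.clip(np.round(ys + flow[..., 1]).astype(int), 0, h - 1)
        return image[src_y, src_x]

    def flow_based_blend(img_a, img_b, flow_ab, alpha):
        """Blend an intermediate view at fraction `alpha` (0 -> img_a, 1 -> img_b).

        flow_ab maps pixels of img_a to their corresponding positions in img_b.
        Assuming roughly linear motion, the intermediate view sees img_a shifted
        by alpha * flow and img_b shifted back by (1 - alpha) * flow.
        """
        warped_a = warp_backward(img_a, -alpha * flow_ab)          # pull A forward
        warped_b = warp_backward(img_b, (1.0 - alpha) * flow_ab)   # pull B backward
        return (1.0 - alpha) * warped_a + alpha * warped_b

    # Usage example with synthetic float images; zero flow degenerates to a cross-fade.
    h, w = 4, 5
    img_a = np.random.rand(h, w, 3)
    img_b = np.random.rand(h, w, 3)
    flow_ab = np.zeros((h, w, 2))
    mid = flow_based_blend(img_a, img_b, flow_ab, alpha=0.5)

This only covers the pairwise motion fields; the disparity motion fields mentioned in the abstract would additionally supply the implicit depth needed to choose and weight the neighbouring views for full 6-DoF synthesis.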

Fingerprint

  • Virtual reality
  • Robotics
  • Cameras
  • Display devices

Keywords

  • 360° scene capture
  • scene representation
  • head-motion parallax
  • 6 degrees-of-freedom (6-DoF)
  • image-based rendering

Cite this

Parallax360: Stereoscopic 360° Scene Representation for Head-Motion Parallax. / Luo, Bicheng; Xu, Feng; Richardt, Christian; Yong, Jun-Hai.

In: IEEE Transactions on Visualization and Computer Graphics, Vol. 24, No. 4, 17.01.2018, p. 1545-1553.

Research output: Contribution to journal › Article

@article{f509e898b8e7423c803aa5463101456b,
title = "Parallax360: Stereoscopic 360° Scene Representation for Head-Motion Parallax",
abstract = "We propose a novel 360° scene representation for converting real scenes into stereoscopic 3D virtual reality content with head-motion parallax. Our image-based scene representation enables efficient synthesis of novel views with six degrees-of-freedom (6-DoF) by fusing motion fields at two scales: (1) disparity motion fields carry implicit depth information and are robustly estimated from multiple laterally displaced auxiliary viewpoints, and (2) pairwise motion fields enable real-time flow-based blending, which improves the visual fidelity of results by minimizing ghosting and view transition artifacts. Based on our scene representation, we present an end-to-end system that captures real scenes with a robotic camera arm, processes the recorded data, and finally renders the scene in a head-mounted display in real time (more than 40 Hz). Our approach is the first to support head-motion parallax when viewing real 360° scenes. We demonstrate compelling results that illustrate the enhanced visual experience – and hence sense of immersion – achieved with our approach compared to widely-used stereoscopic panoramas.",
keywords = "360° scene capture, scene representation, head-motion parallax, 6 degrees-of-freedom (6-DoF), image-based rendering",
author = "Bicheng Luo and Feng Xu and Christian Richardt and Jun-Hai Yong",
year = "2018",
month = "1",
day = "17",
doi = "10.1109/TVCG.2018.2794071",
language = "English",
volume = "24",
pages = "1545--1553",
journal = "IEEE Transactions on Visualization and Computer Graphics",
issn = "1077-2626",
publisher = "IEEE",
number = "4",

}

TY - JOUR

T1 - Parallax360: Stereoscopic 360° Scene Representation for Head-Motion Parallax

AU - Luo, Bicheng

AU - Xu, Feng

AU - Richardt, Christian

AU - Yong, Jun-Hai

PY - 2018/1/17

Y1 - 2018/1/17

N2 - We propose a novel 360° scene representation for converting real scenes into stereoscopic 3D virtual reality content with head-motion parallax. Our image-based scene representation enables efficient synthesis of novel views with six degrees-of-freedom (6-DoF) by fusing motion fields at two scales: (1) disparity motion fields carry implicit depth information and are robustly estimated from multiple laterally displaced auxiliary viewpoints, and (2) pairwise motion fields enable real-time flow-based blending, which improves the visual fidelity of results by minimizing ghosting and view transition artifacts. Based on our scene representation, we present an end-to-end system that captures real scenes with a robotic camera arm, processes the recorded data, and finally renders the scene in a head-mounted display in real time (more than 40 Hz). Our approach is the first to support head-motion parallax when viewing real 360° scenes. We demonstrate compelling results that illustrate the enhanced visual experience – and hence sense of immersion – achieved with our approach compared to widely-used stereoscopic panoramas.

AB - We propose a novel 360° scene representation for converting real scenes into stereoscopic 3D virtual reality content with head-motion parallax. Our image-based scene representation enables efficient synthesis of novel views with six degrees-of-freedom (6-DoF) by fusing motion fields at two scales: (1) disparity motion fields carry implicit depth information and are robustly estimated from multiple laterally displaced auxiliary viewpoints, and (2) pairwise motion fields enable real-time flow-based blending, which improves the visual fidelity of results by minimizing ghosting and view transition artifacts. Based on our scene representation, we present an end-to-end system that captures real scenes with a robotic camera arm, processes the recorded data, and finally renders the scene in a head-mounted display in real time (more than 40 Hz). Our approach is the first to support head-motion parallax when viewing real 360° scenes. We demonstrate compelling results that illustrate the enhanced visual experience – and hence sense of immersion – achieved with our approach compared to widely-used stereoscopic panoramas.

KW - 360° scene capture

KW - scene representation

KW - head-motion parallax

KW - 6 degrees-of-freedom (6-DoF)

KW - image-based rendering

U2 - 10.1109/TVCG.2018.2794071

DO - 10.1109/TVCG.2018.2794071

M3 - Article

VL - 24

SP - 1545

EP - 1553

JO - IEEE Transactions on Visualization and Computer Graphics

T2 - IEEE Transactions on Visualization and Computer Graphics

JF - IEEE Transactions on Visualization and Computer Graphics

SN - 1077-2626

IS - 4

ER -