OmniMR: Omnidirectional Mixed Reality with Spatially-Varying Environment Reflections from Moving 360° Video Cameras

Joanna Tarko, James Tompkin, Christian Richardt

Research output: Contribution to conference › Poster › peer-review

8 Citations (SciVal)

Abstract

We propose a new approach for creating omnidirectional mixed reality (OmniMR) from moving-camera 360° video. To insert virtual computer-generated elements into a moving-camera 360° video, we reconstruct camera motion and sparse scene content via structure from motion on stitched equirectangular video (the default output format of current 360° cameras). Then, to plausibly reproduce real-world lighting conditions for these inserted elements, we employ inverse tone mapping to recover high dynamic range environment maps which vary spatially along the camera path. We implement our approach into the Unity rendering engine for real-time object rendering with dynamic lighting and user interaction. This expands the use and flexibility of 360° video for mixed reality.
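The inverse tone mapping step described above could be sketched, under simplified assumptions, as follows. This is a minimal illustration, not the authors' implementation: it assumes a fixed display gamma of 2.2 and a heuristic linear boost of near-saturated pixels to re-expand clipped highlights; the paper's actual recovery method may differ.

```python
import numpy as np

def inverse_tone_map(ldr, gamma=2.2, highlight_scale=8.0, threshold=0.9):
    """Recover an approximate HDR environment map from an LDR
    equirectangular frame with float values in [0, 1].

    Pixels are first linearized by inverting the display gamma; values
    near the saturation point are then boosted to approximate the
    dynamic range lost to clipping. All parameters are illustrative.
    """
    ldr = np.asarray(ldr, dtype=np.float64)
    linear = np.power(ldr, gamma)  # undo display gamma (linearize)
    # Smooth 0..1 mask that is non-zero only for near-saturated pixels.
    mask = np.clip((ldr - threshold) / (1.0 - threshold), 0.0, 1.0)
    # Scale saturated regions up to highlight_scale times their linear value.
    return linear * (1.0 + mask * (highlight_scale - 1.0))
```

Sampling such a map along the reconstructed camera path then yields the spatially-varying lighting used when shading inserted objects.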
Original language: English
Pages: 1177-1178
Number of pages: 2
DOIs
Publication status: Published - 23 Mar 2019
Event: IEEE Conference on Virtual Reality and 3D User Interfaces - Osaka, Japan
Duration: 23 Mar 2019 – 27 Mar 2019
Conference number: 26
http://ieeevr.org/2019/

Conference

Conference: IEEE Conference on Virtual Reality and 3D User Interfaces
Abbreviated title: IEEE VR
Country/Territory: Japan
City: Osaka
Period: 23/03/19 – 27/03/19
Internet address: http://ieeevr.org/2019/

Keywords

  • 3D reconstruction
  • Environment maps
  • Lighting estimation
  • Mixed reality
  • Omnidirectional video
  • Structure from motion

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Media Technology
