OmniMR: Omnidirectional Mixed Reality with Spatially-Varying Environment Reflections from Moving 360° Video Cameras

Research output: Contribution to conference › Poster

Abstract

We propose a new approach for creating omnidirectional mixed reality (OmniMR) from moving-camera 360° video. To insert virtual computer-generated elements into a moving-camera 360° video, we reconstruct camera motion and sparse scene content via structure from motion on stitched equirectangular video (the default output format of current 360° cameras). Then, to plausibly reproduce real-world lighting conditions for these inserted elements, we employ inverse tone mapping to recover high dynamic range environment maps that vary spatially along the camera path. We implement our approach in the Unity rendering engine for real-time object rendering with dynamic lighting and user interaction. This expands the use and flexibility of 360° video for mixed reality.
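The abstract's lighting step expands low dynamic range video frames into HDR environment maps via inverse tone mapping. The paper's specific operator is not given here; the sketch below shows one common simple heuristic under assumed parameters: invert a display gamma curve to get linear radiance, then boost pixels near saturation, where the true scene radiance was clipped. The function name, `gamma`, and `scale` values are illustrative assumptions, not the authors' method.

```python
import numpy as np

def inverse_tone_map(ldr, gamma=2.2, scale=4.0):
    """Expand an LDR frame (RGB values in [0, 1]) to a linear HDR estimate.

    A minimal sketch, not the paper's operator: undo display gamma,
    then smoothly boost near-saturated pixels whose radiance was
    clipped by the camera, up to `scale` times the LDR maximum.
    """
    linear = np.power(ldr, gamma)                      # undo display gamma
    luminance = ldr.max(axis=-1, keepdims=True)        # per-pixel brightness proxy
    # Ramp the boost from 1x (luminance <= 0.8) to `scale`x (fully saturated).
    t = np.clip((luminance - 0.8) / 0.2, 0.0, 1.0)
    boost = 1.0 + (scale - 1.0) * t ** 2
    return linear * boost
```

Applied per frame, the results could populate a spatially varying set of environment maps along the reconstructed camera path, e.g. as HDR cubemaps fed to Unity reflection probes.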

Conference

Conference: IEEE Conference on Virtual Reality and 3D User Interfaces
Abbreviated title: IEEE VR
Country: Japan
City: Osaka
Period: 23/03/19 – 27/03/19
Internet address: http://ieeevr.org/2019/

Cite this

Tarko, J., Tompkin, J., & Richardt, C. (Accepted/In press). OmniMR: Omnidirectional Mixed Reality with Spatially-Varying Environment Reflections from Moving 360° Video Cameras. Poster session presented at IEEE Conference on Virtual Reality and 3D User Interfaces, Osaka, Japan.
