MegaParallax: Casual 360° Panoramas with Motion Parallax
Abstract
The ubiquity of smart mobile devices, such as phones and tablets, enables users to casually capture 360° panoramas with a single camera sweep to share and relive experiences. However, panoramas lack motion parallax as they do not provide different views for different viewpoints. The motion parallax induced by translational head motion is a crucial depth cue in daily life. Alternatives, such as omnidirectional stereo panoramas, provide different views for each eye (binocular disparity), but they also lack motion parallax as the left and right eye panoramas are stitched statically. Methods based on explicit scene geometry reconstruct textured 3D geometry, which provides motion parallax, but suffers from visible reconstruction artefacts. The core of our method is a novel multi-perspective panorama representation, which can be casually captured and rendered with motion parallax for each eye on the fly. This provides a more realistic perception of panoramic environments which is particularly useful for virtual reality applications. Our approach uses a single consumer video camera to acquire 200–400 views of a real 360° environment with a single sweep. By using novel-view synthesis with flow-based blending, we show how to turn these input views into an enriched 360° panoramic experience that can be explored in real time, without relying on potentially unreliable reconstruction of scene geometry. We compare our results with existing omnidirectional stereo and image-based rendering methods to demonstrate the benefit of our approach, which is the first to enable casual consumers to capture and view high-quality 360° panoramas with motion parallax.
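The flow-based blending mentioned above can be illustrated with a minimal sketch: to synthesise a viewpoint between two neighbouring captured views, each view is warped toward the desired viewpoint along an optical-flow field and the two warped images are cross-faded. The function names (`warp`, `flow_blend`), the nearest-neighbour sampling, and the linear cross-fade are simplifying assumptions for illustration, not the paper's actual implementation:

```python
import numpy as np

def warp(img, flow, t):
    """Backward-warp a 2D image by a fraction t of the given flow field
    (flow[..., 0] = x displacement, flow[..., 1] = y displacement),
    using nearest-neighbour sampling with border clamping."""
    h, w = img.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    xs2 = np.clip(np.round(xs + t * flow[..., 0]).astype(int), 0, w - 1)
    ys2 = np.clip(np.round(ys + t * flow[..., 1]).astype(int), 0, h - 1)
    return img[ys2, xs2]

def flow_blend(left, right, flow_lr, flow_rl, alpha):
    """Synthesise an in-between view at blending weight alpha in [0, 1]:
    warp the left view forward by alpha of the left-to-right flow,
    warp the right view back by (1 - alpha) of the right-to-left flow,
    then cross-fade the two warped images."""
    warped_l = warp(left, flow_lr, alpha)
    warped_r = warp(right, flow_rl, 1.0 - alpha)
    return (1.0 - alpha) * warped_l + alpha * warped_r
```

At `alpha = 0` the result is exactly the left view and at `alpha = 1` exactly the right view, so sweeping `alpha` moves the virtual viewpoint smoothly between the two captured views without any explicit 3D reconstruction.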
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 665992
| Original language | English |
| --- | --- |
| Pages (from-to) | 1828–1835 |
| Number of pages | 8 |
| Journal | IEEE Transactions on Visualization and Computer Graphics |
| Volume | 25 |
| Issue number | 5 |
| Early online date | 25 Feb 2019 |
| DOIs | |
| Publication status | Published - 31 May 2019 |
| Event | IEEE Conference on Virtual Reality and 3D User Interfaces (26th), Osaka, Japan, 23 Mar 2019 → 27 Mar 2019, http://ieeevr.org/2019/ |
Keywords
- casual 360° scene capture
- plenoptic modeling
- image-based rendering
- novel-view synthesis
- virtual reality
Projects (3 finished)
- Fellowship - Towards Immersive 360° VR Video with Motion Parallax
Richardt, C. (PI)
Engineering and Physical Sciences Research Council
25/06/18 → 24/12/21
Project: Research council
- Fellow for Industrial Research Enhancement (FIRE)
Scott, J. L. (PI) & Yang, Y. (CoI)
1/10/15 → 30/03/21
Project: EU Commission
- Centre for the Analysis of Motion, Entertainment Research and Applications (CAMERA)
Cosker, D. (PI), Bilzon, J. (CoI), Campbell, N. (CoI), Cazzola, D. (CoI), Colyer, S. (CoI), Fincham Haines, T. (CoI), Hall, P. (CoI), Kim, K. I. (CoI), Lutteroth, C. (CoI), McGuigan, P. (CoI), O'Neill, E. (CoI), Richardt, C. (CoI), Salo, A. (CoI), Seminati, E. (CoI), Tabor, A. (CoI) & Yang, Y. (CoI)
Engineering and Physical Sciences Research Council
1/09/15 → 28/02/21
Project: Research council