Fellowship - Towards Immersive 360° VR Video with Motion Parallax

  • Richardt, Christian (PI)

Project: Research council

Project Details

Description

The goal of my Innovation Fellowship is to create a new form of immersive 360-degree VR video. We are massive consumers of visual information, and as new forms of visual media and immersive technologies are emerging, I want to work towards my vision of making people feel truly immersed in this new form of video content. Imagine, for instance, what it would be like to experience the International Space Station as if you were there - without leaving the comfort of your own home.

The Problem:
To feel truly immersed in virtual reality, one needs to be able to look around freely within a virtual environment and see it from the viewpoints of one's own eyes. Immersion requires 'freedom of motion' in six degrees of freedom ('6-DoF'), so that viewers see the correct views of an environment. As viewers move their heads, the objects they see should move relative to each other, at different speeds depending on their distance from the viewer. This is called motion parallax.
Viewers need to perceive correct motion parallax regardless of where they are (3 DoF) and where they are looking (+3 DoF). Currently, only computer-generated imagery (CGI) fully supports 6-DoF content with motion parallax, but it remains extremely challenging to match the visual realism of the real world with computer graphics models. Viewers therefore either lose photorealism (with CGI) or immersion (with existing VR video). To date, it is not possible to capture or view high-quality 6-DoF VR video of the real world.
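To make the parallax cue concrete, here is a minimal numerical sketch (an illustration only, not part of the project): for the same sideways head movement, a nearby object shifts through a much larger visual angle than a distant one, which is exactly the cue that flat 360-degree video cannot reproduce.

```python
import math

def angular_parallax(depth_m, baseline_m=0.1):
    """Angular shift (in degrees) of a point at depth_m metres when the
    viewer's head translates sideways by baseline_m metres."""
    return math.degrees(math.atan2(baseline_m, depth_m))

# A 10 cm head movement shifts a nearby object far more than a distant one:
near = angular_parallax(0.5)   # object 0.5 m away -> about 11.3 degrees
far = angular_parallax(10.0)   # object 10 m away  -> about 0.57 degrees
```

The roughly 20:1 ratio between these shifts is what the visual system reads as depth; conventional 360-degree video renders both objects with zero shift, so the cue is lost.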

My Goal:
Virtual reality is a new kind of medium that requires new ways to author content. My goal is therefore to create a new form of immersive 360-degree VR video that overcomes the limitations of existing 360-degree VR video. This new form of VR content - 6-DoF VR video - will achieve unparalleled realism and immersion by providing freedom of head motion and motion parallax, which is a vital depth cue for the human visual system and entirely missing from existing 360-degree VR video.
Specifically, the aim of this Fellowship is to accurately and comprehensively capture real-world environments, including visual dynamics such as people and moving animals or plants, and to reproduce the captured environments and their dynamics in VR with photographic realism, correct motion parallax and overall depth perception. 6-DoF VR video is a significant new virtual reality capability that will be a major step forward for overall immersion, realism and quality of experience.

My Approach:
To achieve 6-DoF VR video that enables photorealistic exploration of dynamic real environments in 360-degree virtual reality, my group and I will develop novel video-based capture, 3D reconstruction and rendering techniques. We will first explore different approaches for capturing static and, more challengingly, dynamic 360-degree environments, including 360-degree cameras and multi-camera rigs. We will next reconstruct the 3D geometry of the environments from the captured imagery by extending multi-view geometry/photogrammetry techniques to handle dynamic 360-degree environments. Finally, extending image-based rendering to 360-degree environments will enable 6-DoF motion within a photorealistic 360-degree environment with high visual fidelity, covering all possible viewing directions. We will first target 6-DoF 360-degree VR photographs (i.e. static scenes) and then extend our approach to 6-DoF VR videos.
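As a rough illustration of the rendering step (a sketch under simple assumptions, not the project's actual method): once per-pixel depth is known for an equirectangular panorama, each pixel can be lifted to a 3D point and reprojected for a translated head position, which is the basic mechanism that produces motion parallax from 360-degree imagery.

```python
import numpy as np

def equirect_to_ray(u, v, width, height):
    """Unit ray direction for equirectangular pixel (u, v)."""
    lon = (u / width) * 2.0 * np.pi - np.pi      # longitude in [-pi, pi)
    lat = np.pi / 2.0 - (v / height) * np.pi     # latitude in [-pi/2, pi/2]
    return np.array([np.cos(lat) * np.sin(lon),
                     np.sin(lat),
                     np.cos(lat) * np.cos(lon)])

def reproject(u, v, depth, head_offset, width=4096, height=2048):
    """Warp one panorama pixel with known depth (metres) to a viewpoint
    translated by head_offset, returning new equirectangular coordinates."""
    point = equirect_to_ray(u, v, width, height) * depth  # lift to 3D
    d = point - head_offset                               # ray from new viewpoint
    d = d / np.linalg.norm(d)
    lon = np.arctan2(d[0], d[2])                          # back to lat/long
    lat = np.arcsin(d[1])
    u2 = (lon + np.pi) / (2.0 * np.pi) * width
    v2 = (np.pi / 2.0 - lat) / np.pi * height
    return u2, v2
```

With zero head offset the warp is the identity; a sideways offset shifts nearby pixels further than distant ones, reproducing parallax. A real system must additionally handle disocclusions (regions revealed by head motion that no single panorama observed), which is one reason multi-camera capture and full 3D reconstruction are needed.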

Project partners:
This Fellowship is supported by the following project partners in the UK and abroad: Foundry (London) is a leading developer of visual effects software for film, video and VR post-production, and ideally suited to advise on industrial impact. REWIND (St Albans) is a leading creative VR production company that is keen to experiment with 6-DoF VR video. Reality7 (Hamburg, Germany) is a start-up working on cinematic VR video.
Status: Finished
Effective start/end date: 25/06/18 – 24/12/21

Funding

  • Engineering and Physical Sciences Research Council

UN Sustainable Development Goals

In 2015, UN member states agreed to 17 global Sustainable Development Goals (SDGs) to end poverty, protect the planet and ensure prosperity for all. This project contributes towards the following SDG(s):

  • SDG 3 - Good Health and Well-being

Research outputs

  • 360MonoDepth: High-Resolution 360° Monocular Depth Estimation

Rey-Area, M., Yuan, M. & Richardt, C., 27 Sept 2022, Proceedings of the 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2022). IEEE, p. 3752-3762.

Research output: Chapter in a published conference proceeding

Open Access · 46 Citations (SciVal) · 242 Downloads (Pure)
  • Egocentric scene reconstruction from an omnidirectional video

Jang, H., Meuleman, A., Kang, D., Kim, D., Richardt, C. & Kim, M. H., 31 Jul 2022, In: ACM Transactions on Graphics, 41(4), article 100, p. 1-12.

Research output: Journal article (peer-reviewed)

Open Access · 9 Citations (SciVal)
  • 360° Optical Flow using Tangent Images

Yuan, M. & Richardt, C., 22 Nov 2021, British Machine Vision Conference (BMVC 2021). 14 p.

Research output: Chapter in a published conference proceeding

Open Access · 211 Downloads (Pure)