Casual Real-World VR using Light Fields

Yusuke Tomoto, Srinivas Rao, Tobias Bertel, Krunal Chande, Christian Richardt, Stefan Holzer, Rodrigo Ortiz-Cayon

Research output: Chapter in a published conference proceeding



Virtual reality (VR) would benefit from more end-to-end systems centered on a casual capturing procedure, high-quality visual results, and representations that are viewable on multiple platforms. We present an end-to-end system designed for casual creation of real-world VR content using a smartphone. We use an AR app to capture a linear light field of a real-world object by recording a video sweep around the object. We predict multiplane images for a subset of input viewpoints, from which we extract high-quality textured geometry that is used for real-time image-based rendering suitable for VR. The round-trip time of our system, from guided capture to interactive display, is typically 1–2 minutes per scene.
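The multiplane-image (MPI) representation mentioned above is rendered by back-to-front alpha ("over") compositing of fronto-parallel RGBA layers. A minimal sketch of that compositing step, assuming NumPy arrays; the function name and array layout are illustrative, not the authors' implementation:

```python
import numpy as np

def composite_mpi(planes):
    """Composite an MPI layer stack into a single RGB image.

    planes: array of shape (D, H, W, 4) holding RGBA layers ordered
    back (index 0) to front (index D-1), with alpha in [0, 1].
    Returns the composited (H, W, 3) RGB image.
    """
    out = np.zeros(planes.shape[1:3] + (3,))
    for rgba in planes:  # back-to-front "over" operator
        rgb, alpha = rgba[..., :3], rgba[..., 3:4]
        out = rgb * alpha + out * (1.0 - alpha)
    return out

# Example: an opaque red back plane under a half-transparent green
# front plane blends to equal parts red and green.
back = np.zeros((1, 1, 4)); back[..., 0] = 1.0; back[..., 3] = 1.0
front = np.zeros((1, 1, 4)); front[..., 1] = 1.0; front[..., 3] = 0.5
image = composite_mpi(np.stack([back, front]))  # → [[[0.5, 0.5, 0.0]]]
```

In the full pipeline described in the abstract, such per-view MPIs are predicted by a network and then converted to textured geometry for real-time image-based rendering; this sketch covers only the compositing math.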

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 665992.
Original language: English
Title of host publication: SA '20 Posters: SIGGRAPH Asia 2020 Posters
Publisher: Association for Computing Machinery
Number of pages: 2
Publication status: Published - 31 Dec 2020
Event: ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia - Online, Korea, Republic of
Duration: 4 Dec 2020 – 13 Dec 2020
Conference number: 13


Conference: ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia
Abbreviated title: SIGGRAPH Asia
Country/Territory: Korea, Republic of


  • virtual reality
  • VR photography
  • view synthesis
  • multiplane images


