360° 3D Photos from a Single 360° Input Image

Manuel Rey-Area, Christian Richardt

Research output: Contribution to journal › Article › peer-review

Abstract

360° images are a popular medium for bringing photography into virtual reality. While users can look in any direction by rotating their heads, 360° images ultimately look flat because they lack depth information and thus cannot create motion parallax when the head translates. To achieve a fully immersive VR experience from a single 360° image, we introduce a novel method to upgrade 360° images to free-viewpoint renderings with 6 degrees of freedom. Alternative approaches either reconstruct textured 3D geometry, which is fast to render but suffers from visible reconstruction artifacts, or use neural radiance fields, which produce high-quality novel views but render too slowly for VR applications. Our 360° 3D photos build on 3D Gaussian splatting as the underlying scene representation to simultaneously achieve high visual quality and real-time rendering speed. To fill in plausible content in previously unseen regions, we introduce a novel combination of latent diffusion inpainting and monocular depth estimation with Poisson-based blending. Our results demonstrate state-of-the-art visual and depth quality at rendering rates of 105 FPS per megapixel on a commodity GPU.
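The abstract mentions Poisson-based blending for merging inpainted content (e.g. estimated depth in disoccluded regions) with known values: the blended result keeps the gradients of the inpainted source while matching the known values at the region boundary. The following is a minimal 1D sketch of that general idea, not the authors' implementation; the function name and the Gauss–Seidel solver are illustrative assumptions.

```python
def poisson_blend_1d(source, target, n_iter=2000):
    """Blend `source` into `target` over the interior indices 1..n-2,
    preserving source gradients (its discrete Laplacian) while matching
    target values at the two boundary samples. Illustrative sketch only."""
    x = list(target)  # boundary values x[0], x[-1] stay fixed to target
    n = len(source)
    for _ in range(n_iter):  # simple Gauss-Seidel iterations
        for i in range(1, n - 1):
            # enforce: x[i-1] - 2*x[i] + x[i+1] == Laplacian of source at i
            lap = source[i - 1] - 2 * source[i] + source[i + 1]
            x[i] = 0.5 * (x[i - 1] + x[i + 1] - lap)
    return x

# Source with unit slope, target boundaries offset by +10:
blended = poisson_blend_1d([0, 1, 2, 3, 4], [10, 0, 0, 0, 14])
# converges to [10, 11, 12, 13, 14]: source gradients, target boundaries
```

In 2D (images or depth maps), the same linear system is solved over the inpainting mask, which removes visible seams between inpainted and observed regions.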
Original language: English
Journal: IEEE Transactions on Visualization and Computer Graphics
Early online date: 18 Mar 2025
DOIs
Publication status: E-pub ahead of print - 18 Mar 2025

Acknowledgements

We would like to thank Wenbin Li for his support.

Keywords

  • novel-view synthesis
  • inpainting
  • real time

