Free-Viewpoint Facial Re-Enactment from a Casual Capture

Srinivas Rao, Rodrigo Ortiz-Cayon, Matteo Munaro, Aidas Liaudanskas, Krunal Chande, Tobias Bertel, Christian Richardt, Alexander J. B. Trevor, Stefan Holzer, Abhishek Kar

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

We propose a system for free-viewpoint facial re-enactment from a casual video capture of a target subject. Our system renders and re-enacts the subject consistently across all captured views, and further enables interactive free-viewpoint facial re-enactment of the target from novel views. The re-enactment of the target subject is driven by an expression sequence of a source subject, captured using a custom app running on an iPhone X. Our system handles large pose variations in the target subject while keeping the re-enactment consistent. We demonstrate the efficacy of our system through various applications.


This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 665992.
Original language: English
Title of host publication: SA '20 Posters: SIGGRAPH Asia 2020 Posters
Publisher: Association for Computing Machinery
Pages: 1-2
Number of pages: 2
DOIs
Publication status: Published - 31 Dec 2020
Event: ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia - Online, Korea, Republic of
Duration: 4 Dec 2020 - 13 Dec 2020
Conference number: 13
https://sa2020.siggraph.org/en/

Conference

Conference: ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia
Abbreviated title: SIGGRAPH Asia
Country/Territory: Korea, Republic of
City: Online
Period: 4/12/20 - 13/12/20
Internet address: https://sa2020.siggraph.org/en/

Keywords

  • facial reenactment
  • neural network
  • image-based rendering

