Abstract
Many VR animal embodiment systems suffer from poor animation fidelity, typically animating the animal avatars using inverse kinematics. We address this issue by presenting a novel deep-learning method, centred around a shared codebook, for mapping human motion to quadruped motion. Rather than trying to bridge the gap from human motion to quadruped motion directly, a task that has proven difficult, we first use a rule-based retargeter, relying on inverse and forward kinematics, to retarget human motions to an intermediate motion domain in which the motions share the same skeleton as the quadruped. We then use finite scalar quantization to construct a shared latent space, or codebook, between this intermediate domain and the quadruped motion domain: we pre-define a finite number of discrete latent codes and then teach these codes, using unsupervised deep learning, to represent semantically similar motions in the two domains. We incorporate our real-time human-to-quadruped motion mapping into a VR quadruped embodiment system. The output quadruped animations are natural and realistic, while also preserving the semantics of users' actions. Moreover, there is strong synchrony between the input human motions and retargeted quadruped motions, an important factor for inducing a strong sense of VR embodiment.
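To make the codebook step above concrete, the sketch below shows one common way finite scalar quantization can define a fixed grid of discrete codes: each latent channel is bounded and rounded to a small number of levels, with a straight-through estimator for gradients. This is a minimal sketch assuming PyTorch; the class name `FSQ`, the level choices, and the surrounding encoder/decoder wiring are illustrative assumptions rather than the authors' implementation.

```python
import torch
import torch.nn as nn


class FSQ(nn.Module):
    """Quantise each latent channel to a fixed, pre-defined set of levels.

    With levels = (7, 5, 5, 5) the implicit codebook has 7 * 5 * 5 * 5 = 875
    entries, fixed up front rather than learned as in VQ-VAE.
    """

    def __init__(self, levels=(7, 5, 5, 5)):
        super().__init__()
        half_width = (torch.tensor(levels, dtype=torch.float32) - 1) / 2
        self.register_buffer("half_width", half_width)

    def forward(self, z):
        # Bound each channel, then round to the nearest integer level; the
        # straight-through trick lets gradients bypass the rounding.
        z = torch.tanh(z) * self.half_width
        return z + (torch.round(z) - z).detach()


# Hypothetical usage: encoders for the intermediate (retargeted-human) motion
# and for quadruped motion would both map into this quantised space, and a
# quadruped decoder would reconstruct motion from the shared codes.
quantizer = FSQ()
z = torch.randn(1, 30, 4)   # e.g. a 30-frame latent sequence with 4 channels
codes = quantizer(z)        # every value lies on the 875-point code grid
```

Because the code grid is fixed in advance, an FSQ-style codebook needs no learned codebook vectors or commitment losses; encoders for the two motion domains can simply be trained so that semantically similar motions land on the same codes, matching the shared-codebook idea described in the abstract.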
Original language | English |
---|---|
Title of host publication | Proceedings, MIG 2024 - 17th ACM SIGGRAPH Conference on Motion, Interaction, and Games |
Editors | Stephen N. Spencer |
Place of Publication | U.S.A.
Publisher | Association for Computing Machinery |
Pages | 1-11 |
ISBN (Electronic) | 9798400710902 |
DOIs | |
Publication status | Published - 21 Nov 2024 |
Event | 17th ACM SIGGRAPH Conference on Motion, Interaction, and Games, MIG 2024 - Arlington, United States Duration: 21 Nov 2024 → 23 Nov 2024
Publication series
Name | Proceedings, MIG 2024 - 17th ACM SIGGRAPH Conference on Motion, Interaction, and Games |
---|
Conference
Conference | 17th ACM SIGGRAPH Conference on Motion, Interaction, and Games, MIG 2024 |
---|---|
Country/Territory | United States
City | Arlington |
Period | 21/11/24 → 23/11/24 |
Keywords
- Deep-learning
- Motion retargeting
- Quadruped embodiment
- VR embodiment
ASJC Scopus subject areas
- Computer Science Applications
- Human-Computer Interaction
- Education
Fingerprint
Dive into the research topics of 'Dog Code: Human to Quadruped Embodiment using Shared Codebooks'. Together they form a unique fingerprint.
Projects
Centre for the Analysis of Motion, Entertainment Research and Applications (CAMERA) - 2.0
Campbell, N. (PI), Cosker, D. (PI), Bilzon, J. (CoI), Campbell, N. (CoI), Cazzola, D. (CoI), Colyer, S. (CoI), Cosker, D. (CoI), Lutteroth, C. (CoI), McGuigan, P. (CoI), O'Neill, E. (CoI), Petrini, K. (CoI), Proulx, M. (CoI) & Yang, Y. (CoI)
Engineering and Physical Sciences Research Council
1/11/20 → 31/10/25
Project: Research council
Centre for the Analysis of Motion, Entertainment Research and Applications (CAMERA)
Cosker, D. (PI), Bilzon, J. (CoI), Campbell, N. (CoI), Cazzola, D. (CoI), Colyer, S. (CoI), Fincham Haines, T. (CoI), Hall, P. (CoI), Kim, K. I. (CoI), Lutteroth, C. (CoI), McGuigan, P. (CoI), O'Neill, E. (CoI), Richardt, C. (CoI), Salo, A. (CoI), Seminati, E. (CoI), Tabor, A. (CoI) & Yang, Y. (CoI)
Engineering and Physical Sciences Research Council
1/09/15 → 28/02/21
Project: Research council