Abstract
Immersive virtual reality (VR) experiences may track both a user's hands and a physical object at the same time and use this information to animate computer-generated representations of the two interacting. However, rendering the interaction without visual artefacts requires highly accurate tracking of the hands and the objects themselves, as well as their relative locations, which becomes even more difficult when the objects are articulated or deformable. If this tracking is incorrect, the quality and immersion of the visual experience is reduced. In this paper we turn the problem around: instead of trying to produce high-quality renders of hand-object interactions by improving tracking quality, we acknowledge that there will be tracking errors and focus on fixing the resulting visualisations. We propose a Deep Neural Network (DNN) that modifies hand pose based on the hand's position relative to the object. Training such a network requires sufficient labelled data, so we also present a new dataset of hand-object interactions, Ego-Interaction. This is the first hand-object interaction dataset with egocentric RGBD videos and 3D ground-truth data for both rigid and non-rigid objects. The Ego-Interaction dataset contains 92 sequences with 4 rigid, 1 articulated and 4 non-rigid objects, and demonstrates one- and two-handed hand-object interactions carefully captured, rigged and animated using motion capture. We provide our dataset as a general resource for researchers in the VR and AI communities interested in other hand-object and egocentric tracking problems.
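The abstract describes the pose-correction network only at a high level. As a rough illustration of the idea, and not the authors' published architecture, a minimal PyTorch sketch might take a noisy tracked hand pose together with the relative hand-object pose and predict a corrected hand pose; the 21-joint hand representation and 6-DoF relative pose input used below are assumptions made purely for this example.

```python
# Minimal sketch (not the authors' architecture): an MLP that takes a tracked
# hand pose plus the hand-object relative pose and predicts a corrected pose.
# Joint count and feature sizes are illustrative assumptions.
import torch
import torch.nn as nn

NUM_JOINTS = 21  # assumed per-hand joint count


class HandPoseCorrector(nn.Module):
    def __init__(self, hidden: int = 256):
        super().__init__()
        # Inputs: flattened 3D joint positions + 6-DoF relative object pose
        in_dim = NUM_JOINTS * 3 + 6
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, NUM_JOINTS * 3),
        )

    def forward(self, hand_joints: torch.Tensor, rel_object_pose: torch.Tensor) -> torch.Tensor:
        # hand_joints: (B, NUM_JOINTS, 3), rel_object_pose: (B, 6)
        x = torch.cat([hand_joints.flatten(1), rel_object_pose], dim=1)
        # Predict a residual so the network only learns the correction
        delta = self.net(x).view(-1, NUM_JOINTS, 3)
        return hand_joints + delta


if __name__ == "__main__":
    model = HandPoseCorrector()
    joints = torch.randn(4, NUM_JOINTS, 3)   # noisy tracked hand pose
    rel_pose = torch.randn(4, 6)             # relative hand-object pose
    corrected = model(joints, rel_pose)
    print(corrected.shape)                   # torch.Size([4, 21, 3])
```

Such a network could be trained with a supervised loss against ground-truth hand poses, which is the kind of labelled data the Ego-Interaction dataset is intended to provide.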
Original language | English |
---|---|
Title of host publication | Proceedings - MIG 2021 |
Subtitle of host publication | 14th ACM SIGGRAPH Conference on Motion, Interaction, and Games |
Editors | Stephen N. Spencer |
Place of Publication | U. S. A. |
Publisher | Association for Computing Machinery |
Pages | 1-8 |
Number of pages | 8 |
ISBN (Electronic) | 9781450391313 |
DOIs | |
Publication status | Published - 10 Nov 2021 |
Event | 14th ACM SIGGRAPH Conference on Motion, Interaction, and Games, MIG 2021 - Virtual, Online, Switzerland. Duration: 10 Nov 2021 → 12 Nov 2021 |
Publication series
Name | Proceedings - MIG 2021: 14th ACM SIGGRAPH Conference on Motion, Interaction, and Games |
---|---|
ISSN (Print) | 2376-1180 |
Conference
Conference | 14th ACM SIGGRAPH Conference on Motion, Interaction, and Games, MIG 2021 |
---|---|
Country/Territory | Switzerland |
City | Virtual, Online |
Period | 10/11/21 → 12/11/21 |
Keywords
- dataset
- hand pose
- non-rigid objects
- virtual reality
ASJC Scopus subject areas
- Computer Science Applications
- Human-Computer Interaction
- Education
Fingerprint
Dive into the research topics of 'Ego-Interaction: Visual Hand-Object Pose Correction for VR Experiences'. Together they form a unique fingerprint.
Projects
- Centre for the Analysis of Motion, Entertainment Research and Applications (CAMERA) - 2.0
Campbell, N. (PI), Cosker, D. (PI), Bilzon, J. (CoI), Campbell, N. (CoI), Cazzola, D. (CoI), Colyer, S. (CoI), Cosker, D. (CoI), Lutteroth, C. (CoI), McGuigan, P. (CoI), O'Neill, E. (CoI), Petrini, K. (CoI), Proulx, M. (CoI) & Yang, Y. (CoI)
Engineering and Physical Sciences Research Council
1/11/20 → 31/10/25
Project: Research council
- Centre for the Analysis of Motion, Entertainment Research and Applications (CAMERA)
Cosker, D. (PI), Bilzon, J. (CoI), Campbell, N. (CoI), Cazzola, D. (CoI), Colyer, S. (CoI), Fincham Haines, T. (CoI), Hall, P. (CoI), Kim, K. I. (CoI), Lutteroth, C. (CoI), McGuigan, P. (CoI), O'Neill, E. (CoI), Richardt, C. (CoI), Salo, A. (CoI), Seminati, E. (CoI), Tabor, A. (CoI) & Yang, Y. (CoI)
Engineering and Physical Sciences Research Council
1/09/15 → 28/02/21
Project: Research council