Ego-Interaction: Visual Hand-Object Pose Correction for VR Experiences

Catherine Taylor, Murray Evans, Eleanor Crellin, Martin Parsons, Darren Cosker

Research output: Chapter in a published conference proceeding



Immersive virtual reality (VR) experiences may track both a user's hands and a physical object simultaneously and use this information to animate computer-generated representations of the two interacting. However, rendering the interaction without visual artefacts requires highly accurate tracking of the hands and the objects, as well as their relative positions - a task made even more difficult when the objects are articulated or deformable. If this tracking is incorrect, the quality and immersion of the visual experience are reduced. In this paper we turn the problem around: instead of improving the rendering of hand-object interactions by improving tracking quality, we acknowledge that tracking errors will occur and focus solely on fixing the resulting visualisations. We propose a Deep Neural Network (DNN) that modifies hand pose based on the hand's position relative to the object. Training the network, however, requires sufficient labelled data. We therefore also present a new dataset of hand-object interactions - Ego-Interaction. This is the first hand-object interaction dataset with egocentric RGBD videos and 3D ground-truth data for both rigid and non-rigid objects. The Ego-Interaction dataset contains 92 sequences with 4 rigid, 1 articulated and 4 non-rigid objects, and demonstrates one- and two-handed hand-object interactions carefully captured, rigged and animated using motion capture. We provide our dataset as a general resource for researchers in the VR and AI communities interested in other hand-object and egocentric tracking problems.

Original language: English
Title of host publication: Proceedings - MIG 2021
Subtitle of host publication: 14th ACM SIGGRAPH Conference on Motion, Interaction, and Games
Editors: Stephen N. Spencer
Place of Publication: U.S.A.
Publisher: Association for Computing Machinery
ISBN (Electronic): 9781450391313
Publication status: Published - 10 Nov 2021
Event: 14th ACM SIGGRAPH Conference on Motion, Interaction, and Games, MIG 2021 - Virtual, Online, Switzerland
Duration: 10 Nov 2021 - 12 Nov 2021

Publication series

Name: Proceedings - MIG 2021: 14th ACM SIGGRAPH Conference on Motion, Interaction, and Games
ISSN (Print): 2376-1180


Conference: 14th ACM SIGGRAPH Conference on Motion, Interaction, and Games, MIG 2021
City: Virtual, Online


Keywords

  • dataset
  • hand pose
  • non-rigid objects
  • virtual reality

ASJC Scopus subject areas

  • Computer Science Applications
  • Human-Computer Interaction
  • Education


