Transporting Real Objects into Virtual and Augmented Environments
Abstract
Despite the growing interest in virtual and augmented reality (VR/AR), few approaches exist for transporting a physical object into a virtual environment so that it can be used within a VR or AR experience. An external sensor can be attached to an object to capture its 3D position and orientation, but this offers no information about the object's non-rigid behaviour. Alternatively, sparse markers can be tracked to drive a rigged model; however, this approach is sensitive to changes in marker positions and to occlusions, and often involves costly non-standard hardware. To address these limitations, we propose an end-to-end pipeline for creating interactive virtual props from real-world physical objects. Within this pipeline we explore two methods for tracking physical objects. The first is a multi-camera RGB system that tracks the 3D centroids of the coloured parts of an object and then uses a feed-forward neural network to infer deformations from these centroids. The second is a single-RGBD-camera approach using VRProp-Net, a custom convolutional neural network designed for tracking rigid and non-rigid objects in unlabelled RGB images. We find that both approaches have advantages and disadvantages: while frame rates are similar, the multi-view system offers a larger tracking volume, whereas the single-camera approach is more portable, requires no calibration, and predicts the deformation parameters more accurately.
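The first tracking method described above can be sketched as a small feed-forward network that maps the tracked 3D centroids of an object's coloured parts to deformation parameters. The sketch below is a minimal illustration only: the number of parts, the number of deformation parameters, the layer sizes, and the use of plain NumPy with random weights are all assumptions, not the paper's actual architecture or trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

N_PARTS = 8    # coloured parts tracked per object (assumed)
N_DEFORM = 16  # deformation parameters driving the rigged model (assumed)
HIDDEN = 64    # hidden-layer width (assumed)

# Randomly initialised weights stand in for a trained model.
W1 = rng.standard_normal((N_PARTS * 3, HIDDEN)) * 0.1
b1 = np.zeros(HIDDEN)
W2 = rng.standard_normal((HIDDEN, N_DEFORM)) * 0.1
b2 = np.zeros(N_DEFORM)

def predict_deformation(centroids):
    """Map an (N_PARTS, 3) array of 3D part centroids to deformation params."""
    x = centroids.reshape(-1)         # flatten to a 3 * N_PARTS input vector
    h = np.maximum(0.0, x @ W1 + b1)  # ReLU hidden layer
    return h @ W2 + b2                # linear output: deformation parameters

# One frame of triangulated part centroids (stand-in values).
centroids = rng.standard_normal((N_PARTS, 3))
params = predict_deformation(centroids)
print(params.shape)  # (16,)
```

In practice such a regressor would be trained on pairs of tracked centroids and ground-truth deformation parameters; at runtime, each frame's centroids yield one set of parameters to pose the virtual prop.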
| Original language | English |
| --- | --- |
| Title of host publication | ACM Symposium on Computer Animation |
| Publication status | Unpublished - 2019 |
Projects (1 finished)
- Centre for the Analysis of Motion, Entertainment Research and Applications (CAMERA)
Cosker, D., Campbell, N., Fincham Haines, T., Hall, P., Kim, K. I., Lutteroth, C., O'Neill, E., Richardt, C. & Yang, Y.
Engineering and Physical Sciences Research Council
1/09/15 → 28/02/21
Project: Research council