Improvements in both software and hardware, as well as an increase in consumer-suitable equipment, have resulted in great advances in the fields of virtual and augmented reality. Typically, systems use controllers or hand gestures to interact with virtual objects. However, these motions are often unnatural and diminish the immersion of the experience. Moreover, these approaches offer limited tactile feedback. There does not currently exist a platform to bring an arbitrary physical object into the virtual world without additional peripherals or the use of expensive motion capture systems. Such a system could be used for immersive experiences within the entertainment industry, as well as in VR or AR training experiences in the fields of health and engineering. We propose an end-to-end pipeline for creating an interactive virtual prop from rigid and non-rigid physical objects. This includes a novel method for tracking the deformations of rigid and non-rigid objects at interactive rates using a single RGBD camera. We scan the physical object and process the point cloud to produce a triangular mesh. A range of possible deformations is obtained using a finite element method simulation, and these are reduced to a low-dimensional basis using principal component analysis. Machine learning approaches, in particular neural networks, have become key tools in computer vision and have been used on a range of tasks. Moreover, there has been an increasing trend towards training networks on synthetic data. To this end, we use a convolutional neural network, trained on synthetic data, to track the movement and potential deformations of an object in unlabelled RGB images from a single RGBD camera. We demonstrate our results for several objects with different sizes and appearances.
|Title of host publication||International Symposium on Mixed and Augmented Reality|
|Place of Publication||U.S.A.|
|Publication status||Published - 18 Oct 2019|
|Name||International Symposium on Mixed and Augmented Reality (ISMAR)|
Project: Research council, 1/09/15 → 28/02/21