VR Props: An End-to-End Pipeline for Transporting Real Objects into Virtual and Augmented Environment

Catherine Taylor, Chris Mullanay, Robin McNicholas, Darren Cosker

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

Improvements in both software and hardware, as well as an increase in consumer-suitable equipment, have resulted in great advances in the fields of virtual and augmented reality. Typically, systems use controllers or hand gestures to interact with virtual objects. However, these motions are often unnatural and diminish the immersion of the experience. Moreover, these approaches offer limited tactile feedback. There does not currently exist a platform to bring an arbitrary physical object into the virtual world without additional peripherals or the use of expensive motion capture systems. Such a system could be used for immersive experiences within the entertainment industry as well as being applied to VR or AR training experiences in the fields of health and engineering. We propose an end-to-end pipeline for creating an interactive virtual prop from rigid and non-rigid physical objects. This includes a novel method for tracking the deformations of rigid and non-rigid objects at interactive rates using a single RGBD camera. We scan our physical object and process the point cloud to produce a triangular mesh. A range of possible deformations can be obtained by using a finite element method simulation, and these are reduced to a low-dimensional basis using principal component analysis. Machine learning approaches, in particular neural networks, have become key tools in computer vision and have been used on a range of tasks. Moreover, there has been an increased trend in training networks on synthetic data. To this end, we use a convolutional neural network, trained on synthetic data, to track the movement and potential deformations of an object in unlabelled RGB images from a single RGBD camera. We demonstrate our results for several objects with different sizes and appearances.
Original language: English
Title of host publication: International Symposium on Mixed and Augmented Reality
Publisher: IEEE
Publication status: Accepted/In press - 1 Jun 2019

Publication series

Name: Mixed and Augmented Reality (ISMAR), International Symposium on
Publisher: IEEE
ISSN (Print): 2473-0726

Cite this

Taylor, C., Mullanay, C., McNicholas, R., & Cosker, D. (Accepted/In press). VR Props: An End-to-End Pipeline for Transporting Real Objects into Virtual and Augmented Environment. In International Symposium on Mixed and Augmented Reality (Mixed and Augmented Reality (ISMAR), International Symposium on). IEEE.

VR Props: An End-to-End Pipeline for Transporting Real Objects into Virtual and Augmented Environment. / Taylor, Catherine; Mullanay, Chris; McNicholas, Robin; Cosker, Darren.

International Symposium on Mixed and Augmented Reality. IEEE, 2019. (Mixed and Augmented Reality (ISMAR), International Symposium on).


Taylor, C, Mullanay, C, McNicholas, R & Cosker, D 2019, VR Props: An End-to-End Pipeline for Transporting Real Objects into Virtual and Augmented Environment. In International Symposium on Mixed and Augmented Reality, Mixed and Augmented Reality (ISMAR), International Symposium on, IEEE.
Taylor C, Mullanay C, McNicholas R, Cosker D. VR Props: An End-to-End Pipeline for Transporting Real Objects into Virtual and Augmented Environment. In International Symposium on Mixed and Augmented Reality. IEEE. 2019. (Mixed and Augmented Reality (ISMAR), International Symposium on).
Taylor, Catherine; Mullanay, Chris; McNicholas, Robin; Cosker, Darren. / VR Props: An End-to-End Pipeline for Transporting Real Objects into Virtual and Augmented Environment. International Symposium on Mixed and Augmented Reality. IEEE, 2019. (Mixed and Augmented Reality (ISMAR), International Symposium on).
@inproceedings{9a027fc2b64f45d6a43ff81ccdeeda24,
title = "VR Props: An End-to-End Pipeline for Transporting Real Objects into Virtual and Augmented Environment",
abstract = "Improvements in both software and hardware, as well as an increase in consumer-suitable equipment, have resulted in great advances in the fields of virtual and augmented reality. Typically, systems use controllers or hand gestures to interact with virtual objects. However, these motions are often unnatural and diminish the immersion of the experience. Moreover, these approaches offer limited tactile feedback. There does not currently exist a platform to bring an arbitrary physical object into the virtual world without additional peripherals or the use of expensive motion capture systems. Such a system could be used for immersive experiences within the entertainment industry as well as being applied to VR or AR training experiences in the fields of health and engineering. We propose an end-to-end pipeline for creating an interactive virtual prop from rigid and non-rigid physical objects. This includes a novel method for tracking the deformations of rigid and non-rigid objects at interactive rates using a single RGBD camera. We scan our physical object and process the point cloud to produce a triangular mesh. A range of possible deformations can be obtained by using a finite element method simulation, and these are reduced to a low-dimensional basis using principal component analysis. Machine learning approaches, in particular neural networks, have become key tools in computer vision and have been used on a range of tasks. Moreover, there has been an increased trend in training networks on synthetic data. To this end, we use a convolutional neural network, trained on synthetic data, to track the movement and potential deformations of an object in unlabelled RGB images from a single RGBD camera. We demonstrate our results for several objects with different sizes and appearances.",
author = "Catherine Taylor and Chris Mullanay and Robin McNicholas and Darren Cosker",
year = "2019",
month = "6",
day = "1",
language = "English",
series = "Mixed and Augmented Reality (ISMAR), International Symposium on",
publisher = "IEEE",
booktitle = "International Symposium on Mixed and Augmented Reality",
address = "United States",

}

TY - GEN

T1 - VR Props: An End-to-End Pipeline for Transporting Real Objects into Virtual and Augmented Environment

AU - Taylor, Catherine

AU - Mullanay, Chris

AU - McNicholas, Robin

AU - Cosker, Darren

PY - 2019/6/1

Y1 - 2019/6/1

N2 - Improvements in both software and hardware, as well as an increase in consumer-suitable equipment, have resulted in great advances in the fields of virtual and augmented reality. Typically, systems use controllers or hand gestures to interact with virtual objects. However, these motions are often unnatural and diminish the immersion of the experience. Moreover, these approaches offer limited tactile feedback. There does not currently exist a platform to bring an arbitrary physical object into the virtual world without additional peripherals or the use of expensive motion capture systems. Such a system could be used for immersive experiences within the entertainment industry as well as being applied to VR or AR training experiences in the fields of health and engineering. We propose an end-to-end pipeline for creating an interactive virtual prop from rigid and non-rigid physical objects. This includes a novel method for tracking the deformations of rigid and non-rigid objects at interactive rates using a single RGBD camera. We scan our physical object and process the point cloud to produce a triangular mesh. A range of possible deformations can be obtained by using a finite element method simulation, and these are reduced to a low-dimensional basis using principal component analysis. Machine learning approaches, in particular neural networks, have become key tools in computer vision and have been used on a range of tasks. Moreover, there has been an increased trend in training networks on synthetic data. To this end, we use a convolutional neural network, trained on synthetic data, to track the movement and potential deformations of an object in unlabelled RGB images from a single RGBD camera. We demonstrate our results for several objects with different sizes and appearances.

AB - Improvements in both software and hardware, as well as an increase in consumer-suitable equipment, have resulted in great advances in the fields of virtual and augmented reality. Typically, systems use controllers or hand gestures to interact with virtual objects. However, these motions are often unnatural and diminish the immersion of the experience. Moreover, these approaches offer limited tactile feedback. There does not currently exist a platform to bring an arbitrary physical object into the virtual world without additional peripherals or the use of expensive motion capture systems. Such a system could be used for immersive experiences within the entertainment industry as well as being applied to VR or AR training experiences in the fields of health and engineering. We propose an end-to-end pipeline for creating an interactive virtual prop from rigid and non-rigid physical objects. This includes a novel method for tracking the deformations of rigid and non-rigid objects at interactive rates using a single RGBD camera. We scan our physical object and process the point cloud to produce a triangular mesh. A range of possible deformations can be obtained by using a finite element method simulation, and these are reduced to a low-dimensional basis using principal component analysis. Machine learning approaches, in particular neural networks, have become key tools in computer vision and have been used on a range of tasks. Moreover, there has been an increased trend in training networks on synthetic data. To this end, we use a convolutional neural network, trained on synthetic data, to track the movement and potential deformations of an object in unlabelled RGB images from a single RGBD camera. We demonstrate our results for several objects with different sizes and appearances.

M3 - Conference contribution

T3 - Mixed and Augmented Reality (ISMAR), International Symposium on

BT - International Symposium on Mixed and Augmented Reality

PB - IEEE

ER -