Multisensory wearable interface for immersion and telepresence in robotics

Uriel Martinez Hernandez, Luke W. Boorman, Tony J. Prescott

Research output: Contribution to journal › Article › peer-review


Abstract

The idea of being present in a remote location has inspired researchers to develop robotic devices that allow humans to experience the feeling of telepresence. These devices require multiple forms of sensory feedback to provide a more realistic telepresence experience. In this paper, we develop a wearable interface for immersion and telepresence that provides humans with the capability both to receive multisensory feedback from vision, touch, and audio, and to remotely control a robot platform. Multimodal feedback from the remote environment is based on the integration of sensor technologies coupled to the sensory system of the robot platform. Remote control of the robot is achieved by a modularised architecture, which allows visual exploration of the remote environment. We validated our approach with multiple experiments in which participants, located at different venues, were able to successfully control the robot platform while visually exploring, touching, and listening to a remote environment. In our experiments, we used two different robotic platforms: 1) the iCub humanoid robot and 2) the Pioneer LX mobile robot. These experiments show that our wearable interface is comfortable, easy to use, and adaptable to different robotic platforms. Furthermore, we observed that our approach allows humans to experience a vivid feeling of being present in a remote environment.
Original language: English
Pages (from-to): 2534-2541
Number of pages: 8
Journal: IEEE Sensors Journal
Volume: 17
Issue number: 8
Early online date: 14 Feb 2017
DOIs
Publication status: Published - 15 Apr 2017

