Towards a wearable interface for immersive telepresence in Robotics

Uriel Martinez Hernandez, Michael Szollosy, Luke W. Boorman, Hamideh Kerdegari, Tony J. Prescott

Research output: Chapter in a published conference proceeding



In this paper we present an architecture for the study of telepresence, immersion and human-robot interaction. The architecture is built around a wearable interface that provides the human user with visual, audio and tactile feedback from a remote location. We have chosen to interface the system with the iCub humanoid robot, as it mimics many human sensory modalities, including vision (with gaze control) and tactile feedback, which offers a richly immersive experience for the human user. Our wearable interface allows human participants to observe and explore a remote location, while also being able to communicate verbally with others located in the remote environment. Our approach has been tested over a variety of distances, including university and business premises, and over wired, wireless and Internet-based connections, using data compression to maintain the quality of the experience for the user. Initial testing has shown the wearable interface to be a robust system for immersive teleoperation, with a myriad of potential applications, particularly in social networking, gaming and entertainment.
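The abstract notes that data compression is used to maintain feedback quality over wired, wireless and Internet-based connections. As a minimal sketch of that idea only (this is not the paper's implementation; the function names and the choice of zlib are assumptions), a lossless compress/decompress round trip for a sensor frame might look like:

```python
import zlib

def compress_frame(frame: bytes, level: int = 6) -> bytes:
    """Compress raw frame bytes before sending them over a bandwidth-limited link."""
    return zlib.compress(frame, level)

def decompress_frame(payload: bytes) -> bytes:
    """Restore the original frame bytes on the wearable-interface side."""
    return zlib.decompress(payload)

if __name__ == "__main__":
    # A stand-in 'frame': repetitive pixel-like data, which compresses well.
    frame = bytes(range(256)) * 64          # 16 KiB of raw data
    payload = compress_frame(frame)
    assert decompress_frame(payload) == frame   # lossless round trip
    print(f"raw {len(frame)} B -> compressed {len(payload)} B")
```

In a real telepresence pipeline the compressed payload would be sent over a socket or robotics middleware; lossless compression is shown here only to illustrate the bandwidth-versus-fidelity trade-off the abstract refers to.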
Original language: English
Title of host publication: Interactivity, Game Creation, Design, Learning and Innovation. ArtsIT 2016, DLI 2016.
Editors: A. Brooks, E. Brooks
Number of pages: 9
ISBN (Electronic): 978-3-319-55834-9
ISBN (Print): 978-3-319-55833-2
Publication status: Published - 18 Mar 2017

Publication series

Name: Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering


