In this paper we present an architecture for the study of telepresence, immersion and human-robot interaction. The architecture is built around a wearable interface that provides the human user with visual, audio and tactile feedback from a remote location. We have chosen to interface the system with the iCub humanoid robot, as it mimics many human sensory modalities, including vision (with gaze control) and tactile sensing, which offers a richly immersive experience for the human user. Our wearable interface allows human participants to observe and explore a remote location, while also being able to communicate verbally with others located in the remote environment. Our approach has been tested over a variety of distances, including between university and business premises, and over wired, wireless and Internet-based connections, with data compression used to maintain the quality of the experience for the user. Initial testing has shown the wearable interface to be a robust system for immersive teleoperation, with a wide range of potential applications, particularly in social networking, gaming and entertainment.
Title of host publication: Interactivity, Game Creation, Design, Learning and Innovation. ArtsIT 2016, DLI 2016.
Editors: A. Brooks, E. Brooks
Number of pages: 9
Publication status: Published - 18 Mar 2017
Series: Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering
Martinez Hernandez, U., Szollosy, M., Boorman, L. W., Kerdegari, H., & Prescott, T. J. (2017). Towards a wearable interface for immersive telepresence in Robotics. In A. Brooks, & E. Brooks (Eds.), Interactivity, Game Creation, Design, Learning and Innovation. ArtsIT 2016, DLI 2016 (pp. 65-73). (Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering; Vol. 196). Springer. https://doi.org/10.1007/978-3-319-55834-9_8