The idea of being present in a remote location has inspired researchers to develop robotic devices that allow humans to experience the feeling of telepresence. These devices require multiple channels of sensory feedback to provide a more realistic telepresence experience. In this paper, we develop a wearable interface for immersion and telepresence that provides humans with the capability both to receive multisensory feedback from vision, touch, and audio, and to remotely control a robot platform. Multimodal feedback from a remote environment is based on the integration of sensor technologies coupled to the sensory system of the robot platform. Remote control of the robot is achieved through a modularised architecture, which allows visual exploration of the remote environment. We validated our approach with multiple experiments in which participants, located at different venues, were able to successfully control the robot platform while visually exploring, touching, and listening to a remote environment. In our experiments, we used two different robotic platforms: 1) the iCub humanoid robot and 2) the Pioneer LX mobile robot. These experiments show that our wearable interface is comfortable, easy to use, and adaptable to different robotic platforms. Furthermore, we observed that our approach allows humans to experience a vivid feeling of being present in a remote environment.
- Department of Electronic & Electrical Engineering - Lecturer
- Centre for Biosensors, Bioelectronics and Biodevices (C3Bio)
- UKRI CDT in Accountable, Responsible and Transparent AI
- Centre for Autonomous Robotics (CENTAUR)
- Electronics Materials, Circuits & Systems Research Unit (EMaCS)