Bayesian perception of touch for control of robot emotion

Uriel Martinez Hernandez, Adrian Rubio-Solis, Tony J. Prescott

Research output: Chapter in a published conference proceeding



In this paper, we present a Bayesian approach for the perception of touch and the control of robot emotion. Touch is an important sensing modality for the development of social robots, and it is used in this work as the stimulus during human-robot interaction. A Bayesian framework is proposed for the perception of various types of touch. This method, together with a sequential analysis approach, allows the robot to accumulate evidence from its interaction with humans, achieving accurate touch perception for adaptable control of robot emotions. Facial expressions are used to represent the emotions of the iCub humanoid robot. Emotions on the robotic platform, expressed through facial expressions, are handled by a control architecture driven by the output of the touch perception process. We validate the accuracy of our system with simulated and real robot touch experiments. The results show that our method is suitable and accurate for perception of touch to control robot emotions, which is essential for the development of sociable robots.
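The sequential analysis described above can be illustrated with a minimal sketch: a belief over touch classes is updated recursively with each new observation, and a decision is made once one class exceeds a confidence threshold. The function name, the toy likelihoods, and the two touch classes ("tap", "stroke") are illustrative assumptions, not the paper's actual model or data.

```python
import numpy as np

def sequential_touch_perception(likelihoods, observations, threshold=0.95):
    """Accumulate evidence over a stream of discrete observations.

    likelihoods: array of shape (n_classes, n_symbols), P(obs | class)
    observations: iterable of observed symbol indices
    threshold: posterior belief at which a decision is triggered
    Returns (decided_class, posterior, n_observations_used).
    """
    n_classes = likelihoods.shape[0]
    posterior = np.full(n_classes, 1.0 / n_classes)   # uniform prior
    for t, obs in enumerate(observations, start=1):
        posterior = posterior * likelihoods[:, obs]   # Bayes update
        posterior /= posterior.sum()                  # normalise
        if posterior.max() >= threshold:              # enough evidence: stop
            return int(posterior.argmax()), posterior, t
    return int(posterior.argmax()), posterior, len(observations)

# Toy example: two hypothetical touch classes over three sensor symbols.
lik = np.array([[0.7, 0.2, 0.1],   # "tap"
                [0.1, 0.2, 0.7]])  # "stroke"
cls, post, n = sequential_touch_perception(lik, [2, 2, 1, 2])
# Decides "stroke" (class 1) after 2 observations once belief passes 0.95.
```

The early-stopping threshold captures the trade-off the abstract alludes to: accumulating more evidence improves accuracy, while stopping as soon as the belief is confident keeps the interaction responsive.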
Original language: English
Title of host publication: 2016 International Joint Conference on Neural Networks (IJCNN)
Number of pages: 7
ISBN (Electronic): 2161-4407
Publication status: Published - 15 Mar 2016

