In this paper, we present a Bayesian approach for the perception of touch and the control of robot emotion. Touch is an important sensing modality for the development of social robots, and in this work it serves as the stimulus during human-robot interaction. We propose a Bayesian framework for the perception of various types of touch. Combined with a sequential analysis approach, this method allows the robot to accumulate evidence from its interaction with humans, achieving accurate touch perception for adaptive control of robot emotions. The emotions of the iCub humanoid robot are expressed through facial expressions, which are handled by a control architecture driven by the output of the touch perception process. We validate the accuracy of our system with simulated and real robot touch experiments. The results show that our method enables accurate touch perception for the control of robot emotions, which is essential for the development of sociable robots.
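The sequential evidence accumulation described above can be sketched as a recursive Bayesian update with a belief threshold for early stopping. The touch classes, the discrete likelihood table, and the threshold value below are illustrative assumptions, not the authors' actual models:

```python
# Hedged sketch of sequential Bayesian touch classification:
# recursively update a posterior over touch classes from a stream of
# sensor readings, stopping once belief crosses a decision threshold.
# Classes, likelihoods, and threshold are hypothetical placeholders.
import numpy as np

classes = ["tap", "stroke", "press"]  # hypothetical touch types

# Hypothetical likelihoods P(reading z | touch class), one row per class,
# one column per discrete sensor reading (z = 0, 1, 2).
likelihood = np.array([
    [0.7, 0.2, 0.1],   # tap
    [0.2, 0.6, 0.2],   # stroke
    [0.1, 0.2, 0.7],   # press
])

def classify(observations, threshold=0.99):
    """Accumulate evidence reading by reading; stop early at threshold."""
    posterior = np.full(len(classes), 1.0 / len(classes))  # uniform prior
    for t, z in enumerate(observations, start=1):
        posterior = posterior * likelihood[:, z]   # Bayes update
        posterior /= posterior.sum()               # normalize
        if posterior.max() >= threshold:           # enough evidence?
            break
    return classes[int(posterior.argmax())], posterior, t

# A mostly "press"-like stream with one ambiguous reading (z = 1);
# the sequential test recovers and stops before using all readings.
obs = [2, 1, 2, 2, 2, 2, 2, 2]
decision, post, steps = classify(obs)
print(decision, steps)  # → press 6
```

Note how the single ambiguous reading delays the decision by two steps rather than corrupting it: the sequential test simply demands more evidence before committing, which is the core benefit the abstract attributes to combining Bayesian perception with sequential analysis.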
Martinez Hernandez, U., Rubio-Solis, A., & Prescott, T. J. (2016). Bayesian perception of touch for control of robot emotion. In 2016 International Joint Conference on Neural Networks (IJCNN) (pp. 4927-4933). IEEE. https://doi.org/10.1109/IJCNN.2016.7727848