In the emerging world of human-robot interaction, social robotics has become ever more important. Social robotics is a fundamental area in domains such as healthcare and medical robotics, consumer robotics, and service robotics. Social robots working among humans should be able to communicate naturally with people using not only verbal but also non-verbal signals. Some cues of non-verbal body language, associated with affect and emotions, have an evolutionary root in humans that allows them to signal their unobservable internal states and intentions to those around them. The ability to interpret the internal states and intentions of team members or counterparts is important not only in human-human but also in human-robot teams. One possible way to contribute to understandability is for robots to make their otherwise unobservable internal state interpretable to people through the use of emotionally expressive body language. This makes robots more predictable, acceptable and likeable, and thus has the potential, in the end, to make them more effective team players. This thesis addresses the problem of enabling humans to better understand machines by examining the role of artificial emotions synthesized and expressed by robots in the process of human-robot interaction. In our first study, we probe whether it is possible to signal an intended emotional meaning through the bodily expressions of a non-humanoid robot. The results provide strong support for the potential utility of bodily expressions in robots for communicating emotional meaning to people. A set of design parameters was developed from an analysis of research on non-verbal expression of emotion in the animal world. In the next two studies, we explore how this set of design parameters impacts how people perceive the emotional meaning of a robot's expression, and investigate the nature and dynamics of people's perception of emotion expressed by a robot through its bodily movements.
The results provide the basis for a mapping between the different design parameters of a robot's bodily expression and emotional interpretations. In addition, the results of the study show that people perceive emotionally expressive robots as more anthropomorphic, more animate, more likeable, more responsible and even more intelligent. In the next two studies, we investigate two major factors that may influence the perception of robot emotions. In one study, we investigate how the particular situational context in which expressions are used by the robot influences how they are perceived and interpreted by people. The other major factor we investigate is how the morphology of a robot performing emotional expressions influences how these expressions are interpreted, and whether people are consistent in the emotional meaning they perceive. Finally, having a coherent design scheme for producing meaningful emotional expressions through robot body movements, we investigate in our last study the impact of such expressions on people's attitudes towards a robot. The results of the work provide evidence of the impact of a robot's emotional expressiveness on the perception of its anthropomorphism, animacy, likeability and intelligence. The results of our work are discussed in terms of the utility of expressive behaviour for facilitating human understanding of robot intentions, and in terms of directions for future development in the design of cues for emotionally expressive robot behaviour.
Date of Award: 1 Mar 2016
Supervisors: Leon Watts (Supervisor) & Joanna Bryson (Supervisor)