Towards Artificial Emotions to Assist Social Coordination in HRI

Jekaterina Novikova, Leon Watts

Research output: Contribution to journal › Article › peer-review

10 Citations (Scopus)
161 Downloads (Pure)

Abstract

Coordination of human–robot joint activity must depend on the ability of human and artificial agencies to interpret and interleave their actions. In this paper we consider the potential of artificial emotions to serve as task-relevant coordination devices in human–robot teams. We present two studies aiming to understand whether a non-humanoid robot can express artificial emotions in a manner that is meaningful to a human observer, the first based on static images and the second on the dynamic production of embodied robot expressions. We present a mixed-methods approach to the problem, combining statistical treatment of ratings data and thematic analysis of qualitative data. Our results demonstrate that even very simple movements of a non-humanoid robot can convey emotional meaning, and that when people attribute emotional states to a robot, they typically apply an event-based frame to make sense of the robotic expressions they have seen. Artificial emotions with high arousal level and negative valence are relatively easy for people to recognise compared to expressions with positive valence. We discuss the potential for using motion in different parts of a non-humanoid robot body to support the attribution of emotion in HRI, towards ethically responsible design of artificial emotions that could contribute to the efficacy of joint human–robot activities.
Original language: English
Pages (from-to): 77-88
Number of pages: 12
Journal: International Journal of Social Robotics
Volume: 7
Issue number: 1
Early online date: 7 Oct 2014
DOIs
Publication status: Published - Feb 2015

