Audiovisual integration of emotional signals from others' social interactions

Lukasz Piwek, Frank Pollick, Karin Petrini

Research output: Contribution to journal › Article › peer-review



Audiovisual perception of emotions has typically been examined using displays of a solitary character (e.g., the face-voice and/or body-sound of one actor). However, in real life humans often face more complex multisensory social situations, involving more than one person. Here we ask whether the audiovisual facilitation in emotion recognition previously found in simpler social situations extends to more complex and ecologically valid situations. Stimuli consisting of the biological motion and voices of two interacting agents were used in two experiments. In Experiment 1, participants were presented with visual, auditory, auditory filtered/noisy, and audiovisual congruent and incongruent clips. We asked participants to judge whether the two agents were interacting happily or angrily. In Experiment 2, another group of participants repeated the same task as in Experiment 1, while trying to ignore either the visual or the auditory information. The findings from both experiments indicate that when the reliability of the auditory cue was decreased, participants weighted the visual cue more heavily in their emotional judgments. This in turn translated into increased emotion recognition accuracy for the multisensory condition. Our findings thus point to a common mechanism of multisensory integration of emotional signals irrespective of social stimulus complexity.

Original language: English
Article number: 611
Journal: Frontiers in Psychology
Early online date: 8 May 2015
Publication status: Published - 2015


  • Anger
  • Happiness
  • Multisensory integration
  • Point-light displays
  • Social interactions
  • Voice


