Audiovisual integration of emotional signals from others' social interactions

Lukasz Piwek, Frank Pollick, Karin Petrini

Research output: Contribution to journal › Article

  • 4 Citations

Abstract

Audiovisual perception of emotions has typically been examined using displays of a solitary character (e.g., the face-voice and/or body-sound of one actor). However, in real life humans often face more complex multisensory social situations, involving more than one person. Here we ask whether the audiovisual facilitation of emotion recognition previously found in simpler social situations extends to more complex, ecologically valid situations. Stimuli consisting of the biological motion and voice of two interacting agents were used in two experiments. In Experiment 1, participants were presented with visual, auditory, auditory filtered/noisy, and audiovisual congruent and incongruent clips. We asked participants to judge whether the two agents were interacting happily or angrily. In Experiment 2, another group of participants repeated the same task as in Experiment 1, while trying to ignore either the visual or the auditory information. The findings from both experiments indicate that when the reliability of the auditory cue was decreased, participants weighted the visual cue more heavily in their emotional judgments. This in turn translated into increased emotion recognition accuracy for the multisensory condition. Our findings thus point to a common mechanism of multisensory integration of emotional signals irrespective of social stimulus complexity.
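
The reliability-based re-weighting described in the abstract matches the standard maximum-likelihood (MLE) account of cue combination, in which each cue is weighted by the inverse of its variance. The block below is a minimal sketch of that general framework, assuming independent Gaussian noise on the auditory (A) and visual (V) emotion estimates; it illustrates the principle and is not the analysis reported in the paper.

% Sketch of reliability-weighted (MLE) cue combination, assuming
% independent Gaussian noise on the auditory (A) and visual (V) estimates.
\[
\hat{S}_{AV} = w_A \hat{S}_A + w_V \hat{S}_V,
\qquad
w_A = \frac{1/\sigma_A^{2}}{1/\sigma_A^{2} + 1/\sigma_V^{2}},
\qquad
w_V = 1 - w_A
\]
\[
\sigma_{AV}^{2} = \frac{\sigma_A^{2}\,\sigma_V^{2}}{\sigma_A^{2} + \sigma_V^{2}}
\;\le\; \min\!\left(\sigma_A^{2},\, \sigma_V^{2}\right)
\]
% Filtering or adding noise to the voice inflates sigma_A, which lowers
% w_A and raises w_V, shifting judgments toward the visual cue while the
% combined estimate stays at least as precise as the better single cue.

Under this scheme, degrading the auditory signal (as with the filtered/noisy clips in Experiment 1) predicts the qualitative pattern the abstract reports: greater reliance on the visual cue and higher accuracy in the multisensory condition.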

Language: English
Article number: 611
Journal: Frontiers in Psychology
Volume: 6
Early online date: 8 May 2015
DOIs: 10.3389/fpsyg.2015.00611
Status: Published - 2015

Keywords

  • Anger
  • Happiness
  • Multisensory integration
  • Point-light displays
  • Social interactions
  • Voice

Cite this

Audiovisual integration of emotional signals from others' social interactions. / Piwek, Lukasz; Pollick, Frank; Petrini, Karin.

In: Frontiers in Psychology, Vol. 6, 611, 2015.

Research output: Contribution to journal › Article

@article{13187915f0ba49cbb017de539fb404ec,
title = "Audiovisual integration of emotional signals from others' social interactions",
abstract = "Audiovisual perception of emotions has been typically examined using displays of a solitary character (e.g., the face-voice and/or body-sound of one actor). However, in real life humans often face more complex multisensory social situations, involving more than one person. Here we ask if the audiovisual facilitation in emotion recognition previously found in simpler social situations extends to more complex and ecological situations. Stimuli consisting of the biological motion and voice of two interacting agents were used in two experiments. In Experiment 1, participants were presented with visual, auditory, auditory filtered/noisy, and audiovisual congruent and incongruent clips. We asked participants to judge whether the two agents were interacting happily or angrily. In Experiment 2, another group of participants repeated the same task, as in Experiment 1, while trying to ignore either the visual or the auditory information. The findings from both experiments indicate that when the reliability of the auditory cue was decreased participants weighted more the visual cue in their emotional judgments. This in turn translated in increased emotion recognition accuracy for the multisensory condition. Our findings thus point to a common mechanism of multisensory integration of emotional signals irrespective of social stimulus complexity.",
keywords = "Anger, Happiness, Multisensory integration, Point-light displays, Social interactions, Voice",
author = "Lukasz Piwek and Frank Pollick and Karin Petrini",
year = "2015",
doi = "10.3389/fpsyg.2015.00611",
language = "English",
volume = "6",
journal = "Frontiers in Psychology: Movement Science and Sport Psychology",
issn = "1664-1078",
publisher = "Frontiers Media S.A.",

}

TY - JOUR

T1 - Audiovisual integration of emotional signals from others' social interactions

AU - Piwek, Lukasz

AU - Pollick, Frank

AU - Petrini, Karin

PY - 2015

Y1 - 2015

N2 - Audiovisual perception of emotions has typically been examined using displays of a solitary character (e.g., the face-voice and/or body-sound of one actor). However, in real life humans often face more complex multisensory social situations, involving more than one person. Here we ask whether the audiovisual facilitation of emotion recognition previously found in simpler social situations extends to more complex, ecologically valid situations. Stimuli consisting of the biological motion and voice of two interacting agents were used in two experiments. In Experiment 1, participants were presented with visual, auditory, auditory filtered/noisy, and audiovisual congruent and incongruent clips. We asked participants to judge whether the two agents were interacting happily or angrily. In Experiment 2, another group of participants repeated the same task as in Experiment 1, while trying to ignore either the visual or the auditory information. The findings from both experiments indicate that when the reliability of the auditory cue was decreased, participants weighted the visual cue more heavily in their emotional judgments. This in turn translated into increased emotion recognition accuracy for the multisensory condition. Our findings thus point to a common mechanism of multisensory integration of emotional signals irrespective of social stimulus complexity.

AB - Audiovisual perception of emotions has typically been examined using displays of a solitary character (e.g., the face-voice and/or body-sound of one actor). However, in real life humans often face more complex multisensory social situations, involving more than one person. Here we ask whether the audiovisual facilitation of emotion recognition previously found in simpler social situations extends to more complex, ecologically valid situations. Stimuli consisting of the biological motion and voice of two interacting agents were used in two experiments. In Experiment 1, participants were presented with visual, auditory, auditory filtered/noisy, and audiovisual congruent and incongruent clips. We asked participants to judge whether the two agents were interacting happily or angrily. In Experiment 2, another group of participants repeated the same task as in Experiment 1, while trying to ignore either the visual or the auditory information. The findings from both experiments indicate that when the reliability of the auditory cue was decreased, participants weighted the visual cue more heavily in their emotional judgments. This in turn translated into increased emotion recognition accuracy for the multisensory condition. Our findings thus point to a common mechanism of multisensory integration of emotional signals irrespective of social stimulus complexity.

KW - Anger

KW - Happiness

KW - Multisensory integration

KW - Point-light displays

KW - Social interactions

KW - Voice

UR - http://www.scopus.com/inward/record.url?scp=84930943388&partnerID=8YFLogxK

UR - http://dx.doi.org/10.3389/fpsyg.2015.00611

U2 - 10.3389/fpsyg.2015.00611

DO - 10.3389/fpsyg.2015.00611

M3 - Article

VL - 6

JO - Frontiers in Psychology

T2 - Frontiers in Psychology

JF - Frontiers in Psychology

SN - 1664-1078

M1 - 611

ER -