Audiovisual perception of emotions has typically been examined using displays of a solitary character (e.g., the face-voice and/or body-sound of one actor). In real life, however, humans often face more complex multisensory social situations involving more than one person. Here we ask whether the audiovisual facilitation in emotion recognition previously found in simpler social situations extends to more complex and ecologically valid situations. Stimuli consisting of the biological motion and voices of two interacting agents were used in two experiments. In Experiment 1, participants were presented with visual, auditory, auditory filtered/noisy, and audiovisual congruent and incongruent clips. We asked participants to judge whether the two agents were interacting happily or angrily. In Experiment 2, another group of participants performed the same task as in Experiment 1 while trying to ignore either the visual or the auditory information. The findings from both experiments indicate that when the reliability of the auditory cue was decreased, participants gave more weight to the visual cue in their emotional judgments. This in turn translated into increased emotion recognition accuracy for the multisensory condition. Our findings thus point to a common mechanism of multisensory integration of emotional signals irrespective of social stimulus complexity.
- Multisensory integration
- Point-light displays
- Social interactions
Title: 'Audiovisual integration of emotional signals from others' social interactions'