Emotion recognition across visual and auditory modalities in autism spectrum disorder: A systematic review and meta-analysis

Florence Y N Leung, Jacqueline Sin, Caitlin Dawson, Jia Hoong Ong, Chen Zhao, Anamarija Veić, Fang Liu

Research output: Contribution to journal › Article › peer-review


Abstract

An expanding literature has investigated emotion recognition across visual and auditory modalities in autism spectrum disorder (ASD). Findings, however, have been highly variable. The present work systematically reviewed and quantitatively synthesised a large body of literature, in order to determine whether autistic individuals differ from their neurotypical counterparts in emotion recognition across human face, nonhuman face, speech, and music domains. To identify eligible studies, the literature was searched using Embase, Medline, PubMed, Web of Science, and Google Scholar. Synthesising data from 72 papers, results showed a general difficulty with emotion recognition accuracy in ASD, while autistic individuals also showed longer response times than their neurotypical (NT) counterparts for a subset of emotions (i.e., anger, fear, sadness, and the six-emotion composite). These impairments were shown to be robust as they were not driven by differences in stimulus presentation time restriction and IQ matching, though the severity of impairments was less pronounced for a subset of emotions when full-scale IQ matching (i.e., anger, fear, happiness, sadness, and disgust) and verbal IQ matching (i.e., anger, fear, sadness, and disgust) had been undertaken. The heterogeneity among studies arose from a combination of sample characteristics (i.e., age but not IQ) and experimental design (i.e., stimulus domain and task demand) parameters. Specifically, we show that (i) impairments were more pronounced in autistic adults; (ii) full-scale, verbal, and nonverbal IQ did not moderate impairments; (iii) emotion-general impairments were found for human faces but emotion-specific impairments were observed for speech prosody (i.e., anger, happiness, and disgust) and music (i.e., fear and sadness), while no impairment was observed for nonhuman faces; (iv) impairments were found across emotions for verbal but not nonverbal tasks. 
Importantly, further research on the recognition of prosodic, musical, and nonhuman facial emotions is warranted, as the current findings are disproportionately influenced by studies on human faces. Future studies should also continue to explore the different emotion processing strategies employed by autistic individuals, which could be fundamental to promoting fulfilling emotional experiences in real life.
Original language: English
Article number: 101000
Pages (from-to): 1-47
Journal: Developmental Review
DOI: 10.1016/j.dr.2021.101000
Publication status: Published - 31 Mar 2022

Keywords

  • autism
  • emotion recognition
  • faces
  • speech prosody
  • music
  • systematic review
  • meta-analysis
