How Much Spatial Information Is Lost in the Sensory Substitution Process? Comparing Visual, Tactile, and Auditory Approaches

Michael Richardson, Jan Thar, James Alvarez, Jan Borchers, Jamie Ward, Giles Hamilton-Fletcher

Research output: Contribution to journal › Article › peer-review


Abstract

Sensory substitution devices (SSDs) can convey visuospatial information through spatialised auditory or tactile stimulation using wearable technology. However, the level of information loss associated with this transformation is unknown. In this study, novice users discriminated the location of two objects at 1.2 m using devices that transformed a 16 × 8 depth map into spatially distributed patterns of light, sound, or touch on the abdomen. Results showed that through active sensing, participants could discriminate the vertical position of objects to a visual angle of 1°, 14°, and 21°, and their distance to 2 cm, 8 cm, and 29 cm using these visual, auditory, and haptic SSDs, respectively. Visual SSDs significantly outperformed auditory and tactile SSDs on vertical localisation, whereas for depth perception, all devices significantly differed from one another (visual > auditory > haptic). Our findings highlight the high level of acuity possible for SSDs even with low spatial resolutions (e.g., 16 × 8) and quantify the level of information loss attributable to this transformation for the SSD user. Finally, we discuss ways of closing this “modality gap” found in SSDs and conclude that this process is best benchmarked against performance with SSDs that return to their primary modality (e.g., visuospatial into visual).
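To make the transformation concrete, the sketch below illustrates the general kind of encoding such a device performs: a low-resolution (16 × 8) depth map is converted into per-element stimulus intensities, with nearer surfaces producing stronger stimulation. The article does not publish its encoding function, so the inverse-depth mapping and the 0.4–2.0 m working range here are illustrative assumptions only, not the authors' implementation.

```python
import numpy as np

# Hypothetical sketch of an SSD encoding step. The depth-to-intensity
# function and the NEAR_M/FAR_M working range are assumptions; only the
# 16 x 8 resolution and the 1.2 m object distance come from the study.

ROWS, COLS = 8, 16          # 16 x 8 spatial resolution used in the study
NEAR_M, FAR_M = 0.4, 2.0    # assumed sensor depth range (metres)

def depth_to_intensity(depth_map: np.ndarray) -> np.ndarray:
    """Map a (ROWS, COLS) depth map in metres to intensities in [0, 1].

    Nearer surfaces yield stronger stimulation (brightness, loudness, or
    vibration amplitude, depending on the output modality); elements
    beyond FAR_M produce no stimulus.
    """
    d = np.clip(depth_map, NEAR_M, FAR_M)
    intensity = (FAR_M - d) / (FAR_M - NEAR_M)  # 1.0 at NEAR_M, 0.0 at FAR_M
    intensity[depth_map > FAR_M] = 0.0          # out of range: silence
    return intensity

# Example: a flat surface at 1.2 m, the object distance used in the study
scene = np.full((ROWS, COLS), 1.2)
print(depth_to_intensity(scene)[0, 0])  # -> 0.5 under this assumed mapping
```

Under a linear mapping like this, the device's spatial acuity is bounded by the 16 × 8 grid and by how finely users can resolve intensity differences in the output modality, which is the "modality gap" the study quantifies.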
Original language: English
Pages (from-to): 1079-1103
Number of pages: 25
Journal: Perception
Volume: 48
Issue number: 11
DOIs
Publication status: Published - 1 Nov 2019
