Abstract

Perception of distances in virtual reality (VR) is compressed: objects are consistently perceived as closer than intended. Although this phenomenon has been well documented, it is still not fully understood or defined with respect to the factors influencing such compression. This is a problem in scenarios where veridical perception of distance and scale is essential. We report the results of an experiment investigating an approach to reducing distance compression in audiovisual VR based on a predictive model of distance perception. Our test environment involved photorealistic 3D images captured through stereo photography, with corresponding spatial audio rendered binaurally over headphones. In a perceptual matching task, participants positioned an auditory stimulus with respect to the corresponding visual stimulus. We found a high correlation between the distance perception predicted by our model and how participants perceived the distance. Through automated manipulation of the audio and visual displays based on the model, our approach can be used to reposition auditory and visual components of a scene to reduce distance compression. The approach is adaptable to different environments and agnostic of scene content, and can be calibrated to individual observers.
Language: English
Title of host publication: 2017 IEEE 3rd VR Workshop on Sonic Interactions for Virtual Environments (SIVE)
Publisher: IEEE
ISBN (Print): 9781538604595
DOI: 10.1109/SIVE.2017.7901607
Status: Published - 19 Mar 2017


Cite this

Finnegan, D., O'Neill, E., & Proulx, M. (2017). An Approach to Reducing Distance Compression in Audiovisual Virtual Environments. In 2017 IEEE 3rd VR Workshop on Sonic Interactions for Virtual Environments (SIVE) [7901607] IEEE. DOI: 10.1109/SIVE.2017.7901607

An Approach to Reducing Distance Compression in Audiovisual Virtual Environments. / Finnegan, Daniel; O'Neill, Eamonn; Proulx, Michael.

2017 IEEE 3rd VR Workshop on Sonic Interactions for Virtual Environments (SIVE). IEEE, 2017. 7901607.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Finnegan, D, O'Neill, E & Proulx, M 2017, An Approach to Reducing Distance Compression in Audiovisual Virtual Environments. in 2017 IEEE 3rd VR Workshop on Sonic Interactions for Virtual Environments (SIVE), 7901607, IEEE. DOI: 10.1109/SIVE.2017.7901607
Finnegan D, O'Neill E, Proulx M. An Approach to Reducing Distance Compression in Audiovisual Virtual Environments. In 2017 IEEE 3rd VR Workshop on Sonic Interactions for Virtual Environments (SIVE). IEEE. 2017. 7901607. DOI: 10.1109/SIVE.2017.7901607
Finnegan, Daniel ; O'Neill, Eamonn ; Proulx, Michael. / An Approach to Reducing Distance Compression in Audiovisual Virtual Environments. 2017 IEEE 3rd VR Workshop on Sonic Interactions for Virtual Environments (SIVE). IEEE, 2017.
@inproceedings{9a47ad37111e4d7fa845f3a3b5b19cfd,
title = "An Approach to Reducing Distance Compression in Audiovisual Virtual Environments",
abstract = "Perception of distances in virtual reality (VR) is compressed: objects are consistently perceived as closer than intended. Although this phenomenon has been well documented, it is still not fully understood or defined with respect to the factors influencing such compression. This is a problem in scenarios where veridical perception of distance and scale is essential. We report the results of an experiment investigating an approach to reducing distance compression in audiovisual VR based on a predictive model of distance perception. Our test environment involved photorealistic 3D images captured through stereo photography, with corresponding spatial audio rendered binaurally over headphones. In a perceptual matching task, participants positioned an auditory stimulus with respect to the corresponding visual stimulus. We found a high correlation between the distance perception predicted by our model and how participants perceived the distance. Through automated manipulation of the audio and visual displays based on the model, our approach can be used to reposition auditory and visual components of a scene to reduce distance compression. The approach is adaptable to different environments and agnostic of scene content, and can be calibrated to individual observers.",
author = "Daniel Finnegan and Eamonn O'Neill and Michael Proulx",
year = "2017",
month = "3",
day = "19",
doi = "10.1109/SIVE.2017.7901607",
language = "English",
isbn = "9781538604595",
booktitle = "2017 IEEE 3rd VR Workshop on Sonic Interactions for Virtual Environments (SIVE)",
publisher = "IEEE",
address = "United States",

}

TY - GEN

T1 - An Approach to Reducing Distance Compression in Audiovisual Virtual Environments

AU - Finnegan, Daniel

AU - O'Neill, Eamonn

AU - Proulx, Michael

PY - 2017/3/19

Y1 - 2017/3/19

N2 - Perception of distances in virtual reality (VR) is compressed: objects are consistently perceived as closer than intended. Although this phenomenon has been well documented, it is still not fully understood or defined with respect to the factors influencing such compression. This is a problem in scenarios where veridical perception of distance and scale is essential. We report the results of an experiment investigating an approach to reducing distance compression in audiovisual VR based on a predictive model of distance perception. Our test environment involved photorealistic 3D images captured through stereo photography, with corresponding spatial audio rendered binaurally over headphones. In a perceptual matching task, participants positioned an auditory stimulus with respect to the corresponding visual stimulus. We found a high correlation between the distance perception predicted by our model and how participants perceived the distance. Through automated manipulation of the audio and visual displays based on the model, our approach can be used to reposition auditory and visual components of a scene to reduce distance compression. The approach is adaptable to different environments and agnostic of scene content, and can be calibrated to individual observers.

AB - Perception of distances in virtual reality (VR) is compressed: objects are consistently perceived as closer than intended. Although this phenomenon has been well documented, it is still not fully understood or defined with respect to the factors influencing such compression. This is a problem in scenarios where veridical perception of distance and scale is essential. We report the results of an experiment investigating an approach to reducing distance compression in audiovisual VR based on a predictive model of distance perception. Our test environment involved photorealistic 3D images captured through stereo photography, with corresponding spatial audio rendered binaurally over headphones. In a perceptual matching task, participants positioned an auditory stimulus with respect to the corresponding visual stimulus. We found a high correlation between the distance perception predicted by our model and how participants perceived the distance. Through automated manipulation of the audio and visual displays based on the model, our approach can be used to reposition auditory and visual components of a scene to reduce distance compression. The approach is adaptable to different environments and agnostic of scene content, and can be calibrated to individual observers.

UR - https://doi.org/10.1109/SIVE.2017.7901607

U2 - 10.1109/SIVE.2017.7901607

DO - 10.1109/SIVE.2017.7901607

M3 - Conference contribution

SN - 9781538604595

BT - 2017 IEEE 3rd VR Workshop on Sonic Interactions for Virtual Environments (SIVE)

PB - IEEE

ER -