Teaser

Abstract

Humans perceive the world by integrating multimodal sensory feedback, including visual and auditory stimuli, which holds true in virtual reality (VR) environments. Proper synchronization of these stimuli is crucial for perceiving a coherent and immersive VR experience. In this work, we focus on the interplay between audio and vision during localization tasks involving natural head-body rotations. We explore the impact of audio-visual offsets and rotation velocities on users' directional localization acuity for various viewing modes. Using psychometric functions, we model perceptual disparities between visual and auditory cues and determine offset detection thresholds. Our findings reveal that target localization accuracy is affected by perceptual audio-visual disparities during head-body rotations, but remains consistent in the absence of stimuli-head relative motion. We then showcase the effectiveness of our approach in predicting and enhancing users' localization accuracy within realistic VR gaming applications. To provide additional support for our findings, we implement a natural VR game wherein we apply a compensatory audio-visual offset derived from our measured psychometric functions. As a result, we demonstrate a substantial improvement of up to 40% in participants' target localization accuracy. We additionally provide guidelines for content creation to ensure coherent and seamless VR experiences.
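
To make the thresholding idea concrete, here is a minimal sketch (not the authors' code, which is available in the repository below) of fitting a cumulative-Gaussian psychometric function to audio-visual offset detection data and reading off a detection threshold. All variable names, data values, and the 75% criterion are illustrative assumptions, not values from the paper.

```python
# Minimal sketch: fit a cumulative-Gaussian psychometric function to
# hypothetical audio-visual offset detection data and estimate a threshold.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(offset_deg, mu, sigma):
    """Probability of detecting an audio-visual offset (cumulative Gaussian)."""
    return norm.cdf(offset_deg, loc=mu, scale=sigma)

# Hypothetical per-condition data: tested offsets (degrees) and the
# proportion of trials in which participants reported a mismatch.
offsets = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
p_detected = np.array([0.05, 0.15, 0.40, 0.70, 0.90, 0.97])

# Fit the function's midpoint (mu) and slope parameter (sigma).
(mu, sigma), _ = curve_fit(psychometric, offsets, p_detected, p0=[5.0, 2.0])

# Offset at which detection probability reaches 75% (a common threshold choice).
threshold_75 = norm.ppf(0.75, loc=mu, scale=sigma)
print(f"mu = {mu:.2f} deg, sigma = {sigma:.2f} deg, 75% threshold = {threshold_75:.2f} deg")
```

Such a fit, estimated per rotation velocity and viewing mode, is the kind of model from which a compensatory audio-visual offset could be derived.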

Supplementary Video

PDF

Paper (authors' version): PDF

Code and Data

You can find our code and data in our GitHub repository.

Bibtex

@article{Bernal-Berdun2024spatial,
  author  = {Bernal-Berdun, Edurne and Vallejo, Mateo and Sun, Qi and Serrano, Ana and Gutierrez, Diego},
  journal = {IEEE Transactions on Visualization and Computer Graphics},
  title   = {Modeling the Impact of Head-Body Rotations on Audio-Visual Spatial Perception for Virtual Reality Applications},
  year    = {2024},
  pages   = {1-9},
  doi     = {10.1109/TVCG.2024.3372112}
}