News

Supplementary Video


Presentation Video (IEEE VR 2022)


Abstract

Understanding and modeling the dynamics of human gaze behavior in 360° environments is a key challenge in computer vision and virtual reality. Generative adversarial approaches could alleviate this challenge by generating a large number of possible scanpaths for unseen images. Existing methods for scanpath generation, however, do not adequately predict realistic scanpaths for 360° images. We present ScanGAN360, a new generative adversarial approach to address this challenging problem. The generator of our model is tailored to the specifics of 360° images representing immersive environments. Specifically, we accomplish this by leveraging a spherical adaptation of dynamic time warping as a loss function and by proposing a novel parameterization of 360° scanpaths. Our scanpaths outperform competing approaches by a large margin in quality and are almost on par with the human baseline. ScanGAN360 thus allows fast simulation of large numbers of virtual observers whose behavior mimics that of real users, enabling a better understanding of gaze behavior and novel applications in virtual scene design.
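To illustrate the core idea behind the loss, the sketch below computes a dynamic-time-warping distance between two scanpaths where the ground metric is the great-circle distance on the sphere. This is only a minimal, non-differentiable NumPy illustration of the concept; the function names and the (latitude, longitude) parameterization are assumptions for this example, and the actual, differentiable loss used to train ScanGAN360 is the one in the official repository.

import numpy as np

def spherical_distance(p, q):
    """Great-circle distance between two gaze points given as
    (latitude, longitude) in radians on the unit sphere."""
    lat1, lon1 = p
    lat2, lon2 = q
    # Haversine formula, numerically stable for nearby points.
    dlat = lat2 - lat1
    dlon = lon2 - lon1
    a = np.sin(dlat / 2.0) ** 2 + np.cos(lat1) * np.cos(lat2) * np.sin(dlon / 2.0) ** 2
    return 2.0 * np.arcsin(np.sqrt(np.clip(a, 0.0, 1.0)))

def spherical_dtw(path_a, path_b):
    """Dynamic time warping between two scanpaths, using the
    great-circle distance as the ground metric.

    path_a, path_b: arrays of shape (T, 2) with (lat, lon) in radians.
    Returns the accumulated alignment cost (lower = more similar).
    """
    n, m = len(path_a), len(path_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = spherical_distance(path_a[i - 1], path_b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # skip a point in path_b
                                 cost[i, j - 1],      # skip a point in path_a
                                 cost[i - 1, j - 1])  # match the two points
    return cost[n, m]

# Example usage: compare two random 30-fixation scanpaths (illustrative data only).
rng = np.random.default_rng(0)
gen = np.stack([np.arcsin(rng.uniform(-1, 1, 30)), rng.uniform(-np.pi, np.pi, 30)], axis=1)
gt = np.stack([np.arcsin(rng.uniform(-1, 1, 30)), rng.uniform(-np.pi, np.pi, 30)], axis=1)
print(spherical_dtw(gen, gt))

Note that training a GAN requires a differentiable objective, so a soft relaxation of this alignment cost would be used in practice; see the GitHub repository for the authors' implementation.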

Downloads

Code

You can find the code and the trained model for ScanGAN360 in our GitHub repository.

Bibtex

@article{martin2022scangan360,
  title     = {ScanGAN360: A Generative Model of Realistic Scanpaths for 360 Images},
  author    = {Martin, Daniel and Serrano, Ana and Bergman, Alexander W and Wetzstein, Gordon and Masia, Belen},
  journal   = {IEEE Transactions on Visualization \& Computer Graphics},
  number    = {01},
  pages     = {1--1},
  year      = {2022},
  publisher = {IEEE Computer Society}
}

Related Work