News
- June 2024: This work received the Best Paper Award at CEIG 2024!
- June 2024: Paper accepted to Computers & Graphics (Proc. CEIG 2024) (see Downloads).
- June 2024: Demo code available (see Code).
- June 2024: Website launched.
Abstract
Predicting the path a viewer's eyes follow when observing an image (a scanpath) is a challenging problem, particularly due to inter- and intra-observer variability and the spatio-temporal dependencies of the visual attention process. Most existing approaches progressively optimize the prediction of each gaze point given the previous ones. In this work we instead propose a probabilistic model, which we call tSPM-Net. We build our method on Bayesian deep learning to account for observer variability, and we optimize it to jointly consider the spatial and temporal dimensions of scanpaths using a novel spatio-temporal loss function based on a combination of Kullback–Leibler divergence and dynamic time warping. tSPM-Net outperforms current state-of-the-art approaches and comes closer to the human baseline, suggesting that it is able to generate scanpaths whose behavior closely resembles that of real ones.
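To make the loss described above concrete, the following is a minimal, illustrative PyTorch sketch of how a Kullback–Leibler spatial term can be combined with a differentiable (soft) dynamic time warping temporal term. All names here (kl_spatial_loss, soft_dtw_loss, spatio_temporal_loss, gamma, lam) are hypothetical, and the tensor shapes are assumptions; this is not the authors' implementation (see Code for that).

```python
import torch

def kl_spatial_loss(pred_maps, gt_maps, eps=1e-8):
    # pred_maps, gt_maps: (batch, T, H*W), each row a probability
    # distribution over image locations at one time step (assumed shape).
    pred = pred_maps.clamp_min(eps)
    gt = gt_maps.clamp_min(eps)
    # KL(gt || pred), averaged over time steps and batch elements.
    return (gt * (gt.log() - pred.log())).sum(dim=-1).mean()

def soft_dtw_loss(pred_path, gt_path, gamma=0.1):
    # pred_path, gt_path: (T, 2) sequences of (x, y) gaze points.
    # Differentiable soft-min recursion over the DTW alignment costs.
    T1, T2 = pred_path.shape[0], gt_path.shape[0]
    cost = torch.cdist(pred_path, gt_path) ** 2  # pairwise squared distances
    inf = torch.tensor(float("inf"))
    R = [[inf] * (T2 + 1) for _ in range(T1 + 1)]
    R[0][0] = torch.tensor(0.0)
    for i in range(1, T1 + 1):
        for j in range(1, T2 + 1):
            prev = torch.stack([R[i - 1][j], R[i][j - 1], R[i - 1][j - 1]])
            # Soft minimum over the three DTW predecessors.
            R[i][j] = cost[i - 1, j - 1] - gamma * torch.logsumexp(-prev / gamma, dim=0)
    return R[T1][T2]

def spatio_temporal_loss(pred_maps, gt_maps, pred_path, gt_path, lam=1.0):
    # Joint objective: the KL term constrains where gaze lands, while
    # soft-DTW constrains the temporal ordering of the scanpath.
    return kl_spatial_loss(pred_maps, gt_maps) + lam * soft_dtw_loss(pred_path, gt_path)
```

In this kind of formulation, gamma controls how smooth the soft minimum is (smaller values approach hard DTW) and lam balances the spatial and temporal terms; the paper's actual weighting and parameterization may differ.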
Downloads
Code
You can find the demo code and model for tSPM-Net in our GitHub repository.
Bibtex
Related Work
- 2020: Panoramic convolutions for 360° single-image saliency prediction
- 2022: ScanGAN360: A Generative Model of Realistic Scanpaths for 360° Images
- 2023: D-SAV360: A Dataset of Gaze Scanpaths on 360° Ambisonic Videos