Abstract

Predicting the path followed by a viewer's eyes when observing an image (a scanpath) is a challenging problem, particularly due to inter- and intra-observer variability and the spatio-temporal dependencies of the visual attention process. Most existing approaches progressively optimize the prediction of each gaze point given the previous ones. In this work we instead propose a probabilistic approach, which we call tSPM-Net. We account for observer variability by resorting to Bayesian deep learning, and we optimize our model to jointly consider both the spatial and temporal dimensions of scanpaths using a novel spatio-temporal loss function based on a combination of Kullback–Leibler divergence and dynamic time warping. tSPM-Net outperforms current state-of-the-art approaches and is closer to the human baseline, suggesting that our model generates scanpaths whose behavior closely resembles that of real ones.
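
For illustration, the sketch below shows one plausible way to combine a spatial term (KL divergence between fixation histograms) with a temporal term (dynamic time warping between gaze sequences), as described in the abstract. It is not the tSPM-Net implementation: the function names, the histogram binning, the weighting factor lam, and the use of classic (non-differentiable) DTW instead of a differentiable variant are all assumptions made for the example.

import numpy as np

def dtw_distance(path_a, path_b):
    # Classic dynamic time warping between two scanpaths,
    # each an (N, 2) array of gaze points in normalized image coordinates.
    n, m = len(path_a), len(path_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(path_a[i - 1] - path_b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],
                                 cost[i, j - 1],
                                 cost[i - 1, j - 1])
    return cost[n, m]

def kl_divergence(p, q, eps=1e-8):
    # KL divergence between two discrete spatial distributions
    # (e.g., fixation histograms over an image grid).
    p = p.ravel() / p.sum()
    q = q.ravel() / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def scanpath_histogram(path, bins=16):
    # Spatial histogram of gaze points over a bins x bins grid.
    hist, _, _ = np.histogram2d(path[:, 0], path[:, 1],
                                bins=bins, range=[[0, 1], [0, 1]])
    return hist + 1e-8  # avoid empty bins

def spatio_temporal_loss(pred, target, lam=0.5, bins=16):
    # Illustrative combination: spatial KL term plus temporal DTW term.
    # The weight lam and the binning are placeholders, not the paper's values.
    kl = kl_divergence(scanpath_histogram(pred, bins),
                       scanpath_histogram(target, bins))
    dtw = dtw_distance(pred, target)
    return lam * kl + (1.0 - lam) * dtw

# Example: two random scanpaths with 20 fixations each
rng = np.random.default_rng(0)
pred = rng.random((20, 2))
target = rng.random((20, 2))
print(spatio_temporal_loss(pred, target))

Training a network by gradient descent would require a differentiable formulation of the temporal term; this snippet is only meant to convey how the two terms complement each other, penalizing both where the model looks and in what order.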

Downloads

Code

You can find the code and model for tSPM-Net in our GitHub repository.

Bibtex

@article{martin2024tspm,
  title     = {tSPM-Net: A probabilistic spatio-temporal approach for scanpath prediction},
  author    = {Martin, Daniel and Gutierrez, Diego and Masia, Belen},
  journal   = {Computers \& Graphics},
  pages     = {103983},
  year      = {2024},
  publisher = {Elsevier}
}


This work has been supported by grant PID2022-141539NB-I00, funded by MICIU/AEI/10.13039/501100011033 and by ERDF, EU.