Abstract

This work presents a perceptually motivated manifold of translucent appearance, designed for intuitive editing of translucent materials by navigating the manifold. Classic tools for editing translucent appearance, based on sliders that tune a number of optical parameters, are challenging for non-expert users: these parameters have a highly non-linear effect on appearance and exhibit complex interplay and similarity relations. Instead, we pose editing as a navigation task in a low-dimensional space of appearances, which abstracts away the underlying optical parameters. To achieve this, we build a low-dimensional, continuous manifold of translucent appearances that correlates with how humans perceive this type of material. We first analyze how well different distance metrics in image space correlate with human perception, and select the best-performing metric to build the low-dimensional manifold, which can then be used to navigate the space of translucent appearance. To evaluate the validity of the proposed manifold within its intended application scenario, we build an editing interface that leverages the manifold and relies on image navigation plus a fine-tuning step to edit appearance. We compare our intuitive interface to a traditional, slider-based one in a user study, demonstrating its effectiveness and superior performance when editing translucent objects.
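The paper defines its own metric selection and manifold construction; purely as a minimal sketch of the kind of pipeline the abstract describes, one can compute pairwise image-space distances between renderings of a translucent material and embed them into a low-dimensional space. Everything below is an illustrative assumption: the RMSE metric stands in for the perceptually validated metric, multidimensional scaling (MDS) stands in for the paper's manifold construction, and the images are random placeholders.

```python
# Hypothetical sketch: embed renderings of translucent materials into a
# low-dimensional "appearance space" from pairwise image distances.
# RMSE and MDS are illustrative stand-ins, not the paper's actual choices.
import numpy as np
from sklearn.manifold import MDS

def image_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Placeholder image-space metric (RMSE); the paper instead selects the
    metric that best correlates with human perception."""
    return float(np.sqrt(np.mean((a - b) ** 2)))

# Stand-in for renderings of one scene under varying scattering parameters;
# random data used here for illustration only.
rng = np.random.default_rng(0)
renderings = rng.random((20, 64, 64, 3))  # 20 images, 64x64 RGB

# Symmetric matrix of pairwise distances between all renderings.
n = len(renderings)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        D[i, j] = D[j, i] = image_distance(renderings[i], renderings[j])

# Embed into 2D so that Euclidean distances approximate the image distances;
# editing then becomes picking points in this 2D space instead of tuning
# optical parameters directly.
embedding = MDS(n_components=2, dissimilarity="precomputed",
                random_state=0).fit_transform(D)
print(embedding.shape)  # (20, 2): one 2D coordinate per appearance
```

Under these assumptions, navigation amounts to selecting a point in the 2D embedding and retrieving or interpolating the nearby appearances, which matches the interaction model the abstract describes.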

Bibtex

@article{lanza2024NavigatingTranslucent,
  journal   = {Computer Graphics Forum},
  title     = {{Navigating the Manifold of Translucent Appearance}},
  author    = {Lanza, Dario and Masia, Belen and Jarabo, Adrian},
  year      = {2024},
  publisher = {The Eurographics Association and John Wiley \& Sons Ltd.},
  ISSN      = {1467-8659},
  DOI       = {10.1111/cgf.15035}
}

Acknowledgements

This work is funded by the European Union's Horizon 2020 research and innovation program through the PRIME project (MSCA-ITN, grant agreement No. 956585), and by MCIN/AEI/10.13039/501100011033 through Project PID2019-105004GB-I00.