Abstract

Intuitively editing the appearance of materials from a single image is a challenging task, given the complexity of the interactions between light and matter and the ambiguity of human perception. This problem has traditionally been addressed by estimating additional factors of the scene, such as geometry or illumination, thus solving an inverse rendering problem and tying the final quality of the results to the quality of these estimations. We present a single-image appearance editing framework that allows us to intuitively modify the material appearance of an object by increasing or decreasing high-level perceptual attributes describing such appearance (e.g., glossy or metallic). Our framework takes as input an in-the-wild image of a single object, where geometry, material, and illumination are not controlled, and inverse rendering is not required. We rely on generative models and devise a novel architecture with Selective Transfer Unit (STU) cells that preserve the high-frequency details of the input image in the edited one. To train our framework, we leverage a dataset of pairs of synthetic images rendered with physically-based algorithms, together with the corresponding crowd-sourced ratings of high-level perceptual attributes. We show that our material editing framework outperforms the state of the art, and we showcase its applicability on synthetic images, in-the-wild real-world photographs, and video sequences.
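The STU cells mentioned in the abstract act as gated skip connections between the generator's encoder and decoder: they decide, per pixel and per channel, how much of the input's high-frequency detail to pass through, conditioned on the requested attribute change. Below is a minimal PyTorch sketch of such a GRU-style cell, in the spirit of the Selective Transfer Units of STGAN (Liu et al., 2019); the class name, channel sizes, kernel sizes, and exact gating equations are illustrative assumptions, not the paper's implementation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class STUCell(nn.Module):
        """Simplified GRU-style Selective Transfer Unit (sketch, after STGAN).

        Gates an encoder skip connection with a hidden state propagated from
        the coarser decoder level, conditioned on the requested attribute
        change, so that only detail consistent with the edit is transferred.
        All sizes and gating details here are illustrative, not the paper's.
        """

        def __init__(self, enc_ch: int, state_ch: int, n_attrs: int):
            super().__init__()
            in_ch = enc_ch + state_ch + n_attrs
            self.reset = nn.Conv2d(in_ch, state_ch, 3, padding=1)   # reset gate r
            self.update = nn.Conv2d(in_ch, enc_ch, 3, padding=1)    # update gate z
            self.hidden = nn.Conv2d(enc_ch + state_ch, enc_ch, 3, padding=1)

        def forward(self, f_enc, state, attr_diff):
            # Upsample the coarser-level state to the skip connection's
            # resolution and broadcast the attribute-difference vector
            # over the spatial grid.
            state = F.interpolate(state, size=f_enc.shape[-2:], mode="nearest")
            a = attr_diff[:, :, None, None].expand(-1, -1, *f_enc.shape[-2:])
            x = torch.cat([f_enc, state, a], dim=1)

            r = torch.sigmoid(self.reset(x))                  # what to forget
            z = torch.sigmoid(self.update(x))                 # where to edit
            s = r * state                                     # new hidden state
            f_hat = torch.tanh(self.hidden(torch.cat([f_enc, s], dim=1)))
            f_out = (1.0 - z) * f_enc + z * f_hat             # gated skip feature
            return f_out, s

    # Toy usage: a 64-channel skip connection, a 128-channel coarser state,
    # and a single perceptual attribute edit (e.g., "+0.5 glossiness").
    cell = STUCell(enc_ch=64, state_ch=128, n_attrs=1)
    f_enc = torch.randn(1, 64, 32, 32)
    state = torch.randn(1, 128, 16, 16)
    delta = torch.tensor([[0.5]])
    f_out, s = cell(f_enc, state, delta)
    print(f_out.shape, s.shape)

The point of the update gate z is that wherever the edit should not alter the image, the cell can copy the encoder feature through unchanged, which is what preserves the input's high-frequency content in the edited result.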

Bibtex

@article{subias23in-the-wild_editing,
  journal   = {Computer Graphics Forum},
  title     = {{In-the-wild Material Appearance Editing using Perceptual Attributes}},
  author    = {Subías, José Daniel and Lagunas, Manuel},
  year      = {2023},
  publisher = {The Eurographics Association and John Wiley \& Sons Ltd.},
  ISSN      = {1467-8659},
  DOI       = {10.1111/cgf.14765}
}

Acknowledgements

This project has received funding from the Government of Aragon's Departamento de Ciencia, Universidad y Sociedad del Conocimiento through the Reference Research Group "Graphics and Imaging Lab"; from the European Union's Horizon 2020 research and innovation programme under Marie Skłodowska-Curie grant agreement No. 956585 (PRIME); from the CHAMELEON project (European Union's Horizon 2020, European Research Council, grant agreement No. 682080); and from MCIN/AEI/10.13039/501100011033 through project PID2019-105004GB-I00.