UCL IRIS
Institutional Research Information Service
Publication Detail
Machine Learning Applications in Appearance Modelling
  • Publication Type:
    Thesis/Dissertation
  • Authors:
    Sztrajman A
  • Date awarded:
    2022
  • Awarding institution:
    UCL (University College London)
  • Language:
    English
Abstract
In this thesis, we address multiple applications of machine learning in appearance modelling. We do so by leveraging data-driven approaches, guided by image-based error metrics, to generate new representations of material appearance and scene illumination.

We first address the interchange of material appearance between different analytic representations, through an image-based optimisation of BRDF model parameters. We analyse the method in terms of stability with respect to variations of the BRDF parameters, and show that it can be used for material interchange between different renderers and workflows, without the need to access shader implementations. We extend our method to enable the remapping of spatially-varying materials, by presenting two regression schemes that allow us to learn the transformation of parameters between models and apply it to texture maps at fast rates.

Next, we centre on the efficient representation and rendering of measured material appearance. We develop a neural BRDF representation that provides high-quality reconstruction with low storage and competitive evaluation times, comparable with analytic models. Our method compares favourably against other representations in terms of reconstruction accuracy, and we show that it can also be used to encode anisotropic materials. In addition, we generate a unified encoding of real-world materials via a meta-learning autoencoder architecture guided by a differentiable rendering loss. This enables the generation of new realistic materials by interpolation of embeddings, and the fast estimation of material properties. We show that this can be leveraged for efficient rendering through importance sampling, by predicting the parameters of an invertible analytic BRDF model.

Finally, we design a hybrid representation for high-dynamic-range illumination that combines a convolutional autoencoder-based encoding for low-intensity light with a parametric model for high-intensity light. Our model provides a flexible, compact encoding for environment maps, while also preserving an accurate reconstruction of the high-intensity component, appropriate for rendering purposes. We utilise our light encodings in a second convolutional neural network trained for light prediction from a single outdoor face portrait at interactive rates, with potential applications in real-time lighting estimation and 3D object insertion.
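To make the image-based remapping idea concrete, the following is a minimal Python/PyTorch sketch: a "source" Blinn-Phong lobe is evaluated over a toy one-dimensional grid of half-angles, and the parameters of a "target" GGX-style lobe are fitted by gradient descent on the pixel-wise image error. The specific BRDF models, parameter names and toy "rendering" are illustrative assumptions, not the optimisation pipeline used in the thesis.

    # Minimal sketch (not the thesis' implementation): image-based remapping of
    # analytic BRDF parameters. A source Blinn-Phong lobe is rendered over a grid
    # of half-angles, and the parameters of a target single-lobe GGX-style model
    # are fitted by gradient descent on the pixel-wise image loss.
    import math
    import torch

    # Grid of angles between half-vector and normal: a toy stand-in for rendering
    # a sphere under a fixed light/view configuration.
    theta_h = torch.linspace(0.0, 1.5, 256)
    cos_h = torch.cos(theta_h)

    def blinn_phong(cos_h, kd, ks, n):
        return kd + ks * cos_h.clamp(min=0.0) ** n

    def ggx_like(cos_h, kd, ks, alpha):
        # Isotropic GGX normal-distribution term, used here as a toy specular lobe.
        a2 = alpha ** 2
        d = a2 / (math.pi * ((cos_h ** 2) * (a2 - 1.0) + 1.0) ** 2)
        return kd + ks * d

    # Source material: fixed Blinn-Phong parameters.
    src_img = blinn_phong(cos_h, kd=0.2, ks=0.8, n=100.0)

    # Target parameters, optimised so the two "renderings" match.
    params = torch.tensor([0.1, 0.5, 0.3], requires_grad=True)  # kd, ks, alpha
    opt = torch.optim.Adam([params], lr=1e-2)

    for step in range(2000):
        kd, ks, alpha = params.unbind()
        tgt_img = ggx_like(cos_h, kd, ks, alpha.clamp(min=1e-3))
        loss = torch.mean((tgt_img - src_img) ** 2)   # image-based error metric
        opt.zero_grad()
        loss.backward()
        opt.step()

    print("fitted (kd, ks, alpha):", params.detach().tolist())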
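The neural BRDF representation can likewise be sketched, under stated assumptions, as a small MLP that maps a direction parameterisation (here the Rusinkiewicz angles theta_h, theta_d, phi_d) to RGB reflectance and is fitted to reflectance samples. The toy analytic target below stands in for measured data such as MERL, and the network size and log-space loss are illustrative choices rather than the thesis' exact architecture.

    # Minimal sketch (assumptions throughout): a small MLP used as a neural BRDF,
    # mapping Rusinkiewicz angles (theta_h, theta_d, phi_d) to RGB reflectance.
    import torch
    import torch.nn as nn

    class NeuralBRDF(nn.Module):
        def __init__(self, hidden=32):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(3, hidden), nn.ReLU(),
                nn.Linear(hidden, hidden), nn.ReLU(),
                nn.Linear(hidden, 3),            # RGB reflectance
            )

        def forward(self, angles):
            # Exponential output keeps the predicted reflectance positive.
            return torch.exp(self.net(angles))

    model = NeuralBRDF()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)

    # Toy "measured" data: random angle triples with an analytic specular target
    # standing in for a real measured BRDF.
    angles = torch.rand(4096, 3) * torch.tensor([1.57, 1.57, 3.14])
    target = (torch.cos(angles[:, :1]) ** 50).repeat(1, 3) + 0.05

    for step in range(500):
        pred = model(angles)
        # Log-space loss compresses the high dynamic range of specular peaks.
        loss = torch.mean((torch.log1p(pred) - torch.log1p(target)) ** 2)
        opt.zero_grad()
        loss.backward()
        opt.step()

    print("final loss:", loss.item())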
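Finally, the hybrid illumination model can be illustrated, purely as an assumption-laden sketch, by splitting an HDR environment map at an intensity threshold: the clamped low-intensity part would be compressed by a convolutional autoencoder (omitted here), while the high-intensity residual is summarised by a few parameters (a centroid direction and total energy for the sun). The threshold, the synthetic map and the parametric summary are illustrative only.

    # Minimal sketch (assumptions only): splitting an HDR environment map into a
    # low-intensity component, which a convolutional autoencoder would compress,
    # and a high-intensity component summarised by a small parametric model.
    import numpy as np

    H, W = 64, 128
    # Toy HDR environment map: low-intensity sky plus a very bright sun region.
    env = np.random.rand(H, W) * 2.0
    env[10:13, 40:43] += 500.0

    threshold = 10.0
    low = np.minimum(env, threshold)          # input to the (omitted) autoencoder
    high = np.maximum(env - threshold, 0.0)   # modelled parametrically

    # Parametric summary of the high-intensity light: centroid position + energy.
    ys, xs = np.nonzero(high)
    weights = high[ys, xs]
    sun_y = np.average(ys, weights=weights)
    sun_x = np.average(xs, weights=weights)
    sun_energy = high.sum()

    print(f"sun centre ~ ({sun_y:.1f}, {sun_x:.1f}), energy ~ {sun_energy:.1f}")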
Publication data is maintained in RPS. Visit https://rps.ucl.ac.uk
UCL Researchers
  • Author:
    Sztrajman A (Dept of Computer Science)