A primer on variational inference for physics-informed deep generative modelling
Author(s)
Glyn-Davies, Alex
Vadeboncoeur, Arnaud
Akyildiz, Omer Deniz
Kazlauskaite, Ieva
Girolami, Mark
Type
Journal Article
Abstract
Variational inference (VI) is a computationally efficient and scalable methodology for approximate Bayesian inference. It strikes a balance between the accuracy of uncertainty quantification and practical tractability. It excels at generative modelling and inversion tasks due to its built-in Bayesian regularisation and flexibility, essential qualities for physics-related problems. For such problems, the underlying physical model determines the dependence between variables of interest, which in turn requires a tailored derivation of the central VI learning objective. Furthermore, in many physical inference applications this structure has rich meaning and is essential for accurately capturing the dynamics of interest. In this paper, we provide an accessible and thorough technical introduction to VI for forward and inverse problems, guiding the reader through standard derivations of the VI framework and how it can best be realised through deep learning. We then review and unify recent literature exemplifying the flexibility allowed by VI. This paper is designed for a general scientific audience looking to solve physics-based problems, with an emphasis on uncertainty quantification.
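As a brief, illustrative sketch of the learning objective the abstract refers to (the notation below is generic and not taken from the paper): for observations y, a latent physical quantity u, a prior p(u) and a physics-based likelihood p(y | u), VI fits a parametrised approximation q_phi(u) to the posterior by maximising the evidence lower bound (ELBO), written in LaTeX as

    % Generic ELBO for a Bayesian inverse problem (illustrative notation)
    \log p(y) \;\ge\; \mathcal{L}(\phi)
      \;=\; \mathbb{E}_{q_\phi(u)}\!\big[\log p(y \mid u)\big]
      \;-\; \mathrm{KL}\!\big(q_\phi(u) \,\|\, p(u)\big),

which is equivalent to minimising \mathrm{KL}\big(q_\phi(u) \,\|\, p(u \mid y)\big): maximising the ELBO trades off fit to the data against the Bayesian regularisation supplied by the prior, the balance the abstract highlights.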
Date Acceptance
2025-03-24
Citation
Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences
ISSN
1364-503X
Publisher
The Royal Society
Journal / Book Title
Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences
Copyright Statement
Subject to copyright.
Publication Status
Accepted