A primer on variational inference for physics-informed deep generative modelling

Glyn-Davies, A., Vadeboncoeur, A., Akyildiz, O. D., Kazlauskaite, I. & Girolami, M. (2025). A primer on variational inference for physics-informed deep generative modelling. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 383(2299). https://doi.org/10.1098/rsta.2024.0324

Variational inference (VI) is a computationally efficient and scalable methodology for approximate Bayesian inference. It strikes a balance between accuracy of uncertainty quantification and practical tractability. It excels at generative modelling and inversion tasks due to its built-in Bayesian regularization and flexibility, essential qualities for physics-related problems. For such problems, the underlying physical model determines the dependence between variables of interest, which in turn requires a tailored derivation of the central VI learning objective. Furthermore, in many physical inference applications, this structure has rich meaning and is essential for accurately capturing the dynamics of interest. In this paper, we provide an accessible and thorough technical introduction to VI for forward and inverse problems, guiding the reader through standard derivations of the VI framework and showing how it can best be realized through deep learning. We then review and unify recent literature exemplifying the flexibility allowed by VI. This paper is designed for a general scientific audience looking to solve physics-based problems, with an emphasis on uncertainty quantification. This article is part of the theme issue ‘Generative modelling meets Bayesian inference: a new paradigm for inverse problems’.
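For orientation, the "central VI learning objective" mentioned in the abstract is conventionally the evidence lower bound (ELBO). A standard form, written here in generic notation (the paper's own notation and physics-informed variants may differ), is:

```latex
\log p(x) \;\geq\; \mathcal{L}(q)
  \;=\; \mathbb{E}_{q(z)}\!\left[\log p(x \mid z)\right]
  \;-\; \mathrm{KL}\!\left(q(z) \,\|\, p(z)\right),
```

where $q(z)$ is the variational approximation to the posterior, $p(x \mid z)$ the likelihood, and $p(z)$ the prior; maximizing $\mathcal{L}(q)$ over a tractable family of $q$ yields the approximate posterior.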

Published Version
Creative Commons: Attribution 4.0
