
A Primer on Variational Inference for Physics-Informed Deep Generative Modelling

Authors:
Glyn-Davies, Alex
Vadeboncoeur, Arnaud
Akyildiz, O. Deniz
Kazlauskaite, Ieva
Girolami, Mark
Publication Year:
2024

Abstract

Variational inference (VI) is a computationally efficient and scalable methodology for approximate Bayesian inference. It strikes a balance between accuracy of uncertainty quantification and practical tractability. It excels at generative modelling and inversion tasks due to its built-in Bayesian regularisation and flexibility, essential qualities for physics-related problems. The derivation of VI's central learning objective must often be tailored to each new learning task, since the nature of the problem dictates the conditional dependencies between the variables of interest, as commonly arises in physics problems. In this paper, we provide an accessible and thorough technical introduction to VI for forward and inverse problems, guiding the reader through standard derivations of the VI framework and how it can best be realised through deep learning. We then review and unify recent literature exemplifying the creative flexibility allowed by VI. This paper is designed for a general scientific audience looking to solve physics-based problems with an emphasis on uncertainty quantification.
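
As a quick reference for the "central learning objective" the abstract refers to, the following is a minimal sketch of the standard evidence lower bound (ELBO) used in VI; the notation (x for data, z for latent variables, q_phi for the variational approximation, p_theta for the generative model) is an assumption for illustration, not taken from the paper itself.

    % Minimal ELBO sketch (notation assumed, not drawn from the paper):
    % for any variational distribution q_phi(z|x), the log-evidence is bounded below.
    \log p_\theta(x)
      \geq \mathbb{E}_{q_\phi(z \mid x)}\big[ \log p_\theta(x \mid z) \big]
      - \mathrm{KL}\big( q_\phi(z \mid x) \,\|\, p(z) \big)
      =: \mathcal{L}(\theta, \phi; x)

Maximising L(theta, phi; x) jointly over the generative parameters theta and variational parameters phi is the generic template that the paper tailors to forward and inverse problems with physics-informed conditional dependencies.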

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2409.06560
Document Type:
Working Paper