Extended Variational Message Passing for Automated Approximate Bayesian Inference
- Author
- Akbayrak, Semih; Bocharov, Ivan; de Vries, Bert
- Subjects
- *BAYESIAN analysis, *CONDITIONAL expectations, *PROBABILISTIC generative models, *DISTRIBUTION (Probability theory), *EXPONENTIAL families (Statistics), *ALGORITHMS, *PROBABILISTIC databases
- Abstract
Variational Message Passing (VMP) provides an automatable and efficient algorithmic framework for approximate Bayesian inference in factorized probabilistic models that consist of conjugate exponential family distributions. The automation of Bayesian inference tasks is important because many data processing problems can be formulated as inference tasks on a generative probabilistic model. However, accurate generative models may also contain deterministic and possibly nonlinear variable mappings and non-conjugate factor pairs that complicate the automatic execution of the VMP algorithm. In this paper, we show that executing VMP in complex models relies on the ability to compute the expectations of the statistics of hidden variables. We extend the applicability of VMP by approximating the required expectation quantities in appropriate cases by importance sampling and Laplace approximation. As a result, the proposed Extended VMP (EVMP) approach supports automated efficient inference for a very wide range of probabilistic model specifications. We implemented EVMP in the Julia language in the probabilistic programming package ForneyLab.jl and show by a number of examples that EVMP yields an almost universal inference engine for factorized probabilistic models. [ABSTRACT FROM AUTHOR]
- Published
- 2021
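The abstract's key approximation step can be illustrated in isolation: when a factor is non-conjugate, the expectation of a statistic of a hidden variable has no closed form and can be estimated by self-normalized importance sampling. The following is a minimal Python sketch of that idea on a toy unnormalized density (not the authors' Julia/ForneyLab.jl implementation; the density, proposal, and statistic are chosen here purely for illustration).

```python
import numpy as np

rng = np.random.default_rng(0)

# Unnormalized target p~(x) ∝ exp(-x^4 / 4): a toy stand-in for a
# non-conjugate factor whose moments lack a closed form.
def p_tilde(x):
    return np.exp(-x**4 / 4.0)

# Proposal q(x) = N(0, 1): easy to sample from and overlapping the target.
xs = rng.standard_normal(100_000)
q_pdf = np.exp(-xs**2 / 2.0) / np.sqrt(2.0 * np.pi)

# Self-normalized importance weights w_i ∝ p~(x_i) / q(x_i); normalizing
# the weights removes the need to know the target's normalizing constant.
w = p_tilde(xs) / q_pdf
w /= w.sum()

# Approximate E[x^2], the kind of expected sufficient statistic a VMP
# message requires.
e_x2 = float(np.sum(w * xs**2))
print(e_x2)
```

The same estimator applies to any statistic f(x) by replacing `xs**2` with `f(xs)`; the Laplace alternative mentioned in the abstract would instead fit a Gaussian at the mode of the target and take the expectation under that Gaussian.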