
Probabilistic inference of Bayesian neural networks with generalized expectation propagation.

Authors :
Zhao, Jing
Liu, Xiao
He, Shaojie
Sun, Shiliang
Source :
Neurocomputing. Oct2020, Vol. 412, p392-398. 7p.
Publication Year :
2020

Abstract

Deep learning plays an important role in the field of machine learning. However, deterministic methods such as standard neural networks cannot capture model uncertainty. Bayesian neural networks (BNNs) have recently attracted attention since Bayesian models provide a theoretical framework for inferring model uncertainty. Because an analytical solution for BNNs is often intractable, an effective and efficient approximate inference method is essential for model training and prediction. The generalized version of expectation propagation (GEP) was recently proposed and is considered a powerful approximate inference method, based on minimizing the Kullback–Leibler (KL) divergence between the true posterior and the approximate distribution. In this paper, we further instantiate GEP to provide an effective and efficient approximate inference method for BNNs. We assess this method on BNNs, including fully connected neural networks and convolutional neural networks, on multiple benchmark datasets, and show better performance than several state-of-the-art approximate inference methods. [ABSTRACT FROM AUTHOR]
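To make the KL-based idea in the abstract concrete: for expectation-propagation-style methods, locally minimizing KL(p || q) over a Gaussian family q reduces to matching the mean and variance (moments) of the target distribution p. The sketch below is not the authors' GEP implementation; it is a minimal, hypothetical one-dimensional illustration of this moment-matching principle, using a toy unnormalized posterior over a single BNN weight (Gaussian prior times a sigmoid likelihood factor) evaluated on a grid.

```python
import numpy as np

def moment_match(log_p_unnorm, grid):
    """Return (mean, var) of the Gaussian q minimizing KL(p || q).

    For a Gaussian family, the KL(p || q) minimizer simply matches
    the first two moments of p, computed here by grid quadrature.
    """
    logp = log_p_unnorm(grid)
    p = np.exp(logp - logp.max())        # subtract max for numerical stability
    dw = grid[1] - grid[0]               # uniform grid spacing
    p /= p.sum() * dw                    # normalize to a proper density
    mean = (grid * p).sum() * dw
    var = ((grid - mean) ** 2 * p).sum() * dw
    return mean, var

# Toy posterior over one weight w: N(w; 0, 1) prior times a sigmoid factor
log_p = lambda w: -0.5 * w**2 + np.log(1.0 / (1.0 + np.exp(-2.0 * w)))
grid = np.linspace(-8.0, 8.0, 4001)
m, v = moment_match(log_p, grid)
```

Because the sigmoid factor favors positive weights, the matched mean comes out positive, and since the likelihood is log-concave, the matched variance falls below the prior variance of 1 — the qualitative behavior one expects from an EP-style Gaussian approximation.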

Details

Language :
English
ISSN :
0925-2312
Volume :
412
Database :
Academic Search Index
Journal :
Neurocomputing
Publication Type :
Academic Journal
Accession number :
145699486
Full Text :
https://doi.org/10.1016/j.neucom.2020.06.060