Statistical foundation of Variational Bayes neural networks.

Authors :
Bhattacharya, Shrijita
Maiti, Tapabrata
Source :
Neural Networks. May 2021, Vol. 137, p151-173. 23p.
Publication Year :
2021

Abstract

Despite the popularity of Bayesian neural networks (BNNs) in recent years, their use remains somewhat limited in complex and big data situations due to the computational cost associated with full posterior evaluations. Variational Bayes (VB) provides a useful alternative that circumvents the computational cost and time complexity associated with generating samples from the true posterior using Markov Chain Monte Carlo (MCMC) techniques. The efficacy of VB methods is well established in the machine learning literature. However, their potential broader impact is hindered by a lack of theoretical validity from a statistical perspective. In this paper, we establish the fundamental result of posterior consistency for the mean-field variational posterior (VP) for a feed-forward artificial neural network model. The paper outlines the conditions needed to guarantee that the VP concentrates around Hellinger neighborhoods of the true density function. Additionally, the role of the scale parameter and its influence on the convergence rates are discussed. The paper mainly relies on two results: (1) the rate at which the true posterior grows, and (2) the rate at which the Kullback–Leibler (KL) distance between the posterior and the variational posterior grows. The theory provides a guideline for building prior distributions for BNNs, along with an assessment of the accuracy of the corresponding VB implementation. [ABSTRACT FROM AUTHOR]
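
The mean-field VB setup described in the abstract can be illustrated with a short sketch. The following is a minimal, hypothetical Bayes-by-backprop style implementation in PyTorch for a one-hidden-layer feed-forward network; it is not the authors' code, and the network width, prior scale, Gaussian likelihood noise, and optimizer settings are illustrative assumptions. The training objective is the negative evidence lower bound: the expected negative log-likelihood plus the KL divergence between the factorized Gaussian variational posterior and the prior, which is the KL quantity the paper's analysis tracks.

# Minimal sketch (not the authors' code): mean-field variational Bayes for a
# one-hidden-layer feed-forward network. All hyperparameters are illustrative.
import torch
import torch.nn.functional as F
from torch.distributions import Normal, kl_divergence

class MeanFieldLinear(torch.nn.Module):
    """Linear layer with a factorized Gaussian variational posterior over weights."""
    def __init__(self, d_in, d_out, prior_scale=1.0):
        super().__init__()
        self.w_mu = torch.nn.Parameter(0.1 * torch.randn(d_out, d_in))
        self.w_rho = torch.nn.Parameter(-3.0 * torch.ones(d_out, d_in))  # softplus(rho) = std
        self.b_mu = torch.nn.Parameter(torch.zeros(d_out))
        self.b_rho = torch.nn.Parameter(-3.0 * torch.ones(d_out))
        self.prior_scale = prior_scale

    def forward(self, x):
        # Reparameterization trick: sample weights, then apply the layer.
        w_std, b_std = F.softplus(self.w_rho), F.softplus(self.b_rho)
        w = self.w_mu + w_std * torch.randn_like(w_std)
        b = self.b_mu + b_std * torch.randn_like(b_std)
        return x @ w.t() + b

    def kl(self):
        # KL between the mean-field Gaussian posterior and an isotropic Gaussian prior.
        prior = Normal(0.0, self.prior_scale)
        kl_w = kl_divergence(Normal(self.w_mu, F.softplus(self.w_rho)), prior).sum()
        kl_b = kl_divergence(Normal(self.b_mu, F.softplus(self.b_rho)), prior).sum()
        return kl_w + kl_b

def neg_elbo(layers, x, y, noise_scale=0.1):
    # Negative ELBO = expected negative log-likelihood + KL(q || prior).
    pred = layers[1](torch.tanh(layers[0](x)))
    nll = -Normal(pred, noise_scale).log_prob(y).sum()
    return nll + layers[0].kl() + layers[1].kl()

# Toy regression data and training loop (illustrative only).
torch.manual_seed(0)
x = torch.linspace(-2, 2, 200).unsqueeze(1)
y = torch.sin(x) + 0.1 * torch.randn_like(x)
layers = torch.nn.ModuleList([MeanFieldLinear(1, 20), MeanFieldLinear(20, 1)])
opt = torch.optim.Adam(layers.parameters(), lr=1e-2)
for step in range(2000):
    opt.zero_grad()
    loss = neg_elbo(layers, x, y)
    loss.backward()
    opt.step()

Because the variational family is a fully factorized Gaussian over weights and biases, the KL term decomposes layer by layer and is available in closed form, which keeps the objective cheap to evaluate relative to MCMC sampling.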

Details

Language :
English
ISSN :
0893-6080
Volume :
137
Database :
Academic Search Index
Journal :
Neural Networks
Publication Type :
Academic Journal
Accession number :
149395866
Full Text :
https://doi.org/10.1016/j.neunet.2021.01.027