
Learning Deep Generative Models With Doubly Stochastic Gradient MCMC.

Authors :
Du, Chao
Zhu, Jun
Zhang, Bo
Source :
IEEE Transactions on Neural Networks & Learning Systems. Jul 2018, Vol. 29, Issue 7, p3084-3096. 13p.
Publication Year :
2018

Abstract

Deep generative models (DGMs), which are often organized in a hierarchical manner, provide a principled framework for capturing the underlying causal factors of data. Recent work on DGMs has focused on the development of efficient and scalable variational inference methods that learn a single model under mean-field or parameterization assumptions. However, little work has been done on extending Markov chain Monte Carlo (MCMC) methods to Bayesian DGMs, which enjoy many advantages compared with variational methods. We present doubly stochastic gradient MCMC, a simple and generic method for (approximate) Bayesian inference of DGMs in a collapsed continuous parameter space. At each MCMC sampling step, the algorithm randomly draws a mini-batch of data samples to estimate the gradient of the log-posterior and further estimates the intractable expectation over hidden variables via a neural adaptive importance sampler, where the proposal distribution is parameterized by a deep neural network and learned jointly with the sampling process. We demonstrate the effectiveness of learning various DGMs on a wide range of tasks, including density estimation, data generation, and missing data imputation. Our method outperforms many state-of-the-art competitors. [ABSTRACT FROM AUTHOR]
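The abstract describes two nested sources of stochasticity in the gradient estimate: mini-batch subsampling of the data, and importance sampling over the hidden variables with a learned proposal. The sketch below is a minimal illustration of that structure, not the paper's method: it uses a toy linear-Gaussian latent variable model, a fixed Gaussian proposal in place of the neural adaptive importance sampler, and plain stochastic gradient Langevin dynamics as the sampler. All names, step sizes, and model choices are illustrative assumptions.

```python
# Minimal sketch of a "doubly stochastic" gradient MCMC step (assumed toy setup).
# Stochasticity 1: mini-batch subsampling of the data.
# Stochasticity 2: importance-sampling estimate of the expectation over hidden variables.
import numpy as np

rng = np.random.default_rng(0)

# Toy latent variable model: h ~ N(0, 1), x | h, theta ~ N(theta * h, 1),
# so marginally x ~ N(0, theta^2 + 1). The paper's DGMs are deep; this is only a stand-in.
true_theta = 2.0
N = 5000                                                  # full data set size
data = true_theta * rng.standard_normal(N) + rng.standard_normal(N)

def log_joint_and_grad(x, h, theta):
    """log p(x, h | theta) (up to a constant) and its gradient w.r.t. theta."""
    log_p = -0.5 * h**2 - 0.5 * (x - theta * h) ** 2
    grad = (x - theta * h) * h
    return log_p, grad

def is_grad_log_lik(x, theta, n_samples=50):
    """Self-normalized importance-sampling estimate of d/dtheta log p(x | theta).

    The proposal q(h | x) is a fixed Gaussian here; in the paper it is a deep
    network adapted jointly with the sampling process."""
    mu_q, sd_q = theta * x / (theta**2 + 1.0), 1.0        # crude, hand-picked proposal
    h = mu_q + sd_q * rng.standard_normal(n_samples)
    log_q = -0.5 * ((h - mu_q) / sd_q) ** 2 - np.log(sd_q)
    log_p, grad = log_joint_and_grad(x, h, theta)
    log_w = log_p - log_q
    w = np.exp(log_w - np.max(log_w))                     # stabilized importance weights
    return np.sum(w * grad) / np.sum(w)

# Stochastic gradient Langevin dynamics over theta (one possible stochastic
# gradient MCMC sampler; chosen here only for illustration).
theta, eps, batch_size = 0.5, 1e-4, 100
for t in range(1500):
    batch = rng.choice(data, size=batch_size, replace=False)
    grad_lik = (N / batch_size) * np.sum([is_grad_log_lik(x, theta) for x in batch])
    grad_prior = -theta                                   # standard normal prior on theta
    theta += 0.5 * eps * (grad_prior + grad_lik) + np.sqrt(eps) * rng.standard_normal()

print("approximate posterior sample for theta:", theta)
```

Under these assumptions the chain drifts toward theta near 2, the value used to generate the toy data; the two estimators (mini-batch and importance sampling) are what make the gradient "doubly stochastic."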

Details

Language :
English
ISSN :
2162-237X
Volume :
29
Issue :
7
Database :
Academic Search Index
Journal :
IEEE Transactions on Neural Networks & Learning Systems
Publication Type :
Periodical
Accession number :
130351518
Full Text :
https://doi.org/10.1109/TNNLS.2017.2688499