Empirical study on variational inference methods for topic models.

Authors :
Chi, Jinjin
Ouyang, Jihong
Li, Ximing
Li, Changchun
Source :
Journal of Experimental & Theoretical Artificial Intelligence. Feb 2018, Vol. 30, Issue 1, p129-142. 14p.
Publication Year :
2018

Abstract

In topic modelling, the central computational problem is to approximate the posterior distribution given an observed collection. In practice we must resort to variational methods for this approximation; however, it is unclear which variational variant is the best choice in a given setting. In this paper, we focus on four inference methods for topic models, namely mean-field variational Bayes, collapsed variational Bayes, hybrid variational-Gibbs and expectation propagation, and aim to compare them systematically. We analyse them from two perspectives, i.e. the approximate posterior distribution and the type of α-divergence, and then empirically compare them on various datasets using two popular metrics. The empirical results largely match our analysis and indicate that CVB0 may be the best variational variant for topic models. [ABSTRACT FROM PUBLISHER]
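Here CVB0 denotes the zeroth-order approximation of collapsed variational Bayes, which replaces each token's collapsed Gibbs sampling step with a deterministic update of its topic responsibilities computed from expected counts. The sketch below is a minimal illustration of that update for LDA with symmetric Dirichlet priors; the function name, default hyperparameters and fixed iteration count are assumptions made for illustration, not details taken from this paper.

```python
import numpy as np

def cvb0(docs, n_topics, vocab_size, alpha=0.1, beta=0.01, n_iters=50, seed=0):
    """Zeroth-order collapsed variational Bayes (CVB0) for LDA.

    docs: list of documents, each a list of word ids in [0, vocab_size).
    Returns per-token topic responsibilities and the expected count matrices.
    Illustrative sketch: hyperparameters, iteration count and the absence of a
    convergence check are assumptions, not settings from the paper.
    """
    rng = np.random.default_rng(seed)
    # Random initial responsibilities: one (doc_length x n_topics) array per document.
    gamma = [rng.dirichlet(np.ones(n_topics), size=len(d)) for d in docs]

    # Expected counts implied by the current responsibilities.
    n_dk = np.array([g.sum(axis=0) for g in gamma])   # document-topic counts
    n_wk = np.zeros((vocab_size, n_topics))           # word-topic counts
    for d, g in zip(docs, gamma):
        for w, gi in zip(d, g):
            n_wk[w] += gi
    n_k = n_wk.sum(axis=0)                            # per-topic totals

    for _ in range(n_iters):
        for di, (d, g) in enumerate(zip(docs, gamma)):
            for i, w in enumerate(d):
                gi = g[i].copy()
                # Remove this token's current contribution from the expected counts.
                n_dk[di] -= gi
                n_wk[w] -= gi
                n_k -= gi
                # CVB0 update: proportional to the collapsed Gibbs conditional,
                # evaluated at expected counts instead of sampled assignments.
                new = (n_dk[di] + alpha) * (n_wk[w] + beta) / (n_k + vocab_size * beta)
                new /= new.sum()
                g[i] = new
                # Add the refreshed contribution back.
                n_dk[di] += new
                n_wk[w] += new
                n_k += new
    return gamma, n_dk, n_wk

# Example on a toy corpus of word-id lists (4-word vocabulary, 2 topics):
# gamma, n_dk, n_wk = cvb0([[0, 1, 2, 1], [2, 3, 3, 0]], n_topics=2, vocab_size=4)
```

Because the update works with soft expected counts rather than sampled topic assignments, CVB0 keeps roughly the per-token cost of collapsed Gibbs sampling while remaining deterministic, which is one reason it is often an attractive variational variant in practice.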

Details

Language :
English
ISSN :
0952-813X
Volume :
30
Issue :
1
Database :
Academic Search Index
Journal :
Journal of Experimental & Theoretical Artificial Intelligence
Publication Type :
Academic Journal
Accession number :
127071531
Full Text :
https://doi.org/10.1080/0952813X.2017.1409277