Affinity Regularized Non-Negative Matrix Factorization for Lifelong Topic Modeling.
- Source :
- IEEE Transactions on Knowledge & Data Engineering; Jul 2020, Vol. 32 Issue 7, p1249-1262, 14p
- Publication Year :
- 2020
Abstract
- The lifelong topic model (LTM), an emerging paradigm for never-ending topic learning, aims to yield higher-quality topics over time by accumulating knowledge from past learning and applying it to future tasks. In this paper, we propose a novel lifelong topic model based on non-negative matrix factorization (NMF), called Affinity Regularized NMF for LTM (NMF-LTM), which to the best of our knowledge is distinct from the popular LDA-based LTMs. NMF-LTM achieves lifelong learning by introducing a word-word graph Laplacian as a semantic affinity regularization. Other priors such as sparsity, diversity, and between-class affinity are incorporated as well for better performance, and a theoretical guarantee is provided for the algorithmic convergence to a local minimum. Extensive experiments on various public corpora demonstrate the effectiveness of NMF-LTM, particularly its human-like behavior in two carefully designed learning tasks and its ability to model topics in big data. A further exploration of semantic relatedness in knowledge graphs and a case study on a large-scale real-world corpus exhibit the strength of NMF-LTM in discovering high-quality topics efficiently and robustly. [ABSTRACT FROM AUTHOR]
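The abstract describes the core machinery only at a high level: an NMF factorization of the term-document matrix with a word-word graph Laplacian acting as a semantic affinity regularizer. As a point of reference, the sketch below shows a generic graph-regularized NMF with multiplicative updates; the function name, the single regularization weight `lam`, and the update rules are assumptions drawn from standard graph-regularized NMF, not the paper's exact NMF-LTM formulation (which additionally incorporates sparsity, diversity, and between-class affinity priors and the lifelong knowledge-accumulation mechanism).

```python
import numpy as np

def graph_regularized_nmf(X, A, k, lam=0.1, n_iter=200, eps=1e-10, seed=0):
    """Minimal graph-regularized NMF sketch (not the paper's exact NMF-LTM).

    X   : (n_words, n_docs) non-negative term-document matrix
    A   : (n_words, n_words) non-negative word-word affinity matrix
    k   : number of topics
    lam : weight of the graph-Laplacian term tr(W^T L W), with L = D - A
    """
    rng = np.random.default_rng(seed)
    n_words, n_docs = X.shape
    W = rng.random((n_words, k))   # word-topic factor, smoothed by the word graph
    H = rng.random((k, n_docs))    # topic-document factor
    D = np.diag(A.sum(axis=1))     # degree matrix of the word graph

    for _ in range(n_iter):
        # Multiplicative update for H: H <- H * (W^T X) / (W^T W H)
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        # Multiplicative update for W with Laplacian smoothing:
        # W <- W * (X H^T + lam * A W) / (W H H^T + lam * D W)
        W *= (X @ H.T + lam * (A @ W)) / (W @ H @ H.T + lam * (D @ W) + eps)

    return W, H
```

Under these assumptions, the largest entries in each column of `W` give a topic's representative words, and increasing `lam` pushes words that are strongly connected in `A` toward similar topic loadings, which is the role the semantic affinity regularization plays in the abstract.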
Details
- Language :
- English
- ISSN :
- 1041-4347
- Volume :
- 32
- Issue :
- 7
- Database :
- Complementary Index
- Journal :
- IEEE Transactions on Knowledge & Data Engineering
- Publication Type :
- Academic Journal
- Accession number :
- 143721604
- Full Text :
- https://doi.org/10.1109/TKDE.2019.2904687