
Pseudo-document simulation for comparing LDA, GSDMM and GPM topic models on short and sparse text using Twitter data.

Authors :
Weisser, Christoph
Gerloff, Christoph
Thielmann, Anton
Python, Andre
Reuter, Arik
Kneib, Thomas
Säfken, Benjamin
Source :
Computational Statistics. Jun 2023, Vol. 38, Issue 2, p647-674. 28p.
Publication Year :
2023

Abstract

Topic models are a useful and popular method to find latent topics of documents. However, the short and sparse texts in social media micro-blogs such as Twitter are challenging for the most commonly used Latent Dirichlet Allocation (LDA) topic model. We compare the performance of the standard LDA topic model with the Gibbs Sampler Dirichlet Multinomial Model (GSDMM) and the Gamma Poisson Mixture Model (GPM), which are specifically designed for sparse data. To compare the performance of the three models, we propose the simulation of pseudo-documents as a novel evaluation method. In a case study with short and sparse text, the models are evaluated on tweets filtered by keywords relating to the Covid-19 pandemic. We find that standard coherence scores that are often used for the evaluation of topic models perform poorly as an evaluation metric. The results of our simulation-based approach suggest that the GSDMM and GPM topic models may generate better topics than the standard LDA model. [ABSTRACT FROM AUTHOR]
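Illustration (not from the paper): the abstract notes that standard coherence scores are a poor evaluation metric for topic models on short, sparse text. The following minimal Python sketch shows how such a coherence score is typically computed, fitting a standard LDA model with gensim on a few toy tokenized short documents and reporting a c_v coherence value. The documents, topic count, and parameter settings are arbitrary placeholders, and the snippet does not reproduce the authors' pseudo-document simulation, GSDMM, or GPM models.

    # Sketch: standard LDA plus a c_v coherence score on toy short texts.
    # All texts and parameters below are illustrative placeholders.
    from gensim.corpora import Dictionary
    from gensim.models import LdaModel, CoherenceModel

    # Toy tokenized short documents, standing in for preprocessed tweets.
    texts = [
        ["covid", "vaccine", "dose"],
        ["lockdown", "school", "closed"],
        ["vaccine", "trial", "results"],
        ["mask", "mandate", "lockdown"],
    ]

    dictionary = Dictionary(texts)
    corpus = [dictionary.doc2bow(doc) for doc in texts]

    # Standard LDA with an arbitrary number of topics.
    lda = LdaModel(corpus=corpus, id2word=dictionary, num_topics=2,
                   passes=10, random_state=0)

    # c_v coherence: a widely used evaluation metric that, per the abstract,
    # performs poorly as an evaluation criterion on short and sparse text.
    coherence = CoherenceModel(model=lda, texts=texts, dictionary=dictionary,
                               coherence="c_v").get_coherence()
    print(f"c_v coherence: {coherence:.3f}")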

Details

Language :
English
ISSN :
0943-4062
Volume :
38
Issue :
2
Database :
Academic Search Index
Journal :
Computational Statistics
Publication Type :
Academic Journal
Accession number :
163849942
Full Text :
https://doi.org/10.1007/s00180-022-01246-z