Alternative methods for parameter estimation in discrete latent variable models
- Publication Year: 2022
Abstract
- The Expectation-Maximization (EM) algorithm is undoubtedly one of the most widely used techniques to estimate a discrete latent variable (DLV) model. However, while it is possible to prove that this algorithm converges to a local maximum of the log-likelihood function, there is no guarantee of convergence to the global maximum of this function. We propose two modifications of the EM algorithm to tackle this problem. The first incorporates a tempering scheme into the EM algorithm: the log-likelihood is initially flattened to escape local maxima and then gradually restored to its original shape. The second uses evolutionary computation to encourage a more thorough exploration of the parameter space. The performance of the resulting tempered EM (T-EM) and evolutionary EM (E-EM) algorithms is assessed for latent class and hidden Markov models in terms of both the ability to reach the global maximum and computational time; a comparison with the standard EM algorithm is carried out through an extensive Monte Carlo simulation study. We show that the proposed algorithms outperform the standard EM, significantly increasing the chance of reaching the global maximum in almost all the examined cases. This improvement remains considerable even when accounting for the inflated overall computing time.
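- To illustrate the tempering idea described in the abstract, the sketch below applies a tempered E-step to a simple latent class model with binary items. It is only a minimal illustration under stated assumptions: the geometric temperature schedule, the choice to temper the class posteriors by dividing the log-posterior by T (i.e., raising the posterior to the power 1/T), and the binary latent class model itself are assumptions of this sketch, not details taken from the paper.

```python
# Minimal sketch of a tempered EM (T-EM) iteration for a latent class model
# with binary items. Illustrative only: the tempering variant and schedule
# used here are assumptions, not the authors' exact scheme.
import numpy as np

def tempered_em(X, K, n_iter=200, T0=5.0, rho=0.95, seed=0):
    """X: (n, J) array of 0/1 responses; K: number of latent classes."""
    rng = np.random.default_rng(seed)
    n, J = X.shape
    pi = np.full(K, 1.0 / K)                      # class weights
    theta = rng.uniform(0.3, 0.7, size=(K, J))    # item response probabilities

    for t in range(n_iter):
        T = 1.0 + (T0 - 1.0) * rho ** t           # temperature decreasing to 1

        # E-step on the log scale: log p(x_i, class k)
        log_lik = (X @ np.log(theta).T + (1 - X) @ np.log(1 - theta).T
                   + np.log(pi))
        # Tempered posterior: divide by T and renormalize. For T > 1 the
        # class posteriors are flattened; as T -> 1 standard EM is recovered.
        log_post = log_lik / T
        log_post -= log_post.max(axis=1, keepdims=True)
        z = np.exp(log_post)
        z /= z.sum(axis=1, keepdims=True)         # (n, K) responsibilities

        # M-step: weighted maximum likelihood updates
        Nk = z.sum(axis=0) + 1e-12
        pi = Nk / n
        theta = (z.T @ X) / Nk[:, None]
        theta = np.clip(theta, 1e-6, 1 - 1e-6)    # avoid log(0)

    return pi, theta
```

As the temperature decreases toward 1, the tempered responsibilities approach the ordinary EM responsibilities, so the final iterations coincide with standard EM on the original log-likelihood; the early, flattened iterations are what give the algorithm a chance to move away from poor local maxima.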
Details
- Database: OAIster
- Notes: English
- Publication Type: Electronic Resource
- Accession number: edsoai.on1362213105
- Document Type: Electronic Resource