
Weak Separation in Mixture Models and Implications for Principal Stratification

Authors :
Feller, Avi
Greif, Evan
Ho, Nhat
Miratrix, Luke
Pillai, Natesh
Source :
Grantee Submission. 2019.
Publication Year :
2019

Abstract

Principal stratification is a widely used framework for addressing post-randomization complications. After using principal stratification to define causal effects of interest, researchers are increasingly turning to finite mixture models to estimate these quantities. Unfortunately, standard estimators of mixture parameters, like the MLE, are known to exhibit pathological behavior. We study this behavior in a simple but fundamental example, a two-component Gaussian mixture model in which only the component means and variances are unknown, and focus on the setting in which the components are weakly separated. In this case, we show that the asymptotic convergence rate of the MLE is quite poor, such as O(n^{-1/6}) or even O(n^{-1/8}). We then demonstrate via theoretical arguments as well as extensive simulations that, in finite samples, the MLE behaves like a threshold estimator, in the sense that the MLE can give strong evidence that the means are equal when the truth is otherwise. We also explore the behavior of the MLE when it is non-zero, showing that it is difficult to estimate both the sign and magnitude of the means in this case. We provide diagnostics for all of these pathologies and apply these ideas to re-analyze two randomized evaluations of job training programs, JOBS II and Job Corps. Our results suggest that the corresponding maximum likelihood estimates should be interpreted with caution in these cases.
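
For intuition, the following is a minimal simulation sketch (not part of the record) of the weakly separated setting the abstract describes: data from a two-component Gaussian mixture with a small mean separation, fit by maximum likelihood via EM. The sample size and separation below are illustrative choices, and sklearn's fit also estimates the mixing weights, a slight relaxation of the setup in which only the means and variances are unknown.

    # Sketch: weakly separated two-component Gaussian mixture and its MLE (EM fit).
    # Illustrative only; "n" and "delta" are arbitrary choices, not values from the paper.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)

    n, delta = 2000, 0.2                     # weak separation: |mu1 - mu2| = 0.4, sd = 1
    z = rng.integers(0, 2, size=n)           # balanced component labels
    x = rng.normal(loc=np.where(z == 1, delta, -delta), scale=1.0, size=n)

    # Two-component Gaussian mixture MLE via EM (weights, means, variances all estimated).
    gm = GaussianMixture(n_components=2, n_init=10, random_state=0).fit(x.reshape(-1, 1))
    mu_hat = np.sort(gm.means_.ravel())

    print("true means:          ", (-delta, delta))
    print("estimated means:     ", mu_hat)
    print("estimated separation:", mu_hat[1] - mu_hat[0])
    # With weak separation, the fitted means can nearly coincide even though the true
    # means differ, or the separation can be estimated with large error, consistent
    # with the threshold-like behavior discussed in the abstract.

Re-running with different seeds or sample sizes gives a rough sense of how unstable the estimated separation is when the components overlap heavily.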

Details

Language :
English
Database :
ERIC
Journal :
Grantee Submission
Publication Type :
Report
Accession Number :
ED599271
Document Type :
Reports - Descriptive