Phase transitions for high dimensional clustering and related problems
- Author
- Zheng Tracy Ke, Wanjie Wang, and Jiashun Jin
- Subjects
- Statistics Theory (math.ST); Machine Learning (stat.ML); MSC: 62H30, 62H25 (Primary), 62G05, 62G10 (Secondary); phase transition; clustering; feature selection; hypothesis testing; comparison of experiments; lower bound; low-rank matrix recovery; $L^{1}$-distance; colored noise; phase space; random matrix
- Abstract
Consider a two-class clustering problem where we observe $X_{i}=\ell_{i}\mu+Z_{i}$, $Z_{i}\stackrel{\mathit{i.i.d.}}{\sim}N(0,I_{p})$, $1\leq i\leq n$. The feature vector $\mu\in\mathbb{R}^{p}$ is unknown but is presumably sparse. The class labels $\ell_{i}\in\{-1,1\}$ are also unknown, and the main interest is to estimate them.

We are interested in the statistical limits. In the two-dimensional phase space calibrating the rarity and strength of the useful features, we find the precise demarcation between the Region of Impossibility and the Region of Possibility. In the former, useful features are too rare/weak for successful clustering; in the latter, useful features are strong enough to allow successful clustering. The results are extended to the case of colored noise using Le Cam's idea on comparison of experiments.

We also extend the study of statistical limits from clustering to signal recovery and to global testing. We compare the statistical limits of the three problems and reveal some interesting insights.

We propose classical PCA and Important Features PCA (IF-PCA) for clustering. For a threshold $t>0$, IF-PCA clusters by applying classical PCA to all columns of $X$ with an $L^{2}$-norm larger than $t$. We also propose two aggregation methods. For any parameter in the Region of Possibility, some of these methods yield successful clustering.

We discover a phase transition for IF-PCA. For any threshold $t>0$, let $\xi^{(t)}$ be the first left singular vector of the post-selection data matrix. The phase space partitions into two different regions. In one region, there is a $t$ such that $\cos(\xi^{(t)},\ell)\rightarrow 1$ and IF-PCA yields successful clustering. In the other, $\cos(\xi^{(t)},\ell)\leq c_{0}<1$ for all $t>0$.

Our results require delicate analysis, especially on post-selection random matrix theory and on lower bound arguments.

(A minimal simulation sketch of this model and the IF-PCA step appears after this record.)
- Published
- 2017
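
The abstract describes the generative model and the IF-PCA thresholding-plus-PCA step in enough detail to simulate. The Python sketch below draws data from $X_{i}=\ell_{i}\mu+Z_{i}$ with a sparse $\mu$, keeps the columns of $X$ whose $L^{2}$-norm exceeds a threshold $t$, clusters by the signs of the first left singular vector $\xi^{(t)}$ of the post-selection matrix, and reports $|\cos(\xi^{(t)},\ell)|$ and the clustering error. This is not the authors' code: the parameter values (`n`, `p`, the sparsity `s`, the signal strength `tau`) and the threshold near $\sqrt{n}$ are illustrative assumptions, and the aggregation methods and colored-noise extension are omitted.

```python
# Minimal simulation sketch of the two-class model and an IF-PCA-style step.
# All parameter choices are illustrative, not the calibrations in the paper.
import numpy as np

rng = np.random.default_rng(0)

n, p = 200, 2000        # samples and dimension
s, tau = 40, 0.8        # number of useful features and their strength (assumed values)

# Sparse feature vector mu and unknown class labels ell in {-1, +1}
mu = np.zeros(p)
mu[rng.choice(p, size=s, replace=False)] = tau
ell = rng.choice([-1.0, 1.0], size=n)

# Data matrix: rows are X_i = ell_i * mu + Z_i with Z_i ~ N(0, I_p)
X = np.outer(ell, mu) + rng.standard_normal((n, p))

def if_pca(X, t):
    """Keep columns of X with L2-norm larger than t, then return the first
    left singular vector xi^(t) of the post-selection data matrix."""
    selected = np.linalg.norm(X, axis=0) > t
    if not selected.any():                      # nothing survives the threshold
        return np.ones(X.shape[0]) / np.sqrt(X.shape[0])
    U, _, _ = np.linalg.svd(X[:, selected], full_matrices=False)
    return U[:, 0]

t = np.sqrt(n) + 2.0        # illustrative threshold slightly above the null column norm
xi = if_pca(X, t)           # xi^(t)
labels = np.sign(xi)        # cluster by the signs of xi^(t)

# |cos(xi^(t), ell)| and the clustering error (up to the global sign ambiguity)
cos = abs(xi @ ell) / (np.linalg.norm(xi) * np.linalg.norm(ell))
err = min(np.mean(labels != ell), np.mean(labels != -ell))
print(f"|cos(xi, ell)| = {cos:.3f}, clustering error = {err:.3f}")
```

Clustering by the signs of $\xi^{(t)}$ is the simplest way to turn the singular vector into labels; it succeeds precisely when, as in the abstract's first region, some threshold $t$ gives $\cos(\xi^{(t)},\ell)\rightarrow 1$.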