Univariate Mean Change Point Detection: Penalization, CUSUM and Optimality
- Publication Year :
- 2018
-
Abstract
- The problem of univariate mean change point detection and localization based on a sequence of $n$ independent observations with piecewise constant means has been intensively studied for more than half a century, and serves as a blueprint for change point problems in more complex settings. We provide a complete characterization of this classical problem in a general framework in which the upper bound $\sigma^2$ on the noise variance, the minimal spacing $\Delta$ between two consecutive change points and the minimal magnitude $\kappa$ of the changes are allowed to vary with $n$. We first show that consistent localization of the change points is impossible when the signal-to-noise ratio $\frac{\kappa \sqrt{\Delta}}{\sigma} < \sqrt{\log(n)}$. In contrast, when $\frac{\kappa \sqrt{\Delta}}{\sigma}$ diverges with $n$ at a rate of at least $\sqrt{\log(n)}$, we demonstrate that two computationally efficient change point estimators, one based on the solution to an $\ell_0$-penalized least squares problem and the other on the popular wild binary segmentation algorithm, are both consistent and achieve a localization rate of the order $\frac{\sigma^2}{\kappa^2} \log(n)$. We further show that this rate is minimax optimal, up to a $\log(n)$ term.
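The CUSUM statistic underlying the methods discussed in the abstract can be illustrated with a minimal sketch. The code below is not the paper's $\ell_0$-penalized or wild binary segmentation procedure; it is only the basic single-change-point CUSUM scan on simulated data, with the standardization $\sqrt{t(n-t)/n}\,|\bar{x}_{1:t} - \bar{x}_{t+1:n}|$ assumed as the form of the statistic:

```python
import numpy as np

def cusum_stat(x, t):
    """Standardized CUSUM statistic comparing x[:t] and x[t:]."""
    n = len(x)
    left, right = x[:t], x[t:]
    return np.sqrt(t * (n - t) / n) * abs(left.mean() - right.mean())

def locate_change(x):
    """Estimate a single change point as the argmax of the CUSUM scan."""
    n = len(x)
    stats = [cusum_stat(x, t) for t in range(1, n)]
    return 1 + int(np.argmax(stats))

# Simulated example: one mean change of magnitude kappa = 3 at index 100,
# noise sigma = 1, so the signal-to-noise ratio far exceeds sqrt(log n).
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(3.0, 1.0, 100)])
print(locate_change(x))
```

In this high signal-to-noise regime the localization error should be of the order $\frac{\sigma^2}{\kappa^2}\log(n)$, i.e. only a handful of indices around the true change point at 100.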
- Subjects :
- Mathematics - Statistics Theory
Statistics - Methodology
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.1810.09498
- Document Type :
- Working Paper