
PSNA: A pathwise semismooth Newton algorithm for sparse recovery with optimal local convergence and oracle properties.

Authors :
Huang, Jian
Jiao, Yuling
Lu, Xiliang
Shi, Yueyong
Yang, Qinglong
Yang, Yuanyuan
Source :
Signal Processing. May 2022, Vol. 194.
Publication Year :
2022

Abstract

We propose a pathwise semismooth Newton algorithm (PSNA) for sparse recovery in high-dimensional linear models. PSNA is derived from a formulation of the KKT conditions for the Lasso and Enet based on Newton derivatives. It solves the semismooth KKT equations efficiently by actively and continuously seeking the support of the regression coefficients along the solution path with warm starts. At each knot in the path, PSNA converges locally superlinearly for the Enet criterion and achieves the best possible convergence rate for the Lasso criterion, i.e., PSNA converges in just one step at the cost of two matrix-vector multiplications per iteration. Under certain regularity conditions on the design matrix and on the minimum magnitude of the nonzero elements of the target regression coefficients, we show that PSNA hits a solution with the same signs as the target regression coefficients and achieves a sharp estimation error bound in a finite number of steps with high probability. Extensive simulation studies support our theoretical results and indicate that PSNA is competitive with or outperforms state-of-the-art Lasso solvers in terms of efficiency and accuracy. [ABSTRACT FROM AUTHOR]
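
To make the algorithmic idea concrete, below is a minimal sketch of one plausible pathwise semismooth Newton iteration, assuming the standard primal-dual active set reading of the Lasso/Enet KKT conditions, with dual variable d = X^T(y - X*beta)/n - alpha*beta and the support estimated from |beta + d| > lambda. The function name psna_path and all parameter names are illustrative and hedged; this is not the authors' implementation, and the paper's exact update rules and convergence safeguards are omitted.

    import numpy as np

    def psna_path(X, y, lambdas, alpha=0.0, max_newton=20, tol=1e-10):
        """Sketch of a pathwise semismooth Newton (primal-dual active set)
        solver for Lasso (alpha = 0) and Enet (alpha > 0).

        KKT conditions used here, with d = X^T (y - X beta)/n - alpha*beta:
          beta_i = 0                        if |beta_i + d_i| <= lam (inactive)
          d_i    = lam * sign(beta_i + d_i) otherwise               (active)
        """
        n, p = X.shape
        beta = np.zeros(p)
        d = X.T @ y / n                    # dual variable at the zero start
        path = []
        for lam in lambdas:                # knots, typically decreasing
            for _ in range(max_newton):
                z = beta + d
                active = np.abs(z) > lam   # support estimate from the KKT system
                beta_new = np.zeros(p)
                if active.any():
                    XA = X[:, active]
                    # One semismooth Newton step: solve the KKT equations
                    # restricted to the current active set; the Enet ridge
                    # term alpha regularizes the active Gram matrix.
                    G = XA.T @ XA / n + alpha * np.eye(int(active.sum()))
                    rhs = XA.T @ y / n - lam * np.sign(z[active])
                    beta_new[active] = np.linalg.solve(G, rhs)
                d_new = X.T @ (y - X @ beta_new) / n - alpha * beta_new
                done = np.max(np.abs(beta_new - beta)) < tol
                beta, d = beta_new, d_new
                if done:
                    break
            path.append(beta.copy())       # solution warm-starts the next knot
        return np.array(path)

Solving on a grid of decreasing lambda values and warm-starting each knot from the previous solution is what makes the method "pathwise": near the previous knot the support estimate is typically already correct, so a single active-set linear solve suffices, matching the one-step local convergence described in the abstract.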

Details

Language :
English
ISSN :
0165-1684
Volume :
194
Database :
Academic Search Index
Journal :
Signal Processing
Publication Type :
Academic Journal
Accession number :
154946036
Full Text :
https://doi.org/10.1016/j.sigpro.2021.108432