
Learning Model-Based Sparsity via Projected Gradient Descent.

Authors :
Bahmani, Sohail
Boufounos, Petros T.
Raj, Bhiksha
Source :
IEEE Transactions on Information Theory. Apr 2016, Vol. 62, Issue 4, p2092-2099. 8p.
Publication Year :
2016

Abstract

Several convex formulations have previously been proposed for statistical estimation with structured sparsity as the prior. These methods often require a carefully tuned regularization parameter, a cumbersome or heuristic exercise. Furthermore, the estimate they produce may not belong to the desired sparsity model, even if it accurately approximates the true parameter. Greedy-type algorithms can therefore be more desirable for estimating structured-sparse parameters. So far, these greedy methods have mostly focused on linear statistical models. In this paper, we study projected gradient descent with a non-convex structured-sparse parameter model as the constraint set. Provided the cost function has a stable model-restricted Hessian, the algorithm produces an approximation of the desired minimizer. As an example, we elaborate on the application of the main results to estimation in generalized linear models. [ABSTRACT FROM AUTHOR]
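To illustrate the algorithmic template the abstract describes, the sketch below runs projected gradient descent where the projection maps onto a non-convex sparsity model. The example is an assumption for illustration only: it uses plain s-sparsity (projection by hard thresholding) and a least-squares cost on synthetic data, which is the simplest instance of the structured-sparse setting; the paper itself treats general structured models and costs such as generalized linear models, and its specific guarantees are not reproduced here.

```python
import numpy as np

def hard_threshold(x, s):
    # Projection onto the (non-convex) set of s-sparse vectors:
    # keep the s largest-magnitude entries, zero out the rest.
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-s:]
    out[idx] = x[idx]
    return out

def projected_gradient_descent(grad, x0, s, step, iters=200):
    # Generic loop: gradient step on the cost, then project
    # back onto the sparsity model.
    x = x0
    for _ in range(iters):
        x = hard_threshold(x - step * grad(x), s)
    return x

# Toy noiseless sparse least-squares instance (hypothetical data).
rng = np.random.default_rng(0)
n, p, s = 100, 50, 3
x_true = np.zeros(p)
x_true[:s] = [1.0, -2.0, 1.5]
A = rng.standard_normal((n, p)) / np.sqrt(n)
y = A @ x_true

grad = lambda x: A.T @ (A @ x - y)      # gradient of 0.5*||A x - y||^2
step = 1.0 / np.linalg.norm(A, 2) ** 2  # step size from the Lipschitz bound
x_hat = projected_gradient_descent(grad, np.zeros(p), s, step)
```

With a well-conditioned random design such as this one, the iterates recover the sparse parameter; the paper's contribution is characterizing when such accuracy guarantees hold for non-quadratic costs via a stable model-restricted Hessian condition.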

Details

Language :
English
ISSN :
0018-9448
Volume :
62
Issue :
4
Database :
Academic Search Index
Journal :
IEEE Transactions on Information Theory
Publication Type :
Academic Journal
Accession number :
113872616
Full Text :
https://doi.org/10.1109/TIT.2016.2515078