Partial gradient optimal thresholding algorithms for a class of sparse optimization problems
- Source : Journal of Global Optimization, 84:393-413
- Publication Year : 2022
- Publisher : Springer Science and Business Media LLC, 2022.
Abstract
- Optimization problems with a sparsity constraint form an important class of global optimization problems. A typical family of thresholding algorithms for such problems uses the traditional full steepest descent direction or a Newton-like direction as the search direction to generate an iterate, on which a certain thresholding is then performed. Traditional hard thresholding discards a large part of a vector, so important information contained in a dense vector is lost in the thresholding process. A recent study (Zhao, SIAM J Optim 30(1):31-55, 2020) shows that hard thresholding should be applied to a compressible vector rather than a dense one to avoid a large loss of information. On the other hand, optimal k-thresholding, a novel thresholding technique, may overcome this intrinsic drawback of hard thresholding by performing thresholding and objective-function minimization simultaneously. This motivates us to propose the partial gradient optimal thresholding (PGOT) method and its relaxed versions in this paper. PGOT integrates the partial gradient with the optimal k-thresholding technique. Solution error bounds and convergence for the proposed algorithms are established under suitable conditions. The application of these results to sparse optimization problems arising in signal recovery is also discussed. Experimental results on synthetic data indicate that the proposed algorithms are efficient and comparable to several existing algorithms.
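- For readers unfamiliar with the hard thresholding operator mentioned in the abstract, the sketch below is a minimal NumPy illustration, not the paper's PGOT method: the function name and the test vectors are assumptions chosen only to show how keeping the k largest-magnitude entries loses much more information on a dense vector than on a compressible one.

```python
# Minimal sketch (illustrative assumption, not the PGOT algorithm from the paper).
# hard_threshold implements the operator H_k(x): keep the k largest-magnitude
# entries of x and set all other entries to zero.
import numpy as np

def hard_threshold(x: np.ndarray, k: int) -> np.ndarray:
    """Return H_k(x): the k largest-magnitude entries of x, zeros elsewhere."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]  # indices of the k largest |x_i|
    out[idx] = x[idx]
    return out

# Compare the information lost when H_k is applied to a dense vector
# versus a compressible vector (entries decaying rapidly in magnitude).
rng = np.random.default_rng(0)
dense = rng.standard_normal(100)                        # energy spread over all entries
compressible = dense * np.exp(-0.2 * np.arange(100))    # entries decay quickly

k = 10
for name, v in [("dense", dense), ("compressible", compressible)]:
    rel_loss = np.linalg.norm(v - hard_threshold(v, k)) / np.linalg.norm(v)
    print(f"{name:>13}: relative loss after H_k = {rel_loss:.3f}")
```

- In this toy comparison the relative loss is large for the dense vector and small for the compressible one, which is the observation (from the cited 2020 study) motivating the application of thresholding to compressible iterates rather than dense ones.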
Details
- ISSN : 1573-2916 and 0925-5001
- Volume : 84
- Database : OpenAIRE
- Journal : Journal of Global Optimization
- Accession number : edsair.doi...........c36c6b467dcfa5a11af180348c766797