A subspace method for large-scale eigenvalue optimization
- Source :
- SIAM Journal on Matrix Analysis and Applications
- Publication Year :
- 2018
-
Abstract
- We consider the minimization or maximization of the Jth largest eigenvalue of an analytic and Hermitian matrix-valued function, building on Mengi, Yildirim, and Kilic [SIAM J. Matrix Anal. Appl., 35 (2014), pp. 699-724]. This work addresses the setting where the matrix-valued function involved is very large. We describe subspace procedures that convert the original problem into a small-scale one by means of orthogonal projections and restrictions to certain subspaces, and that gradually expand these subspaces based on the optimal solutions of the small-scale problems. Global convergence and superlinear rate-of-convergence results with respect to the dimensions of the subspaces are presented in the infinite-dimensional setting, where the matrix-valued function is replaced by a compact operator depending on parameters. In practice, it suffices to solve eigenvalue optimization problems involving matrices with sizes on the scale of tens, instead of the original problem involving matrices with sizes on the scale of thousands.
- Funding: OPTEC, Optimization in Engineering Center of KU Leuven; Research Foundation Flanders (FWO); project UCoCoS, European Union / European Commission; Scientific and Technological Research Council of Turkey (TÜBİTAK) - FWO (Research Foundation Flanders) joint programme; BAGEP program of The Science Academy of Turkey
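- The projection-and-expansion idea summarized in the abstract can be sketched under simplifying assumptions: a hypothetical one-parameter affine Hermitian family A(w) = A0 + w*A1 stands in for the analytic matrix-valued function, the largest eigenvalue (J = 1) is minimized over a fixed parameter grid, and the subspace is expanded with an eigenvector of the full matrix at the small-scale minimizer. This is an illustrative sketch, not the paper's algorithm in full generality.

```python
import numpy as np

# Hypothetical affine Hermitian family A(w) = A0 + w*A1 as a stand-in
# for a general analytic Hermitian matrix-valued function.
rng = np.random.default_rng(0)
n = 200  # "large-scale" stand-in dimension
A0 = rng.standard_normal((n, n)); A0 = (A0 + A0.T) / 2
A1 = rng.standard_normal((n, n)); A1 = (A1 + A1.T) / 2

def A(w):
    return A0 + w * A1

def largest_eig(M):
    # Largest eigenvalue and a corresponding unit eigenvector of Hermitian M.
    vals, vecs = np.linalg.eigh(M)
    return vals[-1], vecs[:, -1]

grid = np.linspace(-1.0, 1.0, 41)  # candidate parameter values

# Start from a one-dimensional subspace; orthonormalize its basis.
V = np.linalg.qr(rng.standard_normal((n, 1)))[0]
for _ in range(60):
    # Small-scale problem: minimize lambda_max(V^T A(w) V) over the grid.
    small = [largest_eig(V.T @ A(w) @ V)[0] for w in grid]
    w_star = grid[int(np.argmin(small))]
    lam_small = min(small)
    # Evaluate the full problem at the small-scale minimizer and expand
    # the subspace with the corresponding eigenvector.
    lam_full, v_full = largest_eig(A(w_star))
    V = np.linalg.qr(np.column_stack([V, v_full]))[0]
    if abs(lam_full - lam_small) < 1e-10:
        break  # projected and full eigenvalues agree at the minimizer

print(f"subspace dimension {V.shape[1]}, minimal lambda_max {lam_full:.6f}")
```

- Each small-scale eigenvalue problem here has the dimension of V (tens at most) rather than n, which mirrors the cost reduction claimed in the abstract; the compression V^T A(w) V also makes every projected eigenvalue a lower bound on the full one, so the gap at the minimizer serves as a stopping criterion.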
Details
- Database :
- OAIster
- Journal :
- SIAM Journal on Matrix Analysis and Applications
- Notes :
- pdf, English
- Publication Type :
- Electronic Resource
- Accession number :
- edsoai.on1200730935
- Document Type :
- Electronic Resource