Efficient Low-Rank Semidefinite Programming With Robust Loss Functions.

Authors :
Yao, Quanming
Yang, Hansi
Hu, En-Liang
Kwok, James T.
Source :
IEEE Transactions on Pattern Analysis & Machine Intelligence. Oct 2022, Vol. 44 Issue 10, p6153-6168. 16p.
Publication Year :
2022

Abstract

In real-world applications, it is important for machine learning algorithms to be robust against data outliers or corruptions. In this paper, we focus on improving the robustness of a large class of learning algorithms that are formulated as low-rank semi-definite programming (SDP) problems. Traditional formulations use the square loss, which is notorious for being sensitive to outliers. We propose to replace this with more robust noise models, including the $\ell_1$-loss and other nonconvex losses. However, the resultant optimization problem becomes difficult as the objective is no longer convex or smooth. To alleviate this problem, we design an efficient algorithm based on majorization-minimization. The crux is on constructing a good optimization surrogate, and we show that this surrogate can be efficiently obtained by the alternating direction method of multipliers (ADMM). By properly monitoring ADMM's convergence, the proposed algorithm is empirically efficient and also theoretically guaranteed to converge to a critical point. Extensive experiments are performed on four machine learning applications using both synthetic and real-world data sets. Results show that the proposed algorithm is not only fast but also outperforms the state of the art. [ABSTRACT FROM AUTHOR]
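To illustrate the majorization-minimization idea behind the abstract, the toy sketch below fits a low-rank model under an $\ell_1$-loss: at each outer step the $\ell_1$ objective is majorized by a weighted square loss (the classic IRLS bound $|r| \le r^2/(2|r_k|) + |r_k|/2$), and the resulting surrogate is decreased inexactly by a few alternating least-squares sweeps. Note the paper solves its surrogate with ADMM on an SDP formulation; the ALS inner solver, the function name, and all parameters here are illustrative assumptions, not the authors' method.

```python
import numpy as np

def robust_lowrank_mm(M, rank=1, outer_iters=30, inner_iters=5,
                      eps=1e-6, seed=0):
    """Toy MM scheme for  min_{rank(X) <= r}  sum_ij |X_ij - M_ij|.

    Hypothetical sketch: the l1 loss is majorized by a weighted square
    loss at each outer iterate (IRLS weights), and the weighted low-rank
    surrogate is reduced by a few alternating least-squares sweeps in
    place of the ADMM inner solver used in the paper.
    """
    rng = np.random.default_rng(seed)
    m, n = M.shape
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((n, rank))
    for _ in range(outer_iters):
        # Majorization step: IRLS weights from the current residuals.
        R = U @ V.T - M
        W = 1.0 / np.maximum(np.abs(R), eps)
        # Inexact minimization of the weighted square-loss surrogate.
        for _ in range(inner_iters):
            for i in range(m):  # row-wise weighted least squares for U
                Wi = np.diag(W[i])
                U[i] = np.linalg.solve(V.T @ Wi @ V + eps * np.eye(rank),
                                       V.T @ Wi @ M[i])
            for j in range(n):  # row-wise weighted least squares for V
                Wj = np.diag(W[:, j])
                V[j] = np.linalg.solve(U.T @ Wj @ U + eps * np.eye(rank),
                                       U.T @ Wj @ M[:, j])
    return U @ V.T
```

On a low-rank matrix corrupted by a few large outliers, the $\ell_1$ surrogate lets the fit ignore the corrupted entries, whereas a square-loss fit would be pulled toward them.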

Details

Language :
English
ISSN :
0162-8828
Volume :
44
Issue :
10
Database :
Academic Search Index
Journal :
IEEE Transactions on Pattern Analysis & Machine Intelligence
Publication Type :
Academic Journal
Accession number :
159210529
Full Text :
https://doi.org/10.1109/TPAMI.2021.3085858