
Efficient Adaptive Online Learning via Frequent Directions.

Authors:
Wan, Yuanyu
Zhang, Lijun
Source:
IEEE Transactions on Pattern Analysis & Machine Intelligence, Oct 2022, Vol. 44, Issue 10, p6910-6923. 14p.
Publication Year:
2022

Abstract

By employing time-varying proximal functions, adaptive subgradient methods (ADAGRAD) have improved the regret bound and been widely used in online learning and optimization. However, ADAGRAD with full-matrix proximal functions (ADA-FULL) cannot handle large-scale problems due to its impractical $O(d^3)$ time and $O(d^2)$ space complexities, though it performs better when the gradients are correlated. In this paper, we propose two efficient variants of ADA-FULL via a matrix sketching technique called frequent directions (FD). The first variant, named ADA-FD, directly utilizes FD to maintain and manipulate low-rank matrices, which reduces the space and time complexities to $O(\tau d)$ and $O(\tau^2 d)$ respectively, where $d$ is the dimensionality and $\tau \ll d$ is the sketching size. The second variant, named ADA-FFD, further adopts a doubling trick to accelerate the FD step used in ADA-FD, which reduces the average time complexity to $O(\tau d)$ while only doubling the space complexity of ADA-FD. Theoretical analysis reveals that the regret of ADA-FD and ADA-FFD is close to that of ADA-FULL as long as the outer product matrix of the gradients is approximately low-rank. Experimental results demonstrate the efficiency and effectiveness of our algorithms. [ABSTRACT FROM AUTHOR]
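The frequent directions (FD) sketch underlying both variants is simple to state: maintain a small buffer B of 2τ rows; write each incoming gradient into an empty row, and whenever the buffer fills, shrink it with one thin SVD so that BᵀB keeps approximating the gradient outer product matrix GᵀG. The following minimal numpy sketch illustrates Liberty's generic FD procedure under the assumption d ≥ 2τ; the function name and streaming loop are illustrative, not the authors' implementation (in particular, ADA-FD interleaves the shrinking with its per-step update, as detailed in the paper):

    import numpy as np

    def frequent_directions(gradients, tau):
        """Stream the rows of `gradients` (T x d) into a 2*tau x d sketch B
        such that B.T @ B approximates gradients.T @ gradients."""
        d = gradients.shape[1]
        B = np.zeros((2 * tau, d))
        next_free = 0  # index of the next empty row in the buffer
        for g in gradients:
            if next_free == 2 * tau:
                # Buffer full: shrink. One thin SVD, then subtract the
                # tau-th largest squared singular value from all squared
                # singular values, which zeroes at least tau sketch rows.
                _, s, Vt = np.linalg.svd(B, full_matrices=False)
                delta = s[tau - 1] ** 2
                s_shrunk = np.sqrt(np.maximum(s ** 2 - delta, 0.0))
                B = s_shrunk[:, None] * Vt
                next_free = tau - 1  # rows tau-1 .. 2*tau-1 are now zero
            B[next_free] = g
            next_free += 1
        return B

The thin SVD of the 2τ × d buffer costs O(τ²d), consistent with ADA-FD's stated time complexity, while the buffer itself uses only O(τd) space; per the abstract, ADA-FFD's doubling trick amortizes this shrinking cost to O(τd) average time at the price of twice the space.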

Details

Language:
English
ISSN:
0162-8828
Volume:
44
Issue:
10
Database:
Academic Search Index
Journal:
IEEE Transactions on Pattern Analysis & Machine Intelligence
Publication Type:
Academic Journal
Accession Number:
159210590
Full Text:
https://doi.org/10.1109/TPAMI.2021.3096880