
Fast Stochastic Ordinal Embedding With Variance Reduction and Adaptive Step Size.

Authors :
Ma, Ke
Zeng, Jinshan
Xiong, Jiechao
Xu, Qianqian
Cao, Xiaochun
Liu, Wei
Yao, Yuan
Source :
IEEE Transactions on Knowledge & Data Engineering. Jun 2021, Vol. 33, Issue 6, pp. 2467-2478. 12 pp.
Publication Year :
2021

Abstract

Learning representations from relative similarity comparisons, often called ordinal embedding, has gained increasing attention in recent years. Most existing methods are based on semi-definite programming (SDP), which is generally time-consuming and scales poorly, especially when confronting large-scale data. To overcome this challenge, we propose a stochastic algorithm called SVRG-SBB, which has the following features: i) good scalability, achieved by dropping the positive semi-definite (PSD) constraint and adopting a fast stochastic solver, namely the stochastic variance reduced gradient (SVRG) method; and ii) adaptive learning, achieved by introducing a new adaptive step size called the stabilized Barzilai-Borwein (SBB) step size. Theoretically, under some natural assumptions, we show that the proposed algorithm converges to a stationary point at a rate of O(1/T), where T is the total number of iterations. Under the additional Polyak-Łojasiewicz assumption, we further establish global linear convergence (i.e., exponentially fast convergence to a global optimum) of the proposed algorithm. Extensive simulations and real-world experiments demonstrate the effectiveness of the proposed algorithm in comparison with state-of-the-art methods, notably achieving much lower computational cost with good prediction performance. [ABSTRACT FROM AUTHOR]
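The two ingredients named in the abstract can be combined in a single loop: an SVRG outer iteration that takes a full-gradient snapshot each epoch, with the per-epoch step size set by a Barzilai-Borwein ratio whose denominator is regularized to stay bounded away from zero (the "stabilization"). The following is a minimal Python sketch reconstructed from the abstract, not the authors' implementation; the names svrg_sbb and grad_i, the epoch length m, and the stabilization constant eps are illustrative assumptions.

    # Hypothetical sketch of SVRG with a stabilized Barzilai-Borwein (SBB)
    # step size. grad_i(x, i) must return the gradient of the i-th component
    # function f_i at x; the objective is (1/n) * sum_i f_i(x).
    import numpy as np

    def svrg_sbb(grad_i, x0, n, m=50, epochs=20, eta0=0.01, eps=1e-6, seed=0):
        rng = np.random.default_rng(seed)
        x_tilde = np.asarray(x0, dtype=float).copy()
        prev_x = prev_g = None
        eta = eta0                      # fixed step size for the first epoch
        for _ in range(epochs):
            # Full-gradient snapshot at the current reference point.
            g_tilde = np.mean([grad_i(x_tilde, i) for i in range(n)], axis=0)
            if prev_x is not None:
                s, y = x_tilde - prev_x, g_tilde - prev_g
                if s @ s > 0:
                    # BB-style ratio; the eps * ||s||^2 term in the
                    # denominator is the stabilization (an assumption here).
                    eta = (s @ s) / (m * (abs(s @ y) + eps * (s @ s)))
            prev_x, prev_g = x_tilde.copy(), g_tilde.copy()
            x = x_tilde.copy()
            for _ in range(m):
                i = rng.integers(n)
                # Variance-reduced stochastic gradient (the SVRG correction).
                v = grad_i(x, i) - grad_i(x_tilde, i) + g_tilde
                x = x - eta * v
            x_tilde = x                 # use the last inner iterate as snapshot
        return x_tilde

On a toy least-squares objective, for example, grad_i(x, i) would return the gradient of the i-th squared residual; the SBB ratio then adapts the step size once per epoch without manual tuning, which is the "adaptive learning" feature the abstract refers to.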

Details

Language :
English
ISSN :
1041-4347
Volume :
33
Issue :
6
Database :
Academic Search Index
Journal :
IEEE Transactions on Knowledge & Data Engineering
Publication Type :
Academic Journal
Accession number :
150287536
Full Text :
https://doi.org/10.1109/TKDE.2019.2956700