Fast Stochastic Ordinal Embedding With Variance Reduction and Adaptive Step Size.
- Authors
Ma, Ke; Zeng, Jinshan; Xiong, Jiechao; Xu, Qianqian; Cao, Xiaochun; Liu, Wei; Yao, Yuan
- Subjects
Semidefinite programming; matrix decomposition; symmetric matrices; size; Euclidean distance; scalability
- Abstract
Learning representations from relative similarity comparisons, often called ordinal embedding, has gained increasing attention in recent years. Most existing methods are based on semi-definite programming (SDP), which is generally time-consuming and scales poorly, especially on large-scale data. To overcome this challenge, we propose a stochastic algorithm called SVRG-SBB, which has the following features: i) good scalability, achieved by dropping the positive semi-definite (PSD) constraint and employing a fast solver, the stochastic variance reduced gradient (SVRG) method, and ii) adaptive learning, via a new adaptive step size called the stabilized Barzilai-Borwein (SBB) step size. Theoretically, under some natural assumptions, we show that the proposed algorithm converges to a stationary point at a rate of O(1/T), where T is the total number of iterations. Under the additional Polyak-Łojasiewicz assumption, we further establish global linear convergence (i.e., exponentially fast convergence to a global optimum). Extensive simulations and real-world experiments demonstrate the effectiveness of the proposed algorithm against state-of-the-art methods, notably achieving much lower computational cost with good prediction performance.
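The core of the method as described in the abstract is easy to sketch: a standard SVRG loop whose per-epoch step size is set by a Barzilai-Borwein ratio with an ε-stabilized denominator, so the step stays bounded even when the BB curvature term nearly vanishes on this nonconvex objective. Below is a minimal Python sketch under those assumptions; the function names (`grad_full`, `grad_i`), the defaults, and the last-iterate snapshot choice are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def svrg_sbb(grad_full, grad_i, x0, n, epochs=20, inner_steps=None,
             eta0=0.01, eps=0.1, rng=None):
    """Sketch of SVRG with a stabilized Barzilai-Borwein (SBB) step size.

    grad_full(x): full gradient of the objective at x.
    grad_i(x, i): gradient of the i-th component function (e.g., the loss
                  on the i-th ordinal comparison) at x.
    All names and defaults here are hypothetical, for illustration only.
    """
    rng = np.random.default_rng() if rng is None else rng
    m = n if inner_steps is None else inner_steps
    x_tilde = np.asarray(x0, dtype=float).copy()
    eta = eta0                      # fixed step for the first epoch only
    x_prev = g_prev = None
    for _ in range(epochs):
        g_tilde = grad_full(x_tilde)        # snapshot full gradient
        if g_prev is not None:
            # SBB step: BB ratio with an eps*||dx||^2 stabilizer in the
            # denominator so eta stays bounded even when <dx, dg> ~ 0,
            # averaged over the m inner steps.
            dx, dg = x_tilde - x_prev, g_tilde - g_prev
            sq = float(dx @ dx)
            if sq > 0.0:
                eta = sq / (m * (abs(float(dx @ dg)) + eps * sq))
        x_prev, g_prev = x_tilde.copy(), g_tilde.copy()
        x = x_tilde.copy()
        for _ in range(m):
            i = int(rng.integers(n))
            # variance-reduced gradient estimate (standard SVRG update)
            v = grad_i(x, i) - grad_i(x_tilde, i) + g_tilde
            x -= eta * v
        x_tilde = x                 # use last inner iterate as next snapshot
    return x_tilde
```

The ε term in the denominator is what distinguishes SBB from the plain BB step: without it, a near-zero curvature estimate ⟨Δx, Δg⟩ would make the step size blow up, which is a real risk once the PSD constraint is dropped and the problem becomes nonconvex.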
- Published
- 2021