
First-Order Optimization Inspired from Finite-Time Convergent Flows

Authors :
Zhang, Siqi
Benosman, Mouhacine
Romero, Orlando
Cherian, Anoop
Publication Year :
2020

Abstract

In this paper, we investigate the performance of two first-order optimization algorithms obtained from forward Euler discretization of finite-time optimization flows. These flows, the rescaled-gradient flow (RGF) and the signed-gradient flow (SGF), are non-Lipschitz or discontinuous dynamical systems that converge locally in finite time to the minima of gradient-dominated functions. We propose an Euler discretization for these first-order finite-time flows and provide convergence guarantees in both the deterministic and stochastic settings. We then apply the proposed algorithms to academic examples as well as deep neural network training, empirically testing their performance on the SVHN dataset. Our results show that our schemes converge faster than standard optimization alternatives.
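The record does not reproduce the paper's exact flow definitions or step rules. As a minimal sketch only, the snippet below illustrates forward Euler discretizations of two commonly cited continuous-time forms that match the names above: a signed-gradient step, x_{k+1} = x_k - eta * sign(grad f(x_k)), and a normalized-gradient step standing in for the rescaled-gradient family. The function names, step size `eta`, and the quadratic test objective are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sgf_step(x, grad, eta):
    # Forward Euler step of a signed-gradient flow (illustrative form):
    # x_{k+1} = x_k - eta * sign(grad f(x_k))
    return x - eta * np.sign(grad)

def rgf_step(x, grad, eta, eps=1e-12):
    # Forward Euler step of a rescaled-gradient flow, shown here as
    # normalized gradient descent (one member of the rescaled family);
    # eps guards against division by zero near a stationary point.
    return x - eta * grad / (np.linalg.norm(grad) + eps)

def minimize(step_fn, grad_f, x0, eta=0.1, iters=100):
    # Generic driver: repeatedly apply the chosen Euler step.
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = step_fn(x, grad_f(x), eta)
    return x

# Illustrative gradient-dominated objective: f(x) = ||x||^2.
f = lambda x: float(np.sum(x ** 2))
grad_f = lambda x: 2.0 * x

x0 = np.array([3.0, -4.0])
x_sgf = minimize(sgf_step, grad_f, x0)
x_rgf = minimize(rgf_step, grad_f, x0)
```

With a fixed step size, both discretized flows drive the quadratic objective down rapidly but then oscillate in a small neighborhood of the minimizer; the paper's analysis concerns how to discretize such non-Lipschitz flows while retaining convergence guarantees.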

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1228436997
Document Type :
Electronic Resource