Gradient methods with memory.
- Source :
- Optimization Methods & Software; Jun2022, Vol. 37 Issue 3, p936-953, 18p
- Publication Year :
- 2022
Abstract
- In this paper, we consider gradient methods for minimizing smooth convex functions that use information gathered at previous iterations to accelerate convergence towards the optimal solution. This information takes the form of a piecewise-linear model of the objective function, which offers much better prediction ability than the standard linear model. To the best of our knowledge, this approach has never been applied in convex minimization to differentiable functions, owing to the high complexity of the corresponding auxiliary problems. We show, however, that all necessary computations can be carried out very efficiently. Consequently, we obtain new optimization methods that outperform the usual gradient methods both in the number of oracle calls and in computational time. Our theoretical conclusions are confirmed by preliminary computational experiments. [ABSTRACT FROM AUTHOR]
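- The idea sketched in the abstract can be illustrated with a short, hypothetical Python sketch: keep a small bundle of the last few linearizations f(x_i) + ⟨∇f(x_i), x − x_i⟩ as a piecewise-linear lower model, and take the next iterate as an approximate minimizer of that model plus a proximal term (L/2)‖x − x_k‖². The inner problem is solved here by a few crude fixed-step subgradient iterations, not by the efficient auxiliary-problem procedure the paper actually develops; all names and parameters below are illustrative assumptions.

```python
import numpy as np

def gm_with_memory(f, grad, x0, L, m=5, iters=50, inner_iters=20):
    """Illustrative sketch of a gradient method with memory.

    Keeps the last m linearizations of the smooth convex function f
    as a piecewise-linear lower model and minimizes
        model(y) + (L/2) * ||y - x_k||^2
    approximately at every outer step.  The inner solver is a crude
    constant-step subgradient loop, used here only for illustration.
    """
    x = np.asarray(x0, dtype=float)
    bundle = []  # memory: list of (x_i, f(x_i), grad f(x_i))
    for _ in range(iters):
        fx, gx = f(x), np.asarray(grad(x), dtype=float)
        bundle.append((x.copy(), fx, gx.copy()))
        bundle = bundle[-m:]  # keep only the m most recent pieces
        # Start the inner minimization from the plain gradient step.
        y = x - gx / L
        for _ in range(inner_iters):
            # Active piece of the piecewise-linear model at y.
            vals = [fi + gi @ (y - xi) for xi, fi, gi in bundle]
            j = int(np.argmax(vals))
            # Subgradient of model(y) + (L/2)||y - x||^2.
            sub = bundle[j][2] + L * (y - x)
            y = y - sub / (2.0 * L)
        x = y
    return x
```

With only one piece in memory (m=1) the scheme reduces to the ordinary gradient step, so the memory variant can be compared directly against plain gradient descent on the same oracle budget.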
- Subjects :
- SMOOTHNESS of functions
- CONVEX functions
- MEMORY
Details
- Language :
- English
- ISSN :
- 1055-6788
- Volume :
- 37
- Issue :
- 3
- Database :
- Complementary Index
- Journal :
- Optimization Methods & Software
- Publication Type :
- Academic Journal
- Accession number :
- 159583372
- Full Text :
- https://doi.org/10.1080/10556788.2020.1858831