Search Results

Showing 35 results

1. A nonmonotone scaled conjugate gradient algorithm for large-scale unconstrained optimization.

2. A Line-Search Algorithm Inspired by the Adaptive Cubic Regularization Framework and Complexity Analysis.

3. A nonmonotone quasi-Newton trust-region method of conic model for unconstrained optimization.

4. On the Method of Shortest Residuals for Unconstrained Optimization.

5. A sufficient descent conjugate gradient method and its global convergence.

6. A new nonmonotone filter Barzilai–Borwein method for solving unconstrained optimization problems.

7. A Barzilai and Borwein scaling conjugate gradient method for unconstrained optimization problems.

8. A nonmonotone ODE-based method for unconstrained optimization.

9. Damped techniques for enforcing convergence of quasi-Newton methods.

10. A hybrid of adjustable trust-region and nonmonotone algorithms for unconstrained optimization.

11. An Improved Spectral Conjugate Gradient Algorithm for Nonconvex Unconstrained Optimization Problems.

12. A nonmonotone line search method and its convergence for unconstrained optimization.

13. An accelerated conjugate gradient algorithm with guaranteed descent and conjugacy conditions for unconstrained optimization.

14. A New CG-Algorithm with Self-Scaling VM-Update for Unconstraint Optimization.

15. Combining nonmonotone conic trust region and line search techniques for unconstrained optimization.

16. Nonmonotone trust region algorithm for unconstrained optimization problems.

17. A nonmonotone globalization algorithm with preconditioned gradient path for unconstrained optimization.

18. A descent algorithm without line search for unconstrained optimization.

19. Hybrid Conjugate Gradient Algorithm for Unconstrained Optimization.

20. A nonmonotone conic trust region method based on line search for solving unconstrained optimization.

21. A conic trust-region method and its convergence properties.

22. Convergence of memory gradient methods.

23. Incorporating nonmonotone strategies into the trust region method for unconstrained optimization.

24. On the convergence of partitioning group correction algorithms.

25. Modification of the Wolfe Line Search Rules to Satisfy the Descent Condition in the Polak-Ribière-Polyak Conjugate Gradient Method.

26. A new class of supermemory gradient methods.

27. Partitioning group correction Cholesky techniques for large scale sparse unconstrained optimization.

28. An adaptive nonmonotonic trust region method with curvilinear searches.

29. New quasi-Newton methods for unconstrained optimization problems.

30. A new stepsize for the steepest descent method.

31. A nonmonotone trust region method for unconstrained optimization.

32. Convergence of descent method without line search.

33. Nonmonotone adaptive trust-region method for unconstrained optimization problems.

34. A new class of memory gradient methods with inexact line searches.

35. A gradient-related algorithm with inexact line searches.