Abstract: Although the Liu–Storey (LS) nonlinear conjugate gradient method has a structure similar to that of the well-known Polak–Ribière–Polyak (PRP) and Hestenes–Stiefel (HS) methods, it has received comparatively little attention. In this paper, based on the memoryless BFGS quasi-Newton method, we propose a new LS-type method that converges globally for general functions under the Grippo–Lucidi line search. Moreover, we modify this new LS method so that the modified scheme is globally convergent for nonconvex minimization when the strong Wolfe line search is used. Numerical results are also reported.
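The abstract does not reproduce the paper's new memoryless-BFGS-based direction, but the classical LS update it builds on is standard: β_k = −gₖᵀ(gₖ − gₖ₋₁) / (dₖ₋₁ᵀ gₖ₋₁). Below is a minimal sketch of nonlinear CG with this LS coefficient. Note the hedges: a plain Armijo backtracking line search stands in for the Grippo–Lucidi line search, and the test function is an arbitrary convex quadratic chosen for illustration; this is not the paper's modified scheme.

```python
import numpy as np

def ls_beta(g_new, g_old, d_old):
    # Classical Liu–Storey coefficient: -g_k^T y_{k-1} / (d_{k-1}^T g_{k-1})
    y = g_new - g_old
    return -(g_new @ y) / (d_old @ g_old)

def cg_ls(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with the LS beta and Armijo backtracking
    (a simple stand-in for the Grippo–Lucidi line search)."""
    x = np.asarray(x0, dtype=float).copy()
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        d = -g_new + ls_beta(g_new, g, d) * d
        if g_new @ d >= 0:
            d = -g_new  # safeguard: restart if not a descent direction
        x, g = x_new, g_new
    return x
```

On a strongly convex quadratic the iteration drives the gradient to zero; the restart safeguard keeps every search direction a descent direction even without the exact line-search conditions the paper's analysis requires.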
Abstract: In this paper, we propose a new trust region method for unconstrained optimization problems. The new method automatically adjusts the trust region radius of the related subproblem at each iteration and enjoys strong global convergence under mild conditions. We also analyze its global linear convergence rate and its local superlinear and quadratic convergence rates. Numerical results show that the new trust region method is effective and efficient in practical computation.
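The abstract does not specify the paper's automatic radius rule, so as a generic illustration here is the classical ratio-based trust region framework it departs from: the radius shrinks when the quadratic model predicts the actual decrease poorly and expands when the prediction is good and the step hits the boundary. The Cauchy-point subproblem solver and all thresholds (0.25, 0.75, 0.1) are standard textbook choices, not the paper's method.

```python
import numpy as np

def cauchy_point(g, B, delta):
    # Minimize the quadratic model along -g within the radius delta
    gBg = g @ B @ g
    tau = 1.0
    if gBg > 0:
        tau = min(1.0, np.linalg.norm(g) ** 3 / (delta * gBg))
    return -(tau * delta / np.linalg.norm(g)) * g

def trust_region(f, grad, hess, x0, delta0=1.0, tol=1e-8, max_iter=200):
    """Classical ratio-based trust region method with a Cauchy-point step."""
    x, delta = np.asarray(x0, dtype=float).copy(), delta0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        B = hess(x)
        s = cauchy_point(g, B, delta)
        pred = -(g @ s + 0.5 * s @ B @ s)   # decrease predicted by the model
        ared = f(x) - f(x + s)              # actual decrease
        rho = ared / pred
        if rho < 0.25:
            delta *= 0.25                   # poor agreement: shrink the radius
        elif rho > 0.75 and np.isclose(np.linalg.norm(s), delta):
            delta = min(2.0 * delta, 100.0) # good agreement on the boundary: expand
        if rho > 0.1:
            x = x + s                       # accept the step
    return x
```

The ratio test is what an "automatic" radius rule refines: the classical scheme reacts to the agreement ratio rho after the fact, whereas adaptive schemes of the kind the abstract describes choose the radius from current iterate information before solving the subproblem.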