The sufficient descent condition is crucial in establishing the global convergence of nonlinear conjugate gradient methods. In this paper, we modify two conjugate gradient methods so that both satisfy this property. Under suitable conditions, we prove the global convergence of the proposed methods. Numerical results show that the proposed methods are efficient on the given test problems. [ABSTRACT FROM PUBLISHER]
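For context, the sufficient descent condition mentioned in this abstract is commonly stated as follows (this is the standard formulation from the CG literature, not a restatement of the paper's specific modification):

```latex
g_k^\top d_k \le -c \, \|g_k\|^2, \qquad c > 0 \text{ constant},
```

where $g_k$ is the gradient at iterate $x_k$ and $d_k$ is the search direction; it guarantees that each $d_k$ is a descent direction whose quality is bounded below in terms of $\|g_k\|$.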
Abstract: In this paper, we consider the Polak–Ribière (or Polak–Ribière plus) conjugate gradient method for solving the optimality condition of an unconstrained minimization problem. We give two new steplength rules that use only the gradient and, under a gradient-Lipschitz assumption, prove the method's global convergence under each rule. We then develop a practical Polak–Ribière plus method whose steplength is located by a single inequality involving only the gradient, and report promising numerical results on high-accuracy solutions for some standard test problems compared to state-of-the-art methods in this research direction. Importantly, our work provides a new idea for devising a practical version of the celebrated Polak–Ribière (or Polak–Ribière plus) method. [Copyright © Elsevier]
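As background to this abstract, a minimal sketch of the Polak–Ribière plus iteration is given below. The β coefficient is the standard PR+ formula; the fixed steplength is purely illustrative (the paper's contribution is precisely its gradient-only steplength rules, which are not reproduced here).

```python
import numpy as np

def pr_plus_beta(g_new, g_old):
    # Standard PR+ coefficient: max(0, g_k^T (g_k - g_{k-1}) / ||g_{k-1}||^2)
    return max(0.0, g_new @ (g_new - g_old) / (g_old @ g_old))

def pr_plus_cg(grad, x0, alpha=0.1, max_iter=1000, tol=1e-8):
    # Illustrative PR+ CG loop with a fixed steplength alpha
    # (a placeholder for the paper's gradient-only steplength rules).
    x = x0.copy()
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x = x + alpha * d
        g_new = grad(x)
        d = -g_new + pr_plus_beta(g_new, g) * d
        g = g_new
    return x
```

The `max(0, ...)` truncation is what distinguishes PR+ from plain Polak–Ribière: it discards negative β values, which is what makes global convergence arguments tractable for this family.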
Abstract: Nonlinear conjugate gradient (CG) methods have played an important role in solving large-scale unconstrained optimization problems. Their wide application in many fields is due to their low memory requirements and global convergence properties. Numerous studies and modifications have recently been conducted to improve this method. In this paper, a new class of conjugate gradient coefficients (β_k) that possesses global convergence properties is presented. The global convergence result is established using exact line searches. Numerical results show that the proposed formula is superior and more efficient compared to other CG coefficients. [Copyright © Elsevier]
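To illustrate the exact-line-search setting this abstract relies on: for a strictly convex quadratic f(x) = ½xᵀAx − bᵀx, the exact steplength has the closed form α_k = −g_kᵀd_k / (d_kᵀA d_k). The sketch below uses the classic Fletcher–Reeves coefficient as a stand-in, since the paper's new β_k class is not specified in the abstract.

```python
import numpy as np

def cg_exact(A, b, x0, max_iter=100, tol=1e-10):
    # CG on the quadratic f(x) = 0.5 x^T A x - b^T x (A symmetric positive
    # definite), where the exact line search is available in closed form.
    x = x0.copy()
    g = A @ x - b          # gradient of the quadratic
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ A @ d)   # exact steplength along d
        x = x + alpha * d
        g_new = A @ x - b
        beta = (g_new @ g_new) / (g @ g)  # Fletcher–Reeves (illustrative)
        d = -g_new + beta * d
        g = g_new
    return x
```

On an n-dimensional quadratic with exact line searches, CG terminates in at most n iterations in exact arithmetic, which is why this setting is the natural testbed for convergence results about new β_k formulas.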