
A superlinearly convergent norm-relaxed SQP method of strongly sub-feasible directions for constrained optimization without strict complementarity

Authors :
Jian, Jin-bao
Ke, Xiao-yan
Cheng, Wei-xin
Source :
Applied Mathematics & Computation. Aug 2009, Vol. 214, Issue 2, p632-644. 13p.
Publication Year :
2009

Abstract

In this paper, a class of optimization problems with nonlinear inequality constraints is discussed. Combining the ideas of the norm-relaxed SQP method and the strongly sub-feasible direction method, together with a pivoting operation, a new fast algorithm that accepts an arbitrary initial point is presented for the discussed problem. At each iteration, an improved direction is obtained by solving only one direction-finding subproblem, which is small in scale and always has an optimal solution; to avoid the Maratos effect, a correction direction is generated by a simple explicit formula. Since the line search technique automatically combines the initialization and optimization processes, the iterates enter the feasible set after finitely many iterations. The proposed algorithm is proved to be globally and superlinearly convergent under mild conditions without strict complementarity. Finally, some numerical tests are reported. [Copyright © Elsevier]
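To illustrate the general idea behind methods of this family (not the authors' actual algorithm, which solves a norm-relaxed QP subproblem with a pivoting operation), the following minimal sketch shows how a line search on an exact-penalty merit function lets an iteration start from an arbitrary, possibly infeasible point: while the constraint is violated, the violation term dominates and steps drive the iterate toward the feasible set; once feasible, steps reduce the objective. The toy problem, step rule, and all parameter values below are illustrative assumptions.

```python
import numpy as np

# Toy instance: minimize f(x) = x1^2 + x2^2 subject to g(x) = 1 - x1 - x2 <= 0.
# The optimum is x* = (0.5, 0.5), where the constraint is active.
f  = lambda x: x[0] ** 2 + x[1] ** 2
df = lambda x: 2.0 * x
g  = lambda x: 1.0 - x[0] - x[1]
dg = lambda x: np.array([-1.0, -1.0])

def merit(x, c):
    # Exact l1-penalty merit: the violation term dominates for large c.
    return f(x) + c * max(g(x), 0.0)

def sub_feasible_descent(x0, c=10.0, sigma=1e-4, max_iter=500):
    """Gradient descent with Armijo backtracking on the penalty merit.
    From an infeasible start, early steps mainly reduce the violation
    max(g, 0); once the iterate is (near-)feasible, steps reduce f.
    A simplified stand-in for a QP direction-finding subproblem."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        # Descent direction: add the penalty gradient only when infeasible.
        d = -df(x) - (c * dg(x) if g(x) > 0.0 else 0.0)
        t, phi = 1.0, merit(x, c)
        while t > 1e-12 and merit(x + t * d, c) > phi - sigma * t * (d @ d):
            t *= 0.5                      # backtrack until sufficient decrease
        if t <= 1e-12:                    # no acceptable step: stop
            break
        x = x + t * d
    return x
```

Starting from the infeasible point (0, 0), the iterates first cross into the feasible region and then settle near the active-constraint optimum (0.5, 0.5). The paper's method improves on such a penalty scheme with a superlinear local rate and a Maratos-effect correction, neither of which this sketch attempts.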

Details

Language :
English
ISSN :
0096-3003
Volume :
214
Issue :
2
Database :
Academic Search Index
Journal :
Applied Mathematics & Computation
Publication Type :
Academic Journal
Accession number :
42963558
Full Text :
https://doi.org/10.1016/j.amc.2009.04.022