Search Results (12 results)
2. A long-step feasible predictor-corrector interior-point algorithm for symmetric cone optimization.
- Author: Asadi, S., Mansouri, H., Darvay, Zs., Lesaja, G., and Zangiabadi, M.
- Subjects: ALGORITHMS; PROBLEM solving; MATHEMATICAL optimization; ITERATED integrals; ITERATIVE methods (Mathematics)
- Abstract: In this paper, we present a feasible predictor-corrector interior-point method for the symmetric cone optimization problem in the large neighbourhood of the central path. The method is a generalization of Ai-Zhang's predictor-corrector algorithm to the symmetric cone optimization problem. Starting from a feasible point in the given large neighbourhood of the central path, the algorithm terminates within a polynomial iteration bound. This matches the best known iteration bound, which is usually achieved by short-step methods, thereby closing the complexity gap between long- and short-step interior-point methods for symmetric cone optimization. Preliminary numerical results on a selected set of NETLIB problems show the advantage of the method over a version of the algorithm that is not based on the predictor-corrector scheme. [ABSTRACT FROM AUTHOR]
- Published: 2019
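The predictor-corrector idea in this abstract is easiest to see in its simplest setting. The sketch below is a minimal Mehrotra-style predictor-corrector iteration for a linear program, i.e. the LP special case only; it is *not* the paper's symmetric-cone method or its Ai-Zhang large neighbourhood, and the function names and toy problem are illustrative assumptions.

```python
import numpy as np

def pc_step(A, b, c, x, y, s, tau=0.99):
    """One Mehrotra-style predictor-corrector iteration for the LP
    min c'x  s.t.  Ax = b, x >= 0 (illustrative sketch only)."""
    n = len(x)
    mu = (x @ s) / n                    # duality measure

    def newton(rb, rc, rxs):
        # Normal-equations solve of the Newton system:
        #   A dx = -rb,  A'dy + ds = -rc,  S dx + X ds = -rxs
        d = x / s
        M = (A * d) @ A.T               # A D A'
        rhs = -rb + A @ (rxs / s) - A @ (d * rc)
        dy = np.linalg.solve(M, rhs)
        ds = -rc - A.T @ dy
        dx = -(rxs + x * ds) / s
        return dx, dy, ds

    def max_step(v, dv):
        # Largest alpha keeping v + alpha*dv >= 0.
        neg = dv < 0
        return np.inf if not neg.any() else np.min(-v[neg] / dv[neg])

    rb = A @ x - b
    rc = A.T @ y + s - c

    # Predictor: affine-scaling direction and its duality measure.
    dxa, dya, dsa = newton(rb, rc, x * s)
    ap = min(1.0, max_step(x, dxa))
    ad = min(1.0, max_step(s, dsa))
    mu_aff = ((x + ap * dxa) @ (s + ad * dsa)) / n
    sigma = (mu_aff / mu) ** 3          # adaptive centering parameter

    # Corrector: recentre and compensate for second-order terms.
    dx, dy, ds = newton(rb, rc, x * s + dxa * dsa - sigma * mu)
    ap = min(1.0, tau * max_step(x, dx))
    ad = min(1.0, tau * max_step(s, ds))
    return x + ap * dx, y + ad * dy, s + ad * ds
```

Iterating `pc_step` from an interior feasible point drives the duality measure x's/n to zero; the corrector step is what lets long steps retain the fast decrease that short-step analyses guarantee.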
3. A non-monotone linear search algorithm with mixed direction on Stiefel manifold.
- Author: Oviedo, Harry, Lara, Hugo, and Dalmau, Oscar
- Subjects: ALGORITHMS; MATHEMATICAL optimization; ITERATED integrals; ITERATIVE methods (Mathematics); SINGULAR value decomposition
- Abstract: In this paper, we propose a non-monotone line search method for solving optimization problems on the Stiefel manifold. The main novelty of our approach is that it uses a search direction based on a linear combination of descent directions together with a Barzilai-Borwein line search. Feasibility is guaranteed by projecting each iterate onto the Stiefel manifold through SVD (singular value decomposition) factorizations. Some theoretical results for analysing the algorithm are presented. Finally, we provide numerical experiments comparing our algorithm with other state-of-the-art procedures. The code is available online. The experimental results show that the proposed algorithm is competitive with other approaches, and for particular problems its computational performance is better than that of the state-of-the-art algorithms. [ABSTRACT FROM AUTHOR]
- Published: 2019
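Two ingredients named in this abstract are standard and can be sketched directly: the SVD projection that restores feasibility on the Stiefel manifold, and a Barzilai-Borwein stepsize. A minimal sketch under the usual definitions (function names are ours, not the paper's):

```python
import numpy as np

def stiefel_project(X):
    """Project a full-rank n x p matrix onto the Stiefel manifold
    St(n, p) = {Q : Q'Q = I} via the thin SVD: if X = U S V', the
    nearest point in Frobenius norm is U V'."""
    U, _, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ Vt

def bb_stepsize(x_prev, x_curr, g_prev, g_curr):
    """Barzilai-Borwein (BB1) stepsize from successive iterates and
    gradients; the absolute value is a common safeguard when s'y < 0."""
    s = (x_curr - x_prev).ravel()
    y = (g_curr - g_prev).ravel()
    return (s @ s) / abs(s @ y)
```

On a quadratic with Hessian 2I the BB stepsize recovers the exact inverse curvature 1/2, which is the intuition behind its use as a cheap spectral approximation of a line search.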
4. Pattern search in the presence of degenerate linear constraints.
- Author: Abramson, Mark A., Brezhneva, Olga A., Dennis, J.E., and Pingel, Rachael L.
- Subjects: ALGORITHMS; CONSTRAINED optimization; MATHEMATICAL optimization; ITERATIVE methods (Mathematics); LINEAR programming; PROGRAM transformation; CONSTRAINT programming
- Abstract: This paper deals with generalized pattern search (GPS) algorithms for linearly constrained optimization. At each iteration, the GPS algorithm generates a set of directions that conforms to the geometry of any nearby linear constraints. This set is then used to construct trial points to be evaluated during the iteration. In previous work, Lewis and Torczon developed a scheme for computing the conforming directions; however, the issue of degeneracy merits further investigation. The contribution of this paper is a detailed algorithm for constructing the set of directions whether or not the constraints are degenerate. One difficulty in the degenerate case is the classification of constraints as redundant or nonredundant. We give a short survey of the main definitions and methods for treating redundancy and propose an approach to identify nonredundant ε-active constraints. We also introduce a new approach for handling nonredundant linearly dependent constraints, which maintains GPS convergence properties without significantly increasing computational cost. Some simple numerical tests illustrate the effectiveness of the algorithm. [ABSTRACT FROM AUTHOR]
- Published: 2008
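A minimal illustration of the pattern-search poll step, restricted to simple bound constraints where the conforming directions are just the coordinate directions ±e_i; the paper's construction for general (possibly degenerate) linear constraints is considerably more involved, and everything here is our simplification.

```python
import numpy as np

def gps_poll(f, x, delta, lb, ub):
    """One poll step of coordinate-direction pattern search on the box
    lb <= x <= ub: evaluate feasible trial points x +/- delta*e_i and
    accept the first one that improves f; otherwise shrink the mesh."""
    n = len(x)
    fx = f(x)
    for d in np.vstack([np.eye(n), -np.eye(n)]):
        trial = x + delta * d
        if np.all(trial >= lb) and np.all(trial <= ub) and f(trial) < fx:
            return trial, delta          # successful poll: keep mesh size
    return x, delta / 2                  # unsuccessful poll: halve mesh
```

Note that infeasible trial points are simply discarded; with general linear constraints this filtering is not enough, which is exactly why conforming directions are needed.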
5. A nonmonotone ODE-based method for unconstrained optimization.
- Author: Ou, Yi-gui
- Subjects: MONOTONE operators; MATHEMATICAL optimization; STOCHASTIC convergence; ALGORITHMS; ITERATIVE methods (Mathematics); COMPUTATIONAL mathematics
- Abstract: This paper presents a new hybrid algorithm for unconstrained optimization problems, which combines the idea of the IMPBOT algorithm with a nonmonotone line search technique. A feature of the proposed method is that at each iteration, a system of linear equations is solved only once to obtain a trial step, via a modified limited-memory BFGS two-loop recursion that requires only matrix-vector products, thus reducing computation and storage. Furthermore, when the trial step is not accepted, the proposed method performs a line search along it using a modified nonmonotone scheme, so that a larger stepsize can be obtained in each line search procedure. Under some reasonable assumptions, the convergence properties of the proposed algorithm are analysed. Numerical results are also reported to show the efficiency of the proposed method. [ABSTRACT FROM PUBLISHER]
- Published: 2014
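The limited-memory BFGS two-loop recursion mentioned in this abstract is a standard device; the sketch below (function names ours) computes the quasi-Newton direction -H∇f from stored curvature pairs (s_i, y_i) using only vector operations, which is the source of the storage savings the abstract refers to.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion: return -H*grad, where H approximates the
    inverse Hessian built from the stored pairs (oldest first), with
    the usual scaled-identity initial matrix H0 = gamma*I."""
    q = grad.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []                               # collected newest-first
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        a = rho * (s @ q)
        alphas.append(a)
        q = q - a * y
    if y_list:                                # H0 = (s'y / y'y) * I
        s, y = s_list[-1], y_list[-1]
        q = (s @ y) / (y @ y) * q
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * (y @ q)
        q = q + (a - b) * s
    return -q
```

On a quadratic whose curvature pairs align with the eigenvectors, the recursion reproduces the exact Newton direction, which is a convenient sanity check.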
6. A trust-region algorithm combining line search filter technique for nonlinear constrained optimization.
- Author: Pei, Yonggang and Zhu, Detong
- Subjects: ALGORITHMS; NONLINEAR programming; MATHEMATICAL optimization; BACKTRACK programming; STOCHASTIC convergence; ITERATIVE methods (Mathematics)
- Abstract: In this paper, we propose a trust-region algorithm combined with a line search filter technique for solving nonlinear equality constrained programming. At the current iteration, a trial step is formed as the sum of a normal step and a tangential step, generated by a trust-region subproblem, and the step size is decided by an interior backtracking line search together with filter methods; the next iterate is then determined. This differs from general trust-region methods, in which the next iterate is determined by the ratio of the actual reduction to the predicted reduction. The global convergence analysis for this algorithm is presented under some reasonable assumptions, and preliminary numerical results are reported. [ABSTRACT FROM PUBLISHER]
- Published: 2014
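The filter acceptance test that such methods use in place of a penalty function can be sketched generically. This is a Fletcher-Leyffer-style filter with a small envelope; the constants, the envelope rule, and the names are illustrative defaults, not the paper's exact mechanism.

```python
def filter_acceptable(theta, f, filt, beta=0.99, gamma=1e-5):
    """A trial point with constraint violation theta and objective f is
    acceptable if, for every stored pair (theta_j, f_j), it sufficiently
    improves either feasibility or the objective."""
    return all(theta <= beta * th_j or f <= f_j - gamma * th_j
               for th_j, f_j in filt)

def filter_add(theta, f, filt):
    """Add (theta, f) to the filter and drop entries it dominates."""
    filt = [(t, v) for t, v in filt if not (theta <= t and f <= v)]
    filt.append((theta, f))
    return filt
```

The point of the two-sided test is that a step may be accepted for improving feasibility even when the objective worsens, which is what frees the method from the actual-versus-predicted reduction ratio of classical trust-region acceptance.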
7. An interesting characteristic of phase-1 of dual–primal algorithm for linear programming.
- Author: Li, Haohao
- Subjects: DUALITY theory (Mathematics); ALGORITHMS; LINEAR programming; PROBLEM solving; MATHEMATICAL optimization; MATHEMATICAL proofs; ITERATIVE methods (Mathematics)
- Abstract: Most algorithms for solving linear programs require a phase-1 procedure to find a feasible solution. Recently, a dual–primal algorithm for linear optimization was proposed by Li [Dual–primal algorithm for linear optimization, Optimiz. Methods Softw. 28 (2013), pp. 327–338]. In the process of implementing the dual–primal algorithm, we found an interesting phenomenon: the phase-1 algorithm developed in [Li (2013)] always terminates in one iteration. This fact does not occur by chance. A rigorous proof is given in this paper. [ABSTRACT FROM AUTHOR]
- Published: 2014
8. A full-step interior-point algorithm for second-order cone optimization based on a simple locally kernel function.
- Author: Zhang, Lipu, Bai, Yanqin, and Xu, Yinghong
- Subjects: ALGORITHMS; MATHEMATICAL optimization; KERNEL functions; ITERATIVE methods (Mathematics); MATHEMATICAL bounds; COMPLEX variables
- Abstract: In this paper, we investigate the properties of a simple locally kernel function. As an application, we present a full-step interior-point algorithm for second-order cone optimization (SOCO). The algorithm uses the simple locally kernel function to determine the search direction and define the neighbourhood of the central path. The full step used in the algorithm has a local quadratic convergence property with respect to the proximity function constructed from the simple locally kernel function. We derive the iteration complexity of the algorithm and obtain the best-known iteration bound for SOCO. [ABSTRACT FROM AUTHOR]
- Published: 2013
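The "simple locally kernel function" is the paper's own construction; for orientation only, the classical logarithmic kernel function used in kernel-based interior-point methods looks like this, and the paper's proximity measure plays the analogous role.

```python
import math

def log_kernel(t):
    """Classical logarithmic kernel function psi(t) = (t^2 - 1)/2 - ln t.
    It satisfies psi(1) = 0 and psi'(1) = 0, so psi measures how far a
    scaled eigenvalue t is from its central-path value t = 1; summing
    psi over the eigenvalues gives a proximity (barrier) function."""
    return (t * t - 1.0) / 2.0 - math.log(t)
```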
9. A symmetric rank-one quasi-Newton line-search method using negative curvature directions.
- Author: Öztoprak, Figen and Birbil, Ş.İlker
- Subjects: MATHEMATICAL symmetry; ITERATIVE methods (Mathematics); MATHEMATICAL optimization; APPROXIMATION theory; NUMERICAL analysis; STOCHASTIC convergence; ALGORITHMS
- Abstract: We propose a quasi-Newton line-search method that uses negative curvature directions for solving unconstrained optimization problems. In this method, the symmetric rank-one (SR1) rule is used to update the Hessian approximation. The SR1 update rule is known to have good numerical performance; however, it does not guarantee positive definiteness of the updated matrix. We first discuss the details of the proposed algorithm and then concentrate on its practical behaviour. Our extensive computational study shows the potential of the proposed method from different angles, such as its performance compared with some other existing packages, the profile of its computations, and its large-scale adaptation. We then conclude the paper with the convergence analysis of the proposed method. [ABSTRACT FROM AUTHOR]
- Published: 2011
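The SR1 update and its standard skip safeguard are classical, and the lack of guaranteed positive definiteness the abstract mentions is visible directly in the formula. A sketch (the safeguard threshold `r` is the usual textbook choice, not necessarily the paper's):

```python
import numpy as np

def sr1_update(B, s, y, r=1e-8):
    """Symmetric rank-one quasi-Newton update of B. The update is
    skipped when the denominator (y - Bs)'s is too small relative to
    ||s|| * ||y - Bs||, the standard SR1 safeguard. Unlike BFGS, SR1
    need not keep B positive definite, which is precisely what allows
    it to model negative curvature."""
    v = y - B @ s
    denom = v @ s
    if abs(denom) <= r * np.linalg.norm(s) * np.linalg.norm(v):
        return B                       # skip: update is ill-defined
    return B + np.outer(v, v) / denom
```

After a successful update the secant equation B_new s = y holds exactly, which is the property line-search methods exploit when extracting curvature information.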
10. A Trust Region Algorithm with Memory for Equality Constrained Optimization.
- Author: Yu, Zhensheng, Zhang, Weiguo, and Lin, Ji
- Subjects: ALGORITHMS; MATHEMATICAL optimization; NUMERICAL analysis; MATHEMATICAL functions; ITERATIVE methods (Mathematics)
- Abstract: In this paper, we present a trust region algorithm with memory for equality constrained optimization problems. Different from traditional trust region algorithms, our trust region model includes memory of the past iterations, which makes the algorithm more farsighted in the sense that its behaviour is not completely dominated by the local nature of the objective function, but rather by a more global view. The global convergence is established by using a nonmonotone technique. We report numerical tests to examine the effectiveness of the algorithm. [ABSTRACT FROM AUTHOR]
- Published: 2008
11. A practical update criterion for SQP method.
- Author: Liu, Tao-Wen and Li, Dong-Hui
- Subjects: MATHEMATICAL optimization; ALGORITHMS; NEWTON-Raphson method; LAGRANGIAN functions; ITERATIVE methods (Mathematics)
- Abstract: Sequential quadratic programming (SQP) algorithms have proven to be efficient for solving nonlinearly constrained optimization problems under some conditions. In analyses of global convergence for SQP algorithms, it is usually assumed that the quasi-Newton matrices approximating the Hessian of the Lagrangian function are uniformly positive definite. However, it is not known in general whether this condition holds for the quasi-Newton matrices. In this paper, we present a new update criterion for the quasi-Newton matrix and a new update method for the penalty parameter. We establish the global convergence of a modified SQP algorithm for equality constrained optimization problems without any assumption on the quasi-Newton matrix sequence. Moreover, if the second-order sufficient condition holds at an optimal solution, the new update criterion reduces to the ordinary BFGS update. A superlinear rate of convergence can be obtained if an additional step is performed. [ABSTRACT FROM AUTHOR]
- Published: 2007
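For context, this is the ordinary BFGS update that the paper's criterion reduces to, guarded by the common curvature-condition skip rule. The skip rule shown here is a standard practical safeguard, not the paper's new criterion.

```python
import numpy as np

def bfgs_update(B, s, y, eps=1e-8):
    """Ordinary BFGS update of a Hessian approximation B, applied only
    when the curvature condition s'y > 0 holds (up to a relative
    tolerance); otherwise the update is skipped, since applying it
    would destroy positive definiteness."""
    sy = s @ y
    if sy <= eps * np.linalg.norm(s) * np.linalg.norm(y):
        return B                      # skip: insufficient curvature
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / sy
```

As with SR1, a successful update satisfies the secant equation B_new s = y, and here B_new also stays positive definite whenever B is.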
12. On the construction of some functional inequalities via ECGM algorithm.
- Author: Ibiejugba, M. A. and Adegbie, K. S.
- Subjects: CONJUGATE gradient methods; ALGORITHMS; APPROXIMATION theory; COMPUTERS; MATHEMATICAL optimization; ITERATIVE methods (Mathematics)
- Abstract: The development of optimization theory originated with economic requirements and problems, where an optimal strategy was to be determined mathematically. At about the same time, approximation theory, which was already well developed, was reinvigorated by the advent of electronic computers. In this paper, we recall the functional analysis that constitutes the framework of our development of an extended conjugate gradient algorithm that does not involve any approximation in any of its steps. This is a computational enhancement over the conventional conjugate gradient method, which depends on some approximation theory. We use this improved algorithm to construct some functional inequalities. [ABSTRACT FROM AUTHOR]
- Published: 2005
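For reference, this is the conventional conjugate gradient method that the extended algorithm builds on, in its linear-system form for a symmetric positive definite matrix A (a textbook sketch, not the paper's ECGM).

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Conventional conjugate gradient method for Ax = b with A
    symmetric positive definite: successive search directions are
    A-conjugate, so in exact arithmetic it converges in at most
    n steps for an n x n system."""
    x = np.zeros_like(b)
    r = b - A @ x                      # residual
    p = r.copy()                       # first search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)          # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p      # A-conjugate direction update
        rs = rs_new
    return x
```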