STABILIZED SEQUENTIAL QUADRATIC PROGRAMMING FOR OPTIMIZATION AND A STABILIZED NEWTON-TYPE METHOD FOR VARIATIONAL PROBLEMS WITHOUT CONSTRAINT QUALIFICATIONS
, 2007
Cited by 20 (13 self)
The stabilized version of the sequential quadratic programming algorithm (sSQP) had been developed in order to achieve fast convergence despite possible degeneracy of constraints of optimization problems, when the Lagrange multipliers associated to a solution are not unique. Superlinear convergence of sSQP had been previously established under the second-order sufficient condition for optimality (SOSC) and the Mangasarian-Fromovitz constraint qualification, or under the strong second-order sufficient condition for optimality (in that case, without constraint qualification assumptions). We prove a stronger superlinear convergence result than the above, assuming SOSC only. In addition, our analysis is carried out in the more general setting of variational problems, for which we introduce a natural extension of sSQP techniques. In the process, we also obtain a new error bound for Karush-Kuhn-Tucker systems for variational problems.
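For orientation, the sSQP step the abstract refers to is, in the equality-constrained case h(x) = 0, usually defined through a stabilized Newton (KKT) system; the form below is a common one from the sSQP literature, not a formula quoted from this particular paper:

```latex
% Stabilized SQP step at the iterate (x^k, \lambda^k): solve for
% (\xi, \lambda^{k+1}) and set x^{k+1} = x^k + \xi.  Here H_k is the
% Hessian of the Lagrangian at (x^k, \lambda^k), and \sigma_k > 0 is the
% stabilization parameter, driven to zero (often the KKT residual norm).
\begin{aligned}
H_k\,\xi + h'(x^k)^{T}\lambda^{k+1} &= -\nabla f(x^k),\\
h'(x^k)\,\xi - \sigma_k\bigl(\lambda^{k+1}-\lambda^k\bigr) &= -h(x^k).
\end{aligned}
```

With \sigma_k = 0 this reduces to the usual SQP (Newton) system; the -\sigma_k block keeps the system nonsingular when the multiplier set is not a singleton, which is precisely the degenerate situation the abstract addresses.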
SHARP PRIMAL SUPERLINEAR CONVERGENCE RESULTS FOR SOME NEWTONIAN METHODS FOR CONSTRAINED OPTIMIZATION
, 2009
Cited by 5 (5 self)
As is well known, Q-superlinear or Q-quadratic convergence of the primal-dual sequence generated by an optimization algorithm does not, in general, imply Q-superlinear convergence of the primal part. Primal convergence, however, is often of particular interest. For the sequential quadratic programming (SQP) algorithm, local primal-dual quadratic convergence can be established under the assumptions of uniqueness of the Lagrange multiplier associated to the solution and the second-order sufficient condition. At the same time, previous primal Q-superlinear convergence results for SQP required strengthening of the first assumption to the linear independence constraint qualification. In this paper, we show that this strengthening of assumptions is actually not necessary. Specifically, we show that once primal-dual convergence is assumed or already established, for primal superlinear rate one needs only a certain error bound estimate. This error bound holds, for example, under the second-order sufficient condition, which is needed for primal-dual local analysis in any case. Moreover, in some situations even second-order sufficiency can be relaxed to the weaker assumption that the multiplier in question is noncritical. Our study is performed for a rather general perturbed SQP framework which covers, in addition to SQP and quasi-Newton SQP, some other algorithms as well. For example, as a by-product, we obtain primal Q-superlinear convergence results for the linearly constrained (augmented) Lagrangian methods, for which no primal Q-superlinear rate of convergence results were previously available. Another application of the general framework is sequential quadratically constrained quadratic programming methods. Finally, we discuss some difficulties with proving primal superlinear convergence for the stabilized version of SQP. Key words: Newton methods, sequential quadratic programming, linearly constrained Lagrangian methods, superlinear convergence, critical multipliers.
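The gap between primal-dual and primal rates that this abstract addresses can be seen from a simple estimate (an illustrative remark, not taken from the paper):

```latex
% If the primal-dual sequence converges Q-superlinearly to (\bar x, \bar\lambda),
% then for the primal part one only gets
\|x^{k+1}-\bar x\|
  \;\le\; \|(x^{k+1},\lambda^{k+1})-(\bar x,\bar\lambda)\|
  \;=\; o\bigl(\|x^{k}-\bar x\| + \|\lambda^{k}-\bar\lambda\|\bigr),
```

which is o(\|x^k - \bar x\|) only when the dual error is bounded in terms of the primal error; an error bound of exactly that kind is what the paper shows to be sufficient.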
Math. Program., Ser. A, DOI 10.1007/s10107-009-0279-4, FULL LENGTH PAPER
, 2008
On attraction of linearly constrained Lagrangian methods and of stabilized and quasi-Newton SQP methods to critical multipliers
The stabilized version of the sequential quadratic programming algorithm (sSQP) had been developed in order to achieve superlinear convergence in situations when the Lagrange multipliers associated to a solution are not unique. Within the framework of [11], the key to local superlinear convergence of sSQP are the following two properties: upper Lipschitzian behavior of solutions of the Karush-Kuhn-Tucker (KKT) system under canonical perturbations, and local solvability of sSQP subproblems with the associated primal-dual step being of the order of the distance from the current iterate to the solution set of the unperturbed KKT system. According to [9], both of these properties are ensured by the second-order sufficient optimality condition (SOSC) without any constraint qualification assumptions. In this paper, we state precise relationships between the upper Lipschitzian property of solutions of KKT systems, error bounds for KKT systems, the notion of critical Lagrange multipliers (a subclass of multipliers that violate SOSC in a very special way), the second-order necessary condition for optimality, and solvability of sSQP subproblems. Moreover, for the problem with equality constraints only, we prove superlinear convergence of sSQP under the assumption that the dual starting point is close to a noncritical multiplier. Since noncritical multipliers include all those satisfying SOSC but are not ...
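The equality-constrained setting discussed above can be exercised on a classic degenerate example. The sketch below is an illustration assembled for context, not code from the paper: it runs sSQP on minimizing x1^2 + x2^2 subject to x1^2 - x2^2 = 0, whose solution x* = (0, 0) admits every real lambda as a Lagrange multiplier (the constraint gradient vanishes, so MFCQ fails); lambda = +1 and -1 are the critical multipliers, and SOSC holds for |lambda| < 1. The starting point and the rule sigma_k = (KKT residual norm) are illustrative choices, following common practice in the sSQP literature.

```python
# sSQP on a degenerate problem: min x1^2 + x2^2  s.t.  x1^2 - x2^2 = 0.
# Illustrative sketch only: problem, starting point, and the stabilization
# rule sigma_k = ||KKT residual|| are hypothetical choices for this demo.
from math import sqrt

def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    n = 3
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def kkt_residual(x1, x2, lam):
    # Lagrangian gradient for L = f + lam*h, and the constraint value h.
    gL = (2.0 * x1 * (1.0 + lam), 2.0 * x2 * (1.0 - lam))
    h = x1 * x1 - x2 * x2
    return gL, h, sqrt(gL[0] ** 2 + gL[1] ** 2 + h * h)

def ssqp(x1, x2, lam, iters=10):
    for _ in range(iters):
        gL, h, rho = kkt_residual(x1, x2, lam)
        if rho < 1e-14:
            break
        sigma = rho  # stabilization parameter: current KKT residual norm
        # Stabilized Newton (KKT) system in (xi1, xi2, mu), lam_new = lam + mu:
        #   H*xi + grad_h*mu = -grad_x L,   grad_h^T xi - sigma*mu = -h,
        # where H = diag(2(1+lam), 2(1-lam)) and grad_h = (2*x1, -2*x2).
        A = [[2.0 * (1.0 + lam), 0.0, 2.0 * x1],
             [0.0, 2.0 * (1.0 - lam), -2.0 * x2],
             [2.0 * x1, -2.0 * x2, -sigma]]
        b = [-gL[0], -gL[1], -h]
        xi1, xi2, mu = solve3(A, b)
        x1, x2, lam = x1 + xi1, x2 + xi2, lam + mu
    return x1, x2, lam, kkt_residual(x1, x2, lam)[2]

x1, x2, lam, rho = ssqp(0.1, 0.05, 0.0)
print(f"x = ({x1:.2e}, {x2:.2e}), lambda = {lam:.4f}, KKT residual = {rho:.1e}")
```

Starting the dual sequence at the noncritical multiplier lambda = 0 (which satisfies SOSC here), the residual drops at a superlinear rate and the dual iterates stay near a noncritical multiplier, as the abstract's result predicts.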
STABILIZED SEQUENTIAL QUADRATIC PROGRAMMING: A SURVEY
, 2013
We review the motivation for, the current state of the art in convergence results for, and some open questions concerning the stabilized version of the sequential quadratic programming algorithm for constrained optimization. We also discuss the tools required for its local convergence analysis, globalization challenges, and extensions of the method to more general variational problems. Key words: Newton method, (stabilized) sequential quadratic programming, constrained optimization, variational problem, second-order sufficiency, (non)critical Lagrange multipliers.