Results 1–10 of 21
SNOPT: An SQP Algorithm for Large-Scale Constrained Optimization
, 1997
Abstract

Cited by 328 (18 self)
Sequential quadratic programming (SQP) methods have proved highly effective for solving constrained optimization problems with smooth nonlinear functions in the objective and constraints. Here we consider problems with general inequality constraints (linear and nonlinear). We assume that first derivatives are available, and that the constraint gradients are sparse.
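The SQP iteration described above can be sketched in a few lines for the equality-constrained case. This is a minimal illustration, not the SNOPT algorithm itself: it assumes exact first derivatives and an exact Hessian of the Lagrangian, and each iteration solves the KKT system of the local QP subproblem directly.

```python
import numpy as np

def sqp_equality(grad_f, hess_L, c, jac_c, x0, lam0, iters=20):
    """Minimal SQP sketch for min f(x) s.t. c(x) = 0 (illustration only).

    Each iteration solves the KKT system of the local QP subproblem:
        [H   A^T] [dx  ]   [-(grad f + A^T lam)]
        [A    0 ] [dlam] = [        -c         ]
    """
    x, lam = np.asarray(x0, float), np.asarray(lam0, float)
    n, m = len(x), len(lam)
    for _ in range(iters):
        g, A, H = grad_f(x), jac_c(x), hess_L(x, lam)
        K = np.block([[H, A.T], [A, np.zeros((m, m))]])
        rhs = -np.concatenate([g + A.T @ lam, c(x)])
        step = np.linalg.solve(K, rhs)
        x, lam = x + step[:n], lam + step[n:]
    return x, lam
```

For a convex quadratic objective with a linear constraint, one Newton-KKT step already lands on the exact primal-dual solution; a production code such as SNOPT additionally handles inequalities, sparsity, and globalization.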
INEXACT JOSEPHY–NEWTON FRAMEWORK FOR GENERALIZED EQUATIONS AND ITS APPLICATIONS TO LOCAL ANALYSIS OF NEWTONIAN METHODS FOR CONSTRAINED OPTIMIZATION
, 2008
Abstract

Cited by 8 (6 self)
We propose and analyze a perturbed version of the classical Josephy–Newton method for solving generalized equations. This perturbed framework is convenient for treating in a unified way standard sequential quadratic programming, its stabilized version, sequential quadratically constrained quadratic programming, and linearly constrained Lagrangian methods. For the linearly constrained Lagrangian methods, in particular, we obtain superlinear convergence under the second-order sufficient optimality condition and the strict Mangasarian–Fromovitz constraint qualification, while previous results in the literature assume (in addition to second-order sufficiency) the stronger linear independence constraint qualification as well as the strict complementarity condition. For the sequential quadratically constrained quadratic programming methods, we prove primal-dual superlinear/quadratic convergence under the same assumptions as above, which also gives a new result.
A two-sided relaxation scheme for mathematical programs with equilibrium constraints
 SIAM J. Optim
, 2005
Abstract

Cited by 7 (0 self)
Abstract. We propose a relaxation scheme for mathematical programs with equilibrium constraints (MPECs). In contrast to previous approaches, our relaxation is two-sided: both the complementarity and the nonnegativity constraints are relaxed. The proposed relaxation update rule guarantees (under certain conditions) that the sequence of relaxed subproblems will maintain a strictly feasible interior, even in the limit. We show how the relaxation scheme can be used in combination with a standard interior-point method to achieve superlinear convergence. Numerical results on the MacMPEC test problem set demonstrate the fast local convergence properties of the approach. Key words. nonlinear programming, mathematical programs with equilibrium constraints, complementarity constraints, constrained minimization, interior-point methods, primal-dual methods,
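The two-sided idea can be made concrete on a single complementarity pair. The sketch below uses illustrative parameter names (`delta`, `theta`) that are not taken from the paper: the nonnegativity bounds are relaxed downward by `delta` and the complementarity product is allowed up to `theta`, so the relaxed set has a strictly feasible interior where the exact set `x >= 0, y >= 0, x*y = 0` has none.

```python
def relaxed_feasible(x, y, delta, theta):
    # Two-sided relaxation of the complementarity system
    #   x >= 0, y >= 0, x*y = 0
    # (parameter names delta, theta are illustrative, not the paper's):
    # nonnegativity is relaxed by -delta and complementarity by theta.
    return x >= -delta and y >= -delta and x * y <= theta
```

A point like `(0.01, 0.01)` violates exact complementarity but lies strictly inside the relaxed region, which is what lets a standard interior-point method operate on the subproblems.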
SHARP PRIMAL SUPERLINEAR CONVERGENCE RESULTS FOR SOME NEWTONIAN METHODS FOR CONSTRAINED OPTIMIZATION
, 2009
Abstract

Cited by 5 (5 self)
As is well known, superlinear or quadratic convergence of the primal-dual sequence generated by an optimization algorithm does not, in general, imply superlinear convergence of the primal part. Primal convergence, however, is often of particular interest. For the sequential quadratic programming (SQP) algorithm, local primal-dual quadratic convergence can be established under the assumptions of uniqueness of the Lagrange multiplier associated with the solution and the second-order sufficient condition. At the same time, previous primal superlinear convergence results for SQP required strengthening the first assumption to the linear independence constraint qualification. In this paper, we show that this strengthening of assumptions is actually not necessary. Specifically, we show that once primal-dual convergence is assumed or already established, for the primal superlinear rate one only needs a certain error bound estimate. This error bound holds, for example, under the second-order sufficient condition, which is needed for primal-dual local analysis in any case. Moreover, in some situations even second-order sufficiency can be relaxed to the weaker assumption that the multiplier in question is noncritical. Our study is performed for a rather general perturbed SQP framework, which covers, in addition to SQP and quasi-Newton SQP, some other algorithms as well. For example, as a byproduct,
A primal-dual augmented Lagrangian
 Computational Optimization and Applications
, 2010
Abstract

Cited by 5 (1 self)
Abstract. Nonlinearly constrained optimization problems can be solved by minimizing a sequence of simpler unconstrained or linearly constrained subproblems. In this paper, we discuss the formulation of subproblems in which the objective is a primal-dual generalization of the Hestenes–Powell augmented Lagrangian function. This generalization has the crucial feature that it is minimized with respect to both the primal and the dual variables simultaneously. A benefit of this approach is that the quality of the dual variables is monitored explicitly during the solution of the subproblem. Moreover, each subproblem may be regularized by imposing explicit bounds on the dual variables. Two primal-dual variants of conventional primal methods are proposed: a primal-dual bound-constrained Lagrangian (pdBCL) method and a primal-dual ℓ1 linearly constrained Lagrangian (pdℓ1-LCL) method. Key words. Nonlinear programming, nonlinear inequality constraints, augmented Lagrangian methods, bound-constrained Lagrangian methods, linearly constrained Lagrangian methods, primal-dual methods. AMS subject classifications. 49J20, 49J15, 49M37, 49D37, 65F05, 65K05, 90C30
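One common form of such a primal-dual generalization is sketched below; the precise functional used in the paper may differ, so treat the formula in the comment as an assumption. The key structural point from the abstract is visible in the signature: the merit function takes both the primal `x` and the dual `y` as arguments and is meant to be minimized jointly in the pair, with `y_e` a multiplier estimate and `mu > 0` a penalty parameter.

```python
import numpy as np

def pd_augmented_lagrangian(f, c, x, y, y_e, mu, nu=1.0):
    # Sketch of a primal-dual augmented Lagrangian of Hestenes-Powell type,
    # minimized jointly in (x, y). Assumed form (may differ from the paper):
    #   M(x, y) = f(x) - c(x)'y_e + ||c(x)||^2 / (2 mu)
    #             + nu * ||c(x) + mu (y - y_e)||^2 / (2 mu)
    cx = c(x)
    r = cx + mu * (y - y_e)
    return f(x) - cx @ y_e + (cx @ cx) / (2 * mu) + nu * (r @ r) / (2 * mu)
```

Note the monitoring role of the last term: at a feasible point with `y = y_e` all penalty terms vanish and the function reduces to `f(x)`, while a poor dual estimate `y` inflates the objective even when `x` is feasible.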
A TRUNCATED SQP METHOD BASED ON INEXACT INTERIOR-POINT SOLUTIONS OF SUBPROBLEMS
Abstract

Cited by 4 (4 self)
Abstract. We consider sequential quadratic programming (SQP) methods applied to optimization problems with nonlinear equality constraints and simple bounds. In particular, we propose and analyze a truncated SQP algorithm in which subproblems are solved approximately by an infeasible predictor-corrector interior-point method, followed by setting to zero some variables and some multipliers so that complementarity conditions for approximate solutions are enforced. Verifiable truncation conditions based on the residual of the optimality conditions of subproblems are developed to ensure both global and fast local convergence. Global convergence is established under assumptions that are standard for line-search SQP with exact solution of subproblems. The local superlinear convergence rate is shown under the weakest assumptions that guarantee this property for pure SQP with exact solution of subproblems, namely, the strict Mangasarian–Fromovitz constraint qualification and second-order sufficiency. Local convergence results for our truncated method are presented as a special case of the local convergence for a more general perturbed SQP framework, which is of independent interest and is applicable even to some algorithms whose subproblems are not quadratic programs. For example, the framework can also be used to derive sharp local convergence results for linearly constrained Lagrangian methods. Preliminary numerical results confirm that it can indeed be beneficial to solve subproblems approximately, especially on early iterations. Key words. sequential quadratic programming, inexact sequential quadratic programming, truncated sequential quadratic programming, interior-point method, superlinear convergence
Augmented Lagrangian Techniques for Solving Saddle Point Linear Systems
 SIAM J. Matrix Anal. Appl
, 2004
Abstract

Cited by 2 (1 self)
We perform an algebraic analysis of a generalization of the augmented Lagrangian method for the solution of saddle point linear systems. It is shown that in cases where the (1,1) block is singular, specifically semidefinite, a low-rank perturbation that minimizes the condition number of the perturbed matrix while maintaining sparsity is an effective approach. The vectors used for generating the perturbation are columns of the constraint matrix that form a small angle with the null space of the original (1,1) block. Block preconditioning techniques of a similar flavor are also discussed and analyzed, and the theoretical observations are illustrated and validated by numerical results.
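The augmentation idea can be seen on a two-by-two toy example (the data below is illustrative, not from the paper): the (1,1) block `A` is semidefinite and singular, and adding a low-rank term built from the constraint matrix `B` restores nonsingularity precisely because `B`'s row lies along the null space of `A`.

```python
import numpy as np

# Toy saddle-point data: A is the semidefinite, singular (1,1) block and
# B the constraint matrix. The classical augmented Lagrangian perturbation
# A + gamma * B^T B is low-rank (rank 1 here) and nonsingular whenever
# null(A) and null(B) intersect only at zero.
A = np.diag([1.0, 0.0])        # singular, positive semidefinite (1,1) block
B = np.array([[0.0, 1.0]])     # constraint row; it spans null(A)
gamma = 10.0                   # augmentation weight (illustrative choice)
A_aug = A + gamma * B.T @ B    # rank-1 perturbation of the (1,1) block
```

The paper's refinement is to build the perturbation from only those columns of the constraint matrix that form a small angle with null(A), and to choose the weight to minimize the condition number of the perturbed block; here the single constraint row already does the job.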
GLOBAL AND FINITE TERMINATION OF A TWO-PHASE AUGMENTED LAGRANGIAN FILTER METHOD FOR GENERAL QUADRATIC PROGRAMS
, 2007
Abstract

Cited by 1 (0 self)
We present a two-phase algorithm for solving large-scale quadratic programs (QPs). In the first phase, gradient-projection iterations approximately minimize a bound-constrained augmented Lagrangian function and provide an estimate of the optimal active set. In the second phase, an equality-constrained QP defined by the current active set is approximately minimized in order to generate a second-order search direction. A filter determines the required accuracy of the subproblem solutions and provides an acceptance criterion for the search directions. The resulting algorithm is globally and finitely convergent. The algorithm is suitable for large-scale problems with many degrees of freedom, and provides an alternative to interior-point methods when iterative methods must be used to solve the underlying linear systems. Numerical experiments on a subset of the CUTEr QP test problems demonstrate the effectiveness of the approach.
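The first-phase workhorse, a gradient-projection iteration on a box-constrained function, is simple enough to sketch. This is a generic illustration under assumed names, not the paper's implementation: one steepest-descent step followed by projection onto the bounds; components that land on a bound are the active-set estimate.

```python
import numpy as np

def projected_gradient_step(x, grad, alpha, lo, hi):
    # One gradient-projection iteration for min phi(x) s.t. lo <= x <= hi:
    # take a steepest-descent step of length alpha, then project back onto
    # the box. (Phase one of the two-phase method applies such iterations
    # to a bound-constrained augmented Lagrangian; names are illustrative.)
    return np.clip(x - alpha * grad, lo, hi)
```

The projection is just a componentwise clip because the feasible set is a box; for `phi(x) = ||x||^2` on `[0, 1]^2`, a step from `x = (2, -3)` clips the second component to its lower bound, flagging it as active.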