Results 1–10 of 36
SNOPT: An SQP Algorithm For Large-Scale Constrained Optimization
, 2002
Abstract
Cited by 597 (24 self)
Sequential quadratic programming (SQP) methods have proved highly effective for solving constrained optimization problems with smooth nonlinear functions in the objective and constraints. Here we consider problems with general inequality constraints (linear and nonlinear). We assume that first derivatives are available, and that the constraint gradients are sparse. We discuss
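The abstract above describes the SQP approach in general terms. As a minimal illustration of the idea (this is NOT SNOPT itself), the sketch below solves a small smooth problem with general linear inequality constraints using SciPy's SLSQP, which is an SQP implementation; the problem data and starting point are illustrative only.

```python
# Minimal SQP illustration with SciPy's SLSQP solver (not SNOPT).
# Problem: minimize f(x) = (x0 - 1)^2 + (x1 - 2.5)^2 subject to three
# linear inequality constraints and nonnegativity bounds.
import numpy as np
from scipy.optimize import minimize

f = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2

# SciPy's convention writes inequality constraints as g(x) >= 0.
cons = [
    {"type": "ineq", "fun": lambda x: x[0] - 2.0 * x[1] + 2.0},
    {"type": "ineq", "fun": lambda x: -x[0] - 2.0 * x[1] + 6.0},
    {"type": "ineq", "fun": lambda x: -x[0] + 2.0 * x[1] + 2.0},
]

res = minimize(f, x0=np.array([2.0, 0.0]), method="SLSQP",
               constraints=cons, bounds=[(0, None), (0, None)])
print(res.x)  # converges to (1.4, 1.7) for this problem
```

At each iteration SLSQP, like other SQP methods, solves a quadratic programming subproblem built from the objective's Hessian approximation and linearized constraints.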
Sequential Quadratic Programming
, 1995
Abstract
Cited by 166 (4 self)
this paper we examine the underlying ideas of the SQP method and the theory that establishes it as a framework from which effective algorithms can
An interior point algorithm for large-scale nonlinear . . .
, 2002
Abstract
Cited by 64 (3 self)
Nonlinear programming (NLP) has become an essential tool in process engineering, leading to profit gains through improved plant designs and better control strategies. The rapid advance in computer technology enables engineers to consider increasingly complex systems, where existing optimization codes reach their practical limits. The objective of this dissertation is the design, analysis, implementation, and evaluation of a new NLP algorithm that is able to overcome the current bottlenecks, particularly in the area of process engineering. The proposed algorithm follows an interior point approach, thereby avoiding the combinatorial complexity of identifying the active constraints. Emphasis is laid on flexibility in the computation of search directions, which allows the tailoring of the method to individual applications and is mandatory for the solution of very large problems. In a full-space version the method can be used as a general-purpose NLP solver, for example in modeling environments such as Ampl. The reduced-space version, based on coordinate decomposition, makes it possible to tailor linear algebra
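The interior point approach mentioned above can be summarized by the standard log-barrier reformulation (generic textbook notation, not taken from the dissertation itself):

```latex
% Replace the inequalities x \ge 0 by a logarithmic barrier term and solve
% a sequence of equality-constrained problems as \mu \downarrow 0:
\min_{x} \; f(x) - \mu \sum_{i=1}^{n} \ln x_i
\quad \text{subject to} \quad c(x) = 0 .
```

Because the barrier keeps iterates strictly in the interior of the feasible region, no active set of inequality constraints ever needs to be identified, which is what avoids the combinatorial complexity referred to in the abstract.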
A reduced Hessian method for large-scale constrained optimization
SIAM Journal on Optimization
, 1995
"... ..."
Quadratically And Superlinearly Convergent Algorithms For The Solution Of Inequality Constrained Minimization Problems
, 1995
Abstract
Cited by 35 (12 self)
In this paper some Newton and quasi-Newton algorithms for the solution of inequality-constrained minimization problems are considered. All the algorithms described produce sequences {x_k} converging q-superlinearly to the solution. Furthermore, under mild assumptions, a q-quadratic convergence rate in x is also attained. Other features of these algorithms are that only the solution of linear systems of equations is required at each iteration and that the strict complementarity assumption is never invoked. First the superlinear or quadratic convergence rate of a Newton-like algorithm is proved. Then, a simpler version of this algorithm is studied and it is shown that it is superlinearly convergent. Finally, quasi-Newton versions of the previous algorithms are considered and, provided the sequence defined by the algorithms converges, a characterization of superlinear convergence extending the result of Boggs, Tolle and Wang is given. Key Words. Inequality-constrained optimization, New...
A sequential quadratic programming algorithm using an incomplete solution of the subproblem
SIAM Journal on Optimization
, 1995
Abstract
Cited by 31 (2 self)
Any opinions, findings, and conclusions or recommendations expressed in this publication are those of the authors and do NOT necessarily reflect the views of the above sponsors.
Some theoretical properties of an augmented Lagrangian merit function
 in Advances in Optimization and Parallel Computing
, 1992
Abstract
Cited by 28 (7 self)
Sequential quadratic programming (SQP) methods for nonlinearly constrained optimization typically use a merit function to enforce convergence from an arbitrary starting point. We define a smooth augmented Lagrangian merit function in which the Lagrange multiplier estimate is treated as a separate variable, and inequality constraints are handled by means of nonnegative slack variables that are included in the linesearch. Global convergence is proved for an SQP algorithm that uses this merit function. We also prove that steps of unity are accepted in a neighborhood of the solution when this merit function is used in a suitable superlinearly convergent algorithm. Finally, some numerical results are presented to illustrate the performance of the associated SQP method.
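A merit function of the kind the abstract describes, with the multiplier estimate λ as a separate variable and nonnegative slacks s for inequalities c(x) ≥ 0, is commonly written in the following augmented Lagrangian form (generic notation, not copied from the paper; ρ is the penalty parameter):

```latex
% Smooth augmented Lagrangian merit function: the line search is carried
% out jointly in (x, \lambda, s), with the slacks s kept nonnegative.
\mathcal{M}_{\rho}(x, \lambda, s)
  = f(x) - \lambda^{T}\bigl(c(x) - s\bigr)
  + \tfrac{\rho}{2}\,\bigl\|c(x) - s\bigr\|_2^2,
\qquad s \ge 0 .
```

Treating λ and s as line-search variables is what makes the merit function smooth, which in turn permits the acceptance of unit steps near the solution that the abstract mentions.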
A Global Convergence Analysis Of An Algorithm For Large Scale Nonlinear Optimization Problems
, 1996
Abstract
Cited by 21 (6 self)
In this paper we give a global convergence analysis of a basic version of an SQP algorithm described in [2] for the solution of large-scale nonlinear inequality-constrained optimization problems. Several procedures and options have been added to the basic algorithm to improve the practical performance; some of these are also analyzed. The important features of the algorithm include the use of a constrained merit function to assess the progress of the iterates and a sequence of approximate merit functions that are less expensive to evaluate. It also employs an interior point quadratic programming solver that can be terminated early to produce a truncated step. Key words. Sequential Quadratic Programming, Global Convergence, Merit Function, Large Scale Problems. AMS subject classifications. 49M37, 65K05, 90C30. 1. Introduction. In this report we consider an algorithm to solve the inequality-constrained minimization problem

min_x f(x) subject to g(x) ≤ 0,   (1.1)

where x ∈ R^n, and...
Global and local convergence of line search filter methods for nonlinear programming
Abstract
Cited by 18 (4 self)
Line search methods for nonlinear programming using Fletcher and Leyffer’s filter method, which replaces the traditional merit function, are proposed and their global and local convergence properties are analyzed. Previous theoretical work on filter methods has considered trust region algorithms and only the question of global convergence. The presented framework is applied to barrier interior point and active set SQP algorithms. Under mild assumptions it is shown that every limit point of the sequence of iterates generated by the algorithm is feasible, and that there exists at least one limit point that is a stationary point for the problem under consideration. Furthermore, it is shown that the proposed methods do not suffer from the Maratos effect if the search directions are improved by second order corrections, so that fast local convergence to strict local solutions is achieved. A new alternative filter approach employing the Lagrangian function instead of the objective function with identical global convergence properties is briefly discussed.
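The core of the filter mechanism replacing the merit function is a simple dominance test: a trial point is accepted only if no stored pair of (objective value, constraint violation) dominates it. The toy sketch below illustrates that test; the function name, margin parameter, and sample values are invented for illustration and are not taken from the paper.

```python
# Toy sketch of a Fletcher-Leyffer-style filter acceptance test.
# A filter stores pairs (f_k, theta_k) of objective value and constraint
# violation; a trial point is rejected iff some entry dominates it, i.e.
# the trial is worse (up to a small margin gamma) in BOTH measures.
def acceptable(filter_entries, f_trial, theta_trial, gamma=1e-5):
    """Return True if (f_trial, theta_trial) is not dominated by the filter."""
    for f_k, theta_k in filter_entries:
        worse_f = f_trial >= f_k - gamma * theta_k
        worse_theta = theta_trial >= (1.0 - gamma) * theta_k
        if worse_f and worse_theta:
            return False  # dominated by this filter entry
    return True

flt = [(1.0, 0.5), (0.7, 0.9)]
print(acceptable(flt, 0.9, 0.1))   # improves violation vs. both entries
print(acceptable(flt, 1.2, 0.95))  # dominated by (1.0, 0.5)
```

Because acceptance only requires improvement in one of the two measures, the filter avoids the penalty-parameter tuning a merit function needs, at the cost of the Maratos-effect safeguards (such as second order corrections) discussed in the abstract.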