Results 1–10 of 12
Sequential Quadratic Programming
1995
Cited by 114 (2 self)
Abstract: In this paper we examine the underlying ideas of the SQP method and the theory that establishes it as a framework from which effective algorithms can ...
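For reference, the framework this survey describes is the standard SQP iteration: at iterate $x_k$ with multiplier estimate $\lambda_k$, the step $p$ solves a quadratic program of the textbook form below (a generic statement, not quoted from the paper):

```latex
\begin{aligned}
\min_{p}\;& \nabla f(x_k)^{T} p + \tfrac{1}{2}\, p^{T} \nabla^{2}_{xx} \mathcal{L}(x_k,\lambda_k)\, p \\
\text{s.t.}\;& \nabla c_i(x_k)^{T} p + c_i(x_k) = 0, \quad i \in \mathcal{E}, \\
& \nabla c_i(x_k)^{T} p + c_i(x_k) \ge 0, \quad i \in \mathcal{I}.
\end{aligned}
```

Here $\mathcal{L}$ is the Lagrangian and $\mathcal{E}$, $\mathcal{I}$ index the equality and inequality constraints; practical algorithms replace the exact Hessian with a quasi-Newton approximation.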
An interior point algorithm for large scale nonlinear programming
SIAM Journal on Optimization, 1999
Cited by 74 (17 self)
Abstract: The design and implementation of a new algorithm for solving large nonlinear programming problems is described. It follows a barrier approach that employs sequential quadratic programming and trust regions to solve the subproblems occurring in the iteration. Both primal and primal-dual versions of the algorithm are developed, and their performance is illustrated in a set of numerical tests. Key words: constrained optimization, interior point method, large-scale optimization, nonlinear programming, primal method, primal-dual method, successive quadratic programming, trust region method.
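The barrier approach mentioned here, sketched in its standard form (the general idea, not the paper's exact formulation), replaces the inequalities by slacks $s > 0$ and solves a sequence of equality-constrained subproblems as the barrier parameter $\mu$ is driven to zero:

```latex
\min_{x,\,s}\; f(x) - \mu \sum_{i \in \mathcal{I}} \ln s_i
\quad \text{s.t.} \quad
c_{\mathcal{E}}(x) = 0, \qquad c_{\mathcal{I}}(x) - s = 0 .
```

Each subproblem is solved only approximately, in this paper by SQP steps restricted to a trust region, before $\mu$ is reduced.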
An SQP Method For General Nonlinear Programs Using Only Equality Constrained Subproblems
Mathematical Programming, 1993
Cited by 46 (2 self)
Abstract: In this paper we describe a new version of a sequential equality constrained quadratic programming method for general nonlinear programs with mixed equality and inequality constraints. Compared with an older version [34] it is much simpler to implement and allows any kind of change of the working set in every step. Our method relies on a strong regularity condition. As far as it is applicable, the new approach is superior to conventional SQP methods, as demonstrated by extensive numerical tests.
A reduced Hessian method for large-scale constrained optimization
SIAM Journal on Optimization, 1995
A Parallel Reduced Hessian SQP Method for Shape Optimization
Cited by 13 (4 self)
Abstract: We present a parallel reduced Hessian SQP method for smooth shape optimization of systems governed by nonlinear boundary value problems, for the case when the number of shape variables is much smaller than the number of state variables. The method avoids nonlinear resolution of the state equations at each design iteration by embedding them as equality constraints in the optimization problem. It makes use of a decomposition into nonorthogonal subspaces that exploits Jacobian and Hessian sparsity in an optimal fashion. The resulting algorithm requires the solution at each iteration of just two linear systems whose coefficient matrices are the state-variable Jacobian of the state equations, i.e. the stiffness matrix, and its transpose. The construction and solution of each of these two systems is performed in parallel, as are sensitivity computations associated with the state variables. The conventional parallelism present in a parallel PDE solver, both constructing and solvi...
Large-Scale Nonlinear Constrained Optimization: A Current Survey
, 1994
Cited by 9 (0 self)
Abstract: Much progress has been made in constrained nonlinear optimization in the past ten years, but most large-scale problems still represent a considerable obstacle. In this survey paper we will attempt to give an overview of the current approaches, including interior and exterior methods and algorithms based upon trust regions and line searches. In addition, the importance of software, numerical linear algebra and testing will be addressed. We will try to explain why the difficulties arise, how attempts are being made to overcome them and some of the problems that still remain. Although there will be some emphasis on the LANCELOT and CUTE projects, the intention is to give a broad picture of the state-of-the-art.
On the realization of the Wolfe conditions in reduced quasi-Newton methods for equality constrained optimization
SIAM Journal on Optimization, 1997
Cited by 5 (0 self)
Abstract: This paper describes a reduced quasi-Newton method for solving equality constrained optimization problems. A major difficulty encountered by this type of algorithm is the design of a consistent technique for maintaining the positive definiteness of the matrices approximating the reduced Hessian of the Lagrangian. A new approach is proposed in this paper. The idea is to search for the next iterate along a piecewise linear path. The path is designed so that some generalized Wolfe conditions can be satisfied. These conditions allow the algorithm to sustain the positive definiteness of the matrices from iteration to iteration by a mechanism that has turned out to be efficient in unconstrained optimization.
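The Wolfe conditions referred to in the title, stated here in their standard unconstrained form (the paper generalizes them along a piecewise linear path), require a step size $\alpha$ along a descent direction $d_k$ to satisfy, for constants $0 < \omega_1 < \omega_2 < 1$:

```latex
f(x_k + \alpha d_k) \le f(x_k) + \omega_1\, \alpha\, \nabla f(x_k)^{T} d_k,
\qquad
\nabla f(x_k + \alpha d_k)^{T} d_k \ge \omega_2\, \nabla f(x_k)^{T} d_k .
```

The second (curvature) condition guarantees $y_k^{T} s_k > 0$ for the step and gradient-change pair, which is precisely what the BFGS update needs to preserve positive definiteness; the difficulty in the constrained setting is realizing an analogous guarantee for the reduced Hessian.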
A Piecewise Line-Search Technique for Maintaining the Positive Definiteness of the Matrices in the SQP Method
, 1997
Cited by 4 (2 self)
Abstract: A technique for maintaining the positive definiteness of the matrices in the quasi-Newton version of the SQP algorithm is proposed. In our algorithm, matrices approximating the Hessian of the augmented Lagrangian are updated. The positive definiteness of these matrices in the space tangent to the constraint manifold is ensured by a so-called piecewise line-search technique, while their positive definiteness in a complementary subspace is obtained by setting the augmentation parameter. In our experiments, the combination of these two ideas leads to a new algorithm that turns out to be more robust and often improves the results obtained with other approaches.
Numerical experience with a reduced Hessian method for large-scale constrained optimization
Research Report (in preparation), EE and CS, Northwestern, 1993
Cited by 1 (0 self)
Abstract: The reduced Hessian SQP algorithm presented in [2] is developed in this paper into a practical method for large-scale optimization. The novelty of the algorithm lies in the incorporation of a correction vector that approximates the cross term Z^T W Y p_Y. This improves the stability and robustness of the algorithm without increasing its computational cost. The paper studies how to implement the algorithm efficiently, and presents a set of tests illustrating its numerical performance. An analytic example, showing the benefits of the correction term, is also presented.
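To place the cross term Z^T W Y p_Y: in reduced Hessian SQP the step is split as $p = Y p_Y + Z p_Z$, where the columns of $Z$ span the null space of the constraint Jacobian. In the standard form of the method (a textbook sketch, not the exact statement of [2]), the two components solve

```latex
(A^{T} Y)\, p_Y = -c,
\qquad
(Z^{T} W Z)\, p_Z = -Z^{T} \nabla f - Z^{T} W Y p_Y,
```

where $A$ holds the constraint gradients and $W$ is the Hessian of the Lagrangian. The algorithm described in this entry avoids forming the cross term $Z^{T} W Y p_Y$ exactly and instead approximates it by a cheap correction vector.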
A Quasi-Newton Quadratic Penalty Method For Minimization Subject To Nonlinear Equality Constraints
Cited by 1 (0 self)
Abstract: We present a modified quadratic penalty function method for equality constrained optimization problems. The pivotal feature of our algorithm is that at every iterate we invoke a special change of variables to improve the ability of the algorithm to follow the constraint level sets. This change of variables gives rise to a suitable block diagonal approximation to the Hessian which is then used to construct a quasi-Newton method. We show that the complete algorithm is globally convergent. Preliminary computational results are reported. Key words: nonlinearly constrained optimization, equality constraints, quasi-Newton methods, BFGS, quadratic penalty function, reduced Hessian approximation. AMS(MOS) subject classifications: 65K05, 65K10, 65H10, 90C30, 90C05, 68L10. 1. Introduction. One of the great success stories in continuous optimization is the development of effective quasi-Newton methods for unconstrained minimization (at least for problems of moderate size). Three important reas...
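A minimal sketch of the plain quadratic penalty idea this paper builds on: minimize Q(x; mu) = f(x) + ||c(x)||^2 / (2 mu) while driving mu toward zero. This is illustrative only, using steepest descent on Q with stdlib Python; the function name `quadratic_penalty` and all parameters are ours, and the paper's actual method is a quasi-Newton (BFGS) variant with a block-diagonal Hessian approximation and a change of variables, not this.

```python
def quadratic_penalty(f_grad, c, c_grad, x, mu=1.0, outers=8, inners=500):
    """Minimize f subject to the scalar equality constraint c(x) = 0 by
    steepest descent on the penalty function Q(x; mu), shrinking mu."""
    for _ in range(outers):
        # Q's Hessian stiffens like 1/mu, so shrink the step size with mu
        # to keep gradient descent stable on the penalized problem.
        step = 0.4 * mu / (1.0 + mu)
        for _ in range(inners):
            cx = c(x)
            # grad Q = grad f + (c(x) / mu) * grad c
            g = [fg + cg * cx / mu for fg, cg in zip(f_grad(x), c_grad(x))]
            x = [xi - step * gi for xi, gi in zip(x, g)]
        mu *= 0.1  # tighten the penalty; minimizers approach the constrained solution
    return x

# Example: minimize x^2 + y^2 subject to x + y = 1; the minimizer is (0.5, 0.5).
sol = quadratic_penalty(
    f_grad=lambda v: [2.0 * v[0], 2.0 * v[1]],
    c=lambda v: v[0] + v[1] - 1.0,
    c_grad=lambda v: [1.0, 1.0],
    x=[0.0, 0.0],
)
```

The ill-conditioning visible in the mu-dependent step size is exactly the weakness the paper's change of variables and block-diagonal quasi-Newton Hessian are designed to sidestep.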