Results 1–10 of 20
SNOPT: An SQP Algorithm for Large-Scale Constrained Optimization
, 2002
Abstract

Cited by 597 (24 self)
Sequential quadratic programming (SQP) methods have proved highly effective for solving constrained optimization problems with smooth nonlinear functions in the objective and constraints. Here we consider problems with general inequality constraints (linear and nonlinear). We assume that first derivatives are available, and that the constraint gradients are sparse. We discuss ...
User's Guide For SNOPT 5.3: A Fortran Package For Large-Scale Nonlinear Programming
, 1999
Abstract

Cited by 96 (2 self)
SNOPT is a general-purpose system for solving optimization problems involving many variables and constraints. It minimizes a linear or nonlinear function subject to bounds on the variables and sparse linear or nonlinear constraints. It is suitable for large-scale linear and quadratic programming and for linearly constrained optimization, as well as for general nonlinear programs. SNOPT finds solutions that are locally optimal, and ideally any nonlinear functions should be smooth and users should provide gradients. It is often more widely useful. For example, local optima are often global solutions, and discontinuities in the function gradients can often be tolerated if they are not too close to an optimum. Unknown gradients are estimated by finite differences. SNOPT uses a sequential quadratic programming (SQP) algorithm that obtains search directions from a sequence of quadratic programming subproblems. Each QP subproblem minimizes a quadratic model of a certain Lagrangian function subject to a linearization of the constraints. An augmented Lagrangian merit function is reduced along each search direction to ensure convergence from any starting point. SNOPT is most efficient if only some of the variables enter nonlinearly, or if the number of active constraints (including simple bounds) is nearly as large as the number of variables. SNOPT requires relatively few evaluations of the problem functions. Hence it is especially effective if the objective or constraint functions (and their gradients) are expensive to evaluate. The source code for SNOPT is suitable for any machine with a Fortran compiler. SNOPT may be called from a driver program (typically in Fortran, C or MATLAB). SNOPT can also be used as a stand-alone package, reading data in the MPS ...
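The QP-subproblem idea in this abstract can be illustrated on a toy problem. The sketch below (plain NumPy, not SNOPT) takes full SQP/Newton steps for a small equality-constrained problem by solving the KKT system of each QP subproblem; SNOPT itself additionally handles inequalities, sparsity, quasi-Newton Hessian approximations, and a merit-function line search. The problem and function names are illustrative.

```python
import numpy as np

# Toy equality-constrained problem (illustrative, not part of SNOPT):
#   minimize   f(x) = x1^2 + x2^2
#   subject to c(x) = x1 + x2 - 1 = 0

def f_grad(x):          # gradient of the objective
    return 2.0 * x

def c_val(x):           # constraint value
    return np.array([x[0] + x[1] - 1.0])

def c_jac(x):           # constraint Jacobian (1 x 2)
    return np.array([[1.0, 1.0]])

def sqp_solve(x, iters=10):
    """Bare-bones SQP: each step solves the KKT system of the QP subproblem
    min_p g^T p + 0.5 p^T H p  s.t.  A p + c = 0,
    with H the (here exact) Hessian of the Lagrangian."""
    n = len(x)
    H = 2.0 * np.eye(n)               # exact Hessian for this quadratic f
    for _ in range(iters):
        g, c, A = f_grad(x), c_val(x), c_jac(x)
        m = len(c)
        K = np.block([[H, A.T], [A, np.zeros((m, m))]])
        rhs = np.concatenate([-g, -c])
        sol = np.linalg.solve(K, rhs) # step p and the QP multipliers
        x = x + sol[:n]               # take the full SQP step
    return x

x_star = sqp_solve(np.array([3.0, -1.0]))
print(x_star)   # close to the solution [0.5, 0.5]
```

Because the toy objective is quadratic and the constraint linear, a single QP subproblem already solves the problem exactly; for general nonlinear functions the subproblems only model the Lagrangian locally and a line search becomes essential.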
User’s Guide for SNOPT Version 7: Software for Large-Scale Nonlinear Programming
Abstract

Cited by 49 (1 self)
SNOPT is a general-purpose system for constrained optimization. It minimizes a linear or nonlinear function subject to bounds on the variables and sparse linear or nonlinear constraints. It is suitable for large-scale linear and quadratic programming and for linearly constrained optimization, as well as for general nonlinear programs. SNOPT finds solutions that are locally optimal, and ideally any nonlinear functions should be smooth and users should provide gradients. It is often more widely useful. For example, local optima are often global solutions, and discontinuities in the function gradients can often be tolerated if they are not too close to an optimum. Unknown gradients are estimated by finite differences. SNOPT uses a sequential quadratic programming (SQP) algorithm. Search directions are obtained from QP subproblems that minimize a quadratic model of the Lagrangian function subject to linearized constraints. An augmented Lagrangian merit function is reduced along each search direction to ensure convergence from any starting point.
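The augmented Lagrangian merit function mentioned in this abstract can be sketched as follows. For min f(x) s.t. c(x) = 0 it has the form M(x) = f(x) - lam^T c(x) + 0.5 rho ||c(x)||^2, and the step length along a search direction is reduced until M decreases. The problem, multiplier, and penalty value below are illustrative choices, not SNOPT's.

```python
import numpy as np

def f(x):                               # toy objective
    return x[0]**2 + x[1]**2

def c(x):                               # toy equality constraint
    return np.array([x[0] + x[1] - 1.0])

def merit(x, lam, rho):
    """Augmented Lagrangian merit function M(x) = f - lam^T c + 0.5*rho*||c||^2."""
    cv = c(x)
    return f(x) - lam @ cv + 0.5 * rho * (cv @ cv)

def backtrack(x, p, lam, rho, shrink=0.5, max_halvings=20):
    """Simple backtracking: accept the first step length that reduces the merit."""
    m0 = merit(x, lam, rho)
    alpha = 1.0
    for _ in range(max_halvings):
        if merit(x + alpha * p, lam, rho) < m0:
            return alpha
        alpha *= shrink
    return alpha

x = np.array([3.0, -1.0])
p = np.array([-2.5, 1.5])               # a direction pointing at (0.5, 0.5)
alpha = backtrack(x, p, lam=np.array([1.0]), rho=10.0)
print(alpha)                            # the full step already reduces M here
```

A production line search would use a sufficient-decrease (Armijo-type) condition and update the multipliers and penalty; the point of the merit function is that it balances objective reduction against constraint violation, which is what makes convergence from arbitrary starting points possible.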
SQP Methods And Their Application To Numerical Optimal Control
, 1997
Abstract

Cited by 37 (0 self)
In recent years, general-purpose sequential quadratic programming (SQP) methods have been developed that can reliably solve constrained optimization problems with many hundreds of variables and constraints. These methods require remarkably few evaluations of the problem functions and can be shown to converge to a solution under very mild conditions on the problem. Some practical and theoretical aspects of applying general-purpose SQP methods to optimal control problems are discussed, including the influence of the problem discretization and the zero/nonzero structure of the problem derivatives. We conclude with some recent approaches that tailor the SQP method to the control problem. Key words. large-scale optimization, sequential quadratic programming (SQP) methods, optimal control problems, multiple shooting methods, single shooting methods, collocation methods AMS subject classifications. 49J20, 49J15, 49M37, 49D37, 65F05, 65K05, 90C30 1. Introduction. Recently there has been c...
Numerical Optimal Control Of Parabolic PDEs Using DASOPT
, 1997
Abstract

Cited by 16 (6 self)
This paper gives a preliminary description of DASOPT, a software system for the optimal control of processes described by time-dependent partial differential equations (PDEs). DASOPT combines the use of efficient numerical methods for solving differential-algebraic equations (DAEs) with a package for large-scale optimization based on sequential quadratic programming (SQP). DASOPT is intended for the computation of the optimal control of time-dependent nonlinear systems of PDEs in two (and eventually three) spatial dimensions, including possible inequality constraints on the state variables. By the use of either finite-difference or finite-element approximations to the spatial derivatives, the PDEs are converted into a large system of ODEs or DAEs. Special techniques are needed in order to solve this very large optimal control problem. The use of DASOPT is illustrated by its application to a nonlinear parabolic PDE boundary control problem in two spatial dimensions. Computational resu...
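The PDE-to-ODE conversion described in this abstract (the "method of lines") can be sketched for a 1-D heat equation. The grid size, boundary conditions, and explicit Euler integrator below are assumptions for illustration; DASOPT couples such a semi-discretization to a DAE solver and an SQP optimizer rather than a hand-rolled time stepper.

```python
import numpy as np

# Method-of-lines sketch: u_t = u_xx on [0, 1] with u = 0 at both ends
# becomes a system of ODEs by replacing u_xx with a central finite difference.

n = 49                        # interior grid points
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)
u = np.sin(np.pi * x)         # initial condition

def rhs(u):
    """du/dt for the semi-discretized heat equation (Dirichlet BCs)."""
    du = np.empty_like(u)
    du[0] = (u[1] - 2 * u[0]) / h**2
    du[-1] = (u[-2] - 2 * u[-1]) / h**2
    du[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / h**2
    return du

# Explicit Euler with a stable step (dt <= h^2 / 2), purely for illustration.
dt = 0.4 * h**2
for _ in range(2000):
    u = u + dt * rhs(u)

# The exact solution decays like exp(-pi^2 t); compare the peak values.
t = 2000 * dt
print(abs(u.max() - np.exp(-np.pi**2 * t)) < 1e-2)   # True
```

Once the PDE is in ODE/DAE form, the control enters through boundary or source terms and the time-discretized states and controls become the variables of the large sparse optimization problem handed to the SQP solver.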
User's Guide to SNOPT Version 6: A Fortran Package for Large-Scale Nonlinear Programming
, 2002
Abstract

Cited by 11 (0 self)
SNOPT is a general-purpose system for solving optimization problems involving many variables and constraints. It minimizes a linear or nonlinear function subject to bounds on the variables and sparse linear or nonlinear constraints. It is suitable for large-scale linear and quadratic programming and for linearly constrained optimization, as well as for general nonlinear programs. SNOPT finds solutions that are locally optimal, and ideally any nonlinear functions should be smooth and users should provide gradients. It is often more widely useful. For example, local optima are often global solutions, and discontinuities in the function gradients can often be tolerated if they are not too close to an optimum. Unknown gradients are estimated by finite differences. SNOPT uses a sequential quadratic programming (SQP) algorithm that obtains search directions from a sequence of quadratic programming subproblems. Each QP subproblem minimizes a quadratic model of a certain Lagrangian function subject to a linearization of the constraints. An augmented Lagrangian merit function is reduced ...
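The finite-difference fallback mentioned in this abstract ("Unknown gradients are estimated by finite differences") amounts to the following: each gradient component is approximated by a forward difference at the cost of one extra function evaluation per variable. The step size and test function below are illustrative choices.

```python
import numpy as np

def fd_gradient(f, x, h=1e-6):
    """Forward-difference estimate of grad f at x (one extra f-evaluation
    per variable; truncation error is O(h))."""
    fx = f(x)
    g = np.zeros_like(x, dtype=float)
    for i in range(len(x)):
        xp = x.copy()
        xp[i] += h
        g[i] = (f(xp) - fx) / h
    return g

def rosenbrock(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

x = np.array([1.0, 1.0])           # the minimizer, where the true gradient is 0
g = fd_gradient(rosenbrock, x)
print(g)                           # near zero, with O(h) truncation error
```

This is also why finite differencing is expensive for large n and why the abstracts recommend user-supplied gradients: n function evaluations per gradient quickly dominate the cost when function evaluations are expensive.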
Large-Scale Nonlinear Constrained Optimization: A Current Survey
, 1994
Abstract

Cited by 9 (0 self)
Much progress has been made in constrained nonlinear optimization in the past ten years, but most large-scale problems still represent a considerable obstacle. In this survey paper we will attempt to give an overview of the current approaches, including interior and exterior methods and algorithms based upon trust regions and line searches. In addition, the importance of software, numerical linear algebra and testing will be addressed. We will try to explain why the difficulties arise, how attempts are being made to overcome them and some of the problems that still remain. Although there will be some emphasis on the LANCELOT and CUTE projects, the intention is to give a broad picture of the state-of-the-art.
Numerical experience with a reduced Hessian method for large-scale constrained optimization
 Research Report (in preparation), EE and CS, Northwestern
, 1993
Abstract

Cited by 7 (0 self)
The reduced Hessian SQP algorithm presented in [2] is developed in this paper into a practical method for large-scale optimization. The novelty of the algorithm lies in the incorporation of a correction vector that approximates the cross term Z^T W Y p_Y. This improves the stability and robustness of the algorithm without increasing its computational cost. The paper studies how to implement the algorithm efficiently, and presents a set of tests illustrating its numerical performance. An analytic example, showing the benefits of the correction term, is also presented.
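A minimal sketch of the null-space quantities this abstract refers to, under the usual assumptions (constraint Jacobian A with full row rank, orthonormal bases Y and Z from a QR factorization of A^T, so that A Z = 0). The matrices here are made up for illustration; the paper's algorithm maintains a quasi-Newton approximation of the reduced Hessian Z^T W Z and approximates the cross term Z^T W Y p_Y rather than forming these products explicitly.

```python
import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])          # m = 2 constraints, n = 3 variables
W = np.diag([2.0, 4.0, 6.0])             # a sample Lagrangian Hessian

# QR of A^T: the first m columns of Q span range(A^T), the rest span null(A).
Q, _ = np.linalg.qr(A.T, mode="complete")
m = A.shape[0]
Y, Z = Q[:, :m], Q[:, m:]                # range- and null-space bases

print(np.allclose(A @ Z, 0.0))           # True: Z spans the null space of A
reduced_H = Z.T @ W @ Z                  # (n - m) x (n - m) reduced Hessian
cross = Z.T @ W @ Y                      # cross-term matrix Z^T W Y
print(reduced_H.shape, cross.shape)
```

The attraction of the reduced approach is that only the small (n - m)-dimensional reduced Hessian needs to be approximated, which is cheap precisely when the number of active constraints is close to the number of variables.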
A Barrier Algorithm for Large Nonlinear Optimization Problems
, 2003