Results 1–10 of 11
Sequential Quadratic Programming
, 1995
Abstract

Cited by 115 (2 self)
In this paper we examine the underlying ideas of the SQP method and the theory that establishes it as a framework from which effective algorithms can …
A Practical Algorithm For General Large Scale Nonlinear Optimization Problems
 SIAM Journal on Optimization
, 1994
Abstract

Cited by 22 (10 self)
We provide an effective and efficient implementation of a sequential quadratic programming (SQP) algorithm for the general large-scale nonlinear programming problem. In this algorithm the quadratic programming subproblems are solved by an interior-point method that can be prematurely halted by a trust-region constraint. Numerous computational enhancements to improve the numerical performance are presented. These include a dynamic procedure for adjusting the merit function parameter and procedures for adjusting the trust-region radius. Numerical results and comparisons are presented.

Key words: nonlinear programming, interior point, SQP, merit function, trust region, large scale

1. Introduction. In a series of recent papers, [3], [6], and [8], the authors have developed a new algorithmic approach for solving large, nonlinear, constrained optimization problems. This proposed procedure is, in essence, a sequential quadratic programming (SQP) method that uses an interior-point algorithm …
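The "dynamic procedure for adjusting the merit function parameter" mentioned in this abstract can be illustrated with an l1 merit function and a classical penalty-parameter safeguard. This is a hedged sketch: the paper's actual update rule is not reproduced, and every name and number below is illustrative.

```python
import numpy as np

# l1 merit function: phi(x; mu) = f(x) + mu * ||h(x)||_1.
# It trades off the objective against constraint infeasibility.
def merit(f, h, x, mu):
    return f(x) + mu * np.sum(np.abs(h(x)))

f = lambda x: x[0] ** 2 + x[1] ** 2          # toy objective (illustrative)
h = lambda x: np.array([x[0] + x[1] - 1.0])  # toy equality constraint

x = np.array([2.0, -3.0])
mu, lam_est = 1.0, 5.0    # lam_est: hypothetical multiplier estimate from the QP subproblem
if mu <= lam_est:         # classical safeguard: keep mu above the multiplier estimate
    mu = 2.0 * lam_est
print(merit(f, h, x, mu))  # f = 13, ||h||_1 = 2, mu = 10  ->  33.0
```

Keeping mu larger than the current multiplier estimates is the standard condition under which a step that decreases the QP model also decreases the merit function.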
A Global Convergence Analysis Of An Algorithm For Large Scale Nonlinear Optimization Problems
, 1996
Abstract

Cited by 14 (4 self)
In this paper we give a global convergence analysis of a basic version of an SQP algorithm described in [2] for the solution of large-scale nonlinear inequality-constrained optimization problems. Several procedures and options have been added to the basic algorithm to improve the practical performance; some of these are also analyzed. The important features of the algorithm include the use of a constrained merit function to assess the progress of the iterates and a sequence of approximate merit functions that are less expensive to evaluate. It also employs an interior-point quadratic programming solver that can be terminated early to produce a truncated step.

Key words. Sequential Quadratic Programming, Global Convergence, Merit Function, Large Scale Problems.

AMS subject classifications. 49M37, 65K05, 90C30

1. Introduction. In this report we consider an algorithm to solve the inequality-constrained minimization problem

    min_x f(x)  subject to  g(x) ≤ 0,    (1.1)

where x ∈ R^n, and …
Computational Experience of an Interior-Point SQP Algorithm in a Parallel Branch-and-Bound Framework
Abstract

Cited by 9 (3 self)
An interior-point algorithm within a parallel branch-and-bound framework for solving nonlinear mixed-integer programs is described. The nonlinear programming relaxations at each node are solved using an interior-point SQP method. In contrast to solving the relaxation to optimality at each tree node, the relaxation is only solved to near-optimality. Analogous to employing advanced bases in simplex-based linear MIP solvers, a “dynamic” collection of warm-start vectors is kept to provide “advanced warm starts” at each branch-and-bound node. The code has the capability to run in both shared-memory and distributed-memory parallel environments. Preliminary computational results on various classes of linear mixed integer programs and quadratic portfolio problems are presented.
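The branch-and-bound skeleton this abstract builds on can be sketched in a few lines. This is an illustrative toy only: an LP relaxation (via scipy's `linprog`) stands in for the nonlinear interior-point SQP relaxation of the paper, there are no warm starts, and the problem data are invented.

```python
import numpy as np
from scipy.optimize import linprog

# Toy integer program (illustrative, not from the paper):
#   maximize 5*x1 + 4*x2  s.t.  6*x1 + 4*x2 <= 24,  x1 + 2*x2 <= 6,  x integer >= 0.
c = np.array([-5.0, -4.0])                 # linprog minimizes, so negate
A_ub = np.array([[6.0, 4.0], [1.0, 2.0]])
b_ub = np.array([24.0, 6.0])

best_obj, best_x = np.inf, None
stack = [[(0, None), (0, None)]]           # per-variable (lb, ub) bounds per node

while stack:
    bounds = stack.pop()
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    if not res.success or res.fun >= best_obj:
        continue                           # infeasible node, or pruned by bound
    frac = [i for i, v in enumerate(res.x) if abs(v - round(v)) > 1e-6]
    if not frac:                           # integral relaxation: new incumbent
        best_obj, best_x = res.fun, res.x.round()
        continue
    i, v = frac[0], res.x[frac[0]]
    left, right = list(bounds), list(bounds)
    left[i] = (bounds[i][0], np.floor(v))  # branch: x_i <= floor(v)
    right[i] = (np.ceil(v), bounds[i][1])  # branch: x_i >= ceil(v)
    stack += [left, right]

print(best_x, -best_obj)                   # optimal integer point and value
```

The pruning test `res.fun >= best_obj` is where solving relaxations only to near-optimality pays off: a node can often be fathomed from a bound long before its relaxation is solved exactly.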
Optimal Signal Sets For Non-Gaussian Detectors
 SIAM Journal on Optimization
, 1997
Abstract

Cited by 9 (2 self)
Identifying a maximally-separated set of signals is important in the design of modems. The notion of optimality is dependent on the model chosen to describe noise in the measurements; while some analytic results can be derived under the assumption of Gaussian noise, no such techniques are known for choosing signal sets in the non-Gaussian case. To obtain numerical solutions for non-Gaussian detectors, minimax problems are transformed into nonlinear programs, resulting in a novel formulation yielding problems with relatively few variables and many inequality constraints. Using sequential quadratic programming, optimal signal sets are obtained for a variety of noise distributions.

Key words. Optimal Design, Inequality Constraints, Sequential Quadratic Programming
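The minimax-to-NLP transformation this abstract relies on is standard: minimize max_i f_i(x) is equivalent to min_{x,t} t subject to f_i(x) ≤ t for all i, which yields exactly the "few variables, many inequality constraints" shape described. A minimal sketch, with illustrative stand-in functions (not the paper's signal-design objectives) and scipy's SLSQP as a generic SQP solver:

```python
import numpy as np
from scipy.optimize import minimize

# Three toy functions whose pointwise max we want to minimize.
fs = [lambda x: (x[0] - 1.0) ** 2,
      lambda x: (x[0] + 1.0) ** 2,
      lambda x: 0.5 * x[0] ** 2 + 0.25]

# One inequality constraint per f_i:  t - f_i(x) >= 0,  with z = (x, t).
cons = [{"type": "ineq", "fun": (lambda z, f=f: z[1] - f(z[:1]))} for f in fs]

# Minimize the bound t itself; each extra f_i adds a constraint, not a variable.
res = minimize(lambda z: z[1], x0=[3.0, 10.0], method="SLSQP", constraints=cons)
print(res.x)  # x near 0, where (x-1)^2 and (x+1)^2 cross at value 1
```

Each additional objective adds only one inequality constraint, so the variable count stays small however many terms the max ranges over.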
Sequential Quadratic Programming for LargeScale Nonlinear Optimization
, 1999
Abstract

Cited by 5 (0 self)
The sequential quadratic programming (SQP) algorithm has been one of the most successful general methods for solving nonlinear constrained optimization problems. We provide an introduction to the general method and show its relationship to recent developments in interior-point approaches. We emphasize large-scale aspects.

Key words: sequential quadratic programming, nonlinear optimization, Newton methods, interior-point methods, local convergence, global convergence

Preprint submitted to Elsevier, 1 July 1999.

1 Introduction. In this article we consider the general method of Sequential Quadratic Programming (hereafter denoted SQP) for solving the nonlinear programming problem

    minimize_x f(x)  subject to  h(x) = 0,  g(x) ≤ 0,    (NLP)

where f: R^n → R, h: R^n → R^m, and g: R^n → R^p. Broadly defined, the SQP method is a procedure that generates iterates converging …
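In the equality-constrained special case of (NLP), the core SQP building block is solving, at each iterate, the KKT system of the QP subproblem: [H Aᵀ; A 0][p; λ] = [−∇f; −h]. A minimal sketch on an invented toy problem (all names and data are illustrative, not from the survey):

```python
import numpy as np

# Toy equality-constrained problem:
#   minimize f(x) = x1^2 + x2^2   subject to   h(x) = x1 + x2 - 1 = 0.
f_grad = lambda x: 2.0 * x
f_hess = lambda x: 2.0 * np.eye(2)
h = lambda x: np.array([x[0] + x[1] - 1.0])
h_jac = lambda x: np.array([[1.0, 1.0]])

x = np.array([2.0, -3.0])
for _ in range(10):
    H, A = f_hess(x), h_jac(x)
    # KKT matrix of the QP subproblem: [H A^T; A 0].
    K = np.block([[H, A.T], [A, np.zeros((1, 1))]])
    step = np.linalg.solve(K, -np.concatenate([f_grad(x), h(x)]))
    x = x + step[:2]  # keep the primal step p, discard the multiplier

print(np.round(x, 6))  # converges to [0.5, 0.5]
```

Since the objective is quadratic and the constraint linear, one iteration already lands on the solution; for genuinely nonlinear problems the loop produces the Newton-like local convergence the survey discusses, and large-scale variants replace the dense KKT solve with iterative or interior-point methods.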
Computational experience of an interior-point algorithm in a parallel branch-and-cut framework
 In Proceedings of the SIAM Conference on Parallel Processing for Scientific Computing
, 1997
Abstract

Cited by 4 (0 self)
An interior-point algorithm within a branch-and-bound framework for solving nonlinear mixed-integer programs is described. In contrast to solving the relaxation to optimality at each tree node, the relaxation is only solved to near-optimality. Analogous to using advanced bases for warm-start solutions in the case of linear MIP, a "dynamic" collection of warm-start vectors is kept. Computational results on various classes of nonlinear mixed integer programs are presented.
On the Convergence of a Trust Region SQP Algorithm for Nonlinearly Constrained Optimization Problems
, 1995
Abstract

Cited by 2 (2 self)
In (Boggs, Tolle and Kearsley 1994b) the authors introduced an effective algorithm for general large-scale nonlinear programming problems. In this paper we describe the theoretical foundation for this method. The algorithm is based on a trust-region, sequential quadratic programming (SQP) technique and uses a special auxiliary function, called a merit function or line-search function, for assessing the steps that are generated. A global convergence theorem for a basic version of the algorithm is stated and its proof is outlined.
The Use of Optimization Techniques in the Solution of Partial Differential Equations from … (Rice University thesis)
, 1996
Abstract
Acknowledgments. This thesis is a very important milestone in a journey I began more than ten years ago. People too numerous to mention have helped me along the way; a few are singled out here. When I was an undergraduate at the University of Maryland, Baltimore County, the Mathematics faculty, in particular Professors James Greenberg, Søren Jensen, and Marc Teboulle, taught me to love applied mathematics; their patience with me was endless and I will always be grateful to them.