Results 1–10 of 33
SNOPT: An SQP Algorithm for Large-Scale Constrained Optimization
, 1997
Abstract

Cited by 328 (18 self)
Sequential quadratic programming (SQP) methods have proved highly effective for solving constrained optimization problems with smooth nonlinear functions in the objective and constraints. Here we consider problems with general inequality constraints (linear and nonlinear). We assume that first derivatives are available, and that the constraint gradients are sparse.
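The QP subproblem at the heart of an SQP iteration can be sketched on a tiny equality-constrained toy problem (invented here for illustration; SNOPT itself handles general sparse inequality constraints and quasi-Newton Hessian approximations, none of which appear in this sketch):

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

# Toy problem (not from the paper):
#   minimize  x1^2 + x2^2   subject to  c(x) = x1^2 + x2 - 1 = 0
def sqp(x1, x2, lam, iters=20):
    for _ in range(iters):
        g = [2.0 * x1, 2.0 * x2]                # objective gradient
        a = [2.0 * x1, 1.0]                     # constraint Jacobian (one row)
        c = x1 ** 2 + x2 - 1.0                  # constraint value
        h = [[2.0 + 2.0 * lam, 0.0], [0.0, 2.0]]  # Hessian of the Lagrangian
        # Each SQP iteration solves the KKT system of the QP subproblem:
        #   [H  A^T] [p      ]   [-g]
        #   [A   0 ] [lam_new] = [-c]
        p1, p2, lam = solve(
            [[h[0][0], h[0][1], a[0]],
             [h[1][0], h[1][1], a[1]],
             [a[0],    a[1],    0.0]],
            [-g[0], -g[1], -c])
        x1, x2 = x1 + p1, x2 + p2
    return x1, x2, lam

x1, x2, lam = sqp(1.0, 1.0, 0.0)
```

From the start (1, 1) the iterates converge quadratically to the minimizer (sqrt(0.5), 0.5) with multiplier -1.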
An interior point algorithm for large-scale nonlinear programming
 SIAM Journal on Optimization
, 1999
Abstract

Cited by 74 (17 self)
The design and implementation of a new algorithm for solving large nonlinear programming problems is described. It follows a barrier approach that employs sequential quadratic programming and trust regions to solve the subproblems occurring in the iteration. Both primal and primal-dual versions of the algorithm are developed, and their performance is illustrated in a set of numerical tests. Key words: constrained optimization, interior point method, large-scale optimization, nonlinear programming, primal method, primal-dual method, successive quadratic programming, trust region method.
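The barrier approach can be illustrated on a one-variable toy problem (invented for this sketch; it omits the SQP and trust-region machinery that the paper's algorithm applies to the barrier subproblems):

```python
# Primal log-barrier sketch for:  minimize (x - 2)^2  subject to  x >= 0.
def barrier_solve(x=1.0, mu=1.0, shrink=0.1, tol=1e-10):
    while mu > 1e-10:
        # Newton's method on the barrier subproblem
        #   minimize  (x - 2)^2 - mu * ln(x)
        for _ in range(50):
            grad = 2.0 * (x - 2.0) - mu / x
            hess = 2.0 + mu / x ** 2
            step = -grad / hess
            # damp the step so the iterate stays strictly feasible (x > 0)
            while x + step <= 0.0:
                step *= 0.5
            x += step
            if abs(grad) < tol:
                break
        mu *= shrink   # follow the central path as mu -> 0
    return x

x_star = barrier_solve()
```

Each subproblem minimizer sits on the central path at x = 1 + sqrt(1 + mu/2), so as mu shrinks the iterates approach the constrained solution x = 2.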
A survey of constraint handling techniques in evolutionary computation methods
 Proceedings of the 4th Annual Conference on Evolutionary Programming
, 1995
Abstract

Cited by 74 (3 self)
One of the major components of any evolutionary system is the evaluation function. Evaluation functions are used to assign a quality measure to individuals in a population. Whereas evolutionary computation techniques assume the existence of an efficient evaluation function for feasible individuals, there is no uniform methodology for handling (i.e., evaluating) infeasible ones. The simplest approach, incorporated by evolution strategies and a version of evolutionary programming (for numerical optimization problems), is to reject infeasible solutions. But several other methods for handling infeasible individuals have emerged recently. This paper reviews such methods, using a domain of nonlinear programming problems, and discusses their merits and drawbacks.
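The simplest technique the survey mentions, rejecting infeasible solutions outright, can be sketched with a (1+1) evolution strategy on a toy problem invented for illustration:

```python
import random

# "Death penalty" constraint handling in a (1+1) evolution strategy:
# infeasible mutants are rejected, so every accepted iterate satisfies
# the constraint by construction. Toy problem (not from the paper):
#   minimize x1^2 + x2^2   subject to   x1 + x2 >= 1.
random.seed(0)

def feasible(x):
    return x[0] + x[1] >= 1.0

def f(x):
    return x[0] ** 2 + x[1] ** 2

x = [2.0, 2.0]           # feasible starting point
fx = f(x)
for _ in range(2000):
    y = [x[0] + random.gauss(0.0, 0.3), x[1] + random.gauss(0.0, 0.3)]
    if not feasible(y):  # reject infeasible offspring outright
        continue
    if f(y) < fx:        # (1+1) selection: keep the better of parent/child
        x, fx = y, f(y)
```

The constrained optimum is 0.5 at (0.5, 0.5); rejection keeps the search feasible throughout, at the cost of wasting every infeasible sample, which is exactly the drawback the survey weighs against penalty-based alternatives.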
Complete search in continuous global optimization and constraint satisfaction, Acta Numerica 13
, 2004
"... A chapter for ..."
User's Guide for CFSQP Version 2.5: A C Code for Solving (Large Scale) Constrained Nonlinear (Minimax) Optimization Problems, Generating Iterates Satisfying All Inequality Constraints
, 1997
Abstract

Cited by 55 (1 self)
CFSQP is a set of C functions for the minimization of the maximum of a set of smooth objective functions (possibly a single one, or even none at all) subject to general smooth constraints (if there is no objective function, the goal is to simply find a point satisfying the constraints). If the initial guess provided by the user is infeasible for some inequality constraint or some linear equality constraint, CFSQP first generates a feasible point for these constraints; subsequently the successive iterates generated by CFSQP all satisfy these constraints. Nonlinear equality constraints are turned into inequality constraints (to be satisfied by all iterates) and the maximum of the objective functions is replaced by an exact penalty function which penalizes nonlinear equality constraint violations only. When solving problems with many sequentially related constraints (or objectives), such as discretized semi-infinite programming (SIP) problems, CFSQP gives the user the option to use an algo...
A New Nonsmooth Equations Approach To Nonlinear Complementarity Problems
, 1997
Abstract

Cited by 39 (6 self)
Based on Fischer's function, a new nonsmooth equations approach is presented for solving nonlinear complementarity problems. Under some suitable assumptions, a local and Q-quadratic convergence result is established for the generalized Newton method applied to the system of nonsmooth equations, which is a reformulation of nonlinear complementarity problems. To globalize the generalized Newton method, a hybrid method combining the generalized Newton method with the steepest descent method is proposed. Global and Q-quadratic convergence is established for this hybrid method. Some numerical results are also reported.
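Fischer's function and the resulting Newton iteration can be illustrated on a one-dimensional complementarity problem invented for this sketch (the hybrid globalization with steepest descent described above is omitted):

```python
# Fischer's function  phi(a, b) = sqrt(a^2 + b^2) - a - b  vanishes exactly
# when a >= 0, b >= 0 and a * b = 0, so the complementarity problem
#   x >= 0,  F(x) >= 0,  x * F(x) = 0
# can be recast as the (nonsmooth) equation  phi(x, F(x)) = 0.
# Toy instance (not from the paper): F(x) = x - 1, unique solution x = 1.
def F(x):
    return x - 1.0

def dF(x):
    return 1.0

def phi(a, b):
    return (a * a + b * b) ** 0.5 - a - b

def newton_ncp(x, iters=50, tol=1e-12):
    for _ in range(iters):
        r = phi(x, F(x))
        if abs(r) < tol:
            break
        # derivative of phi(x, F(x)); phi is smooth along these iterates
        # because (x, F(x)) never hits the kink at the origin
        nrm = (x * x + F(x) ** 2) ** 0.5
        dr = (x + F(x) * dF(x)) / nrm - 1.0 - dF(x)
        x -= r / dr
    return x

x_sol = newton_ncp(3.0)
```

Starting from x = 3 the iteration converges to x = 1, where x >= 0, F(x) = 0, and complementarity holds.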
On the convergence of a sequential quadratic programming method with an augmented Lagrangian line search function
 Math. Operationsforschung und Statistik, Ser. Optimization
, 1983
Abstract

Cited by 32 (0 self)
Sequential quadratic programming (SQP) methods are widely used for solving practical optimization problems, especially in structural mechanics. The general structure of SQP methods is briefly introduced, and it is shown how these methods can be adapted to distributed computing. However, SQP methods are sensitive to errors in function and gradient evaluations; typically they break down with an error message reporting that the line search cannot be terminated successfully. In these cases, a new nonmonotone line search is activated. In the case of noisy function values, a drastic improvement in performance is achieved compared to the version with a monotone line search. Numerical results are presented for a set of more than 300 standard test examples.
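The nonmonotone acceptance rule can be sketched in isolation: a trial step passes the Armijo test against the maximum of a window of recent function values rather than the latest one, which tolerates noise-induced increases. The gradient-descent setting below is illustrative only, not the SQP implementation the abstract describes:

```python
# Nonmonotone Armijo line search on the Rosenbrock function.
def f(x):
    return (1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2

def grad(x):
    return [-2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0] ** 2),
            200.0 * (x[1] - x[0] ** 2)]

def nonmonotone_descent(x, iters=300, memory=5, c1=1e-4):
    history = [f(x)]
    for _ in range(iters):
        g = grad(x)
        p = [-g[0], -g[1]]                 # steepest-descent direction
        slope = g[0] * p[0] + g[1] * p[1]  # directional derivative (< 0)
        ref = max(history[-memory:])       # nonmonotone reference value
        alpha = 1.0
        # accept if below the WINDOW MAX plus the usual Armijo decrease
        while f([x[0] + alpha * p[0], x[1] + alpha * p[1]]) > ref + c1 * alpha * slope:
            alpha *= 0.5
            if alpha < 1e-16:
                break
        x = [x[0] + alpha * p[0], x[1] + alpha * p[1]]
        history.append(f(x))
    return x, min(history)

x_end, best = nonmonotone_descent([-1.2, 1.0])
```

With memory = 1 this reduces to the ordinary monotone Armijo rule; a larger window lets occasional increases through while still bounding the sequence by its recent maximum.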
A Primal-Dual Interior-Point Method for Nonlinear Programming with Strong Global and Local Convergence Properties
 SIAM Journal on Optimization
, 2002
Abstract

Cited by 30 (5 self)
An exact-penalty-function-based scheme, inspired by an old idea due to Mayne and Polak (Math. Prog., vol. 11, 1976, pp. 67-80), is proposed for extending to general smooth constrained optimization problems any given feasible interior-point method for inequality constrained problems. It is shown that the primal-dual interior-point framework allows for a simpler penalty parameter update rule than that discussed and analyzed by the originators of the scheme in the context of first-order methods of feasible direction. Strong global and local convergence results are proved under mild assumptions. In particular, (i) the proposed algorithm does not suffer a common pitfall recently pointed out by Wächter and Biegler; and (ii) the positive definiteness assumption on the Hessian estimate, made in the original version of the algorithm, is relaxed, allowing for the use of exact Hessian information, resulting in local quadratic convergence. Promising numerical results are reported.
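A minimal primal-dual interior-point iteration can be sketched on a one-variable inequality-constrained toy problem (invented for illustration; the exact-penalty extension to equality constraints described above is omitted):

```python
# Toy problem:  minimize (x - 2)^2  subject to  x >= 0, dual multiplier z >= 0.
# The perturbed KKT conditions are
#   2(x - 2) - z = 0,    x * z = mu,
# solved by Newton's method while driving mu -> 0.
def pd_solve(x=1.0, z=1.0, mu=1.0):
    while mu > 1e-9:
        for _ in range(30):
            r1 = 2.0 * (x - 2.0) - z     # stationarity residual
            r2 = x * z - mu              # perturbed complementarity
            if abs(r1) + abs(r2) < 1e-12:
                break
            # Newton step on J d = -r with J = [[2, -1], [z, x]]
            det = 2.0 * x + z
            dx = -(x * r1 + r2) / det
            dz = (z * r1 - 2.0 * r2) / det
            # fraction-to-boundary rule keeps (x, z) strictly positive
            alpha = 1.0
            if dx < 0.0:
                alpha = min(alpha, 0.95 * x / -dx)
            if dz < 0.0:
                alpha = min(alpha, 0.95 * z / -dz)
            x += alpha * dx
            z += alpha * dz
        mu *= 0.2
    return x, z

x_pd, z_pd = pd_solve()
```

The iterates track the central path (x, z) = (1 + sqrt(1 + mu/2), 2(x - 2)) and converge to the constrained solution x = 2 with z = 0.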
A Hypergraph Framework For Optimal Model-Based Decomposition Of Design Problems
 Computational Optimization and Applications
, 1997
Abstract

Cited by 30 (20 self)
Decomposition of large engineering system models is desirable since increased model size reduces the reliability and speed of numerical solution algorithms. The article presents a methodology for optimal model-based decomposition (OMBD) of design problems, whether or not initially cast as optimization problems. The overall model is represented by a hypergraph and is optimally partitioned into weakly connected subgraphs that satisfy decomposition constraints. Spectral graph-partitioning methods together with iterative improvement techniques are proposed for hypergraph partitioning. A known spectral K-partitioning formulation, which accounts for partition sizes and edge weights, is extended to graphs with vertex weights as well. The OMBD formulation is robust enough to account for computational demands and resources and for the strength of interdependencies between the computational modules contained in the model. KEYWORDS: model decomposition, multidisciplinary design, hypergraph partitioning, larges...
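The iterative-improvement refinement mentioned above can be sketched as a greedy pairwise-swap pass (Kernighan-Lin flavor) on a small weighted graph invented for this example; the spectral partitioning stage is omitted:

```python
# Edges as (u, v, weight); the partition maps vertex -> block (0 or 1).
edges = [(0, 1, 5.0), (2, 3, 5.0), (0, 2, 1.0), (1, 3, 1.0)]

def cut(part):
    """Total weight of edges whose endpoints lie in different blocks."""
    return sum(w for u, v, w in edges if part[u] != part[v])

def improve(part):
    """Repeatedly apply the best cut-reducing swap until none remains."""
    part = dict(part)
    while True:
        best, best_swap = 0.0, None
        base = cut(part)
        for u in part:
            for v in part:
                if part[u] == 0 and part[v] == 1:
                    trial = dict(part)
                    trial[u], trial[v] = 1, 0
                    gain = base - cut(trial)
                    if gain > best:
                        best, best_swap = gain, (u, v)
        if best_swap is None:
            return part
        u, v = best_swap
        part[u], part[v] = 1, 0

start = {0: 0, 2: 0, 1: 1, 3: 1}   # initial bipartition: {0, 2} vs {1, 3}
final = improve(start)
```

The initial bipartition cuts weight 10; one swap moves vertices 0 and 3 across, yielding the balanced partition {2, 3} vs {0, 1} with cut weight 2, after which no swap improves further.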
On Combining Feasibility, Descent and Superlinear Convergence in Inequality Constrained Optimization
 Mathematical Programming
, 1993
Abstract

Cited by 29 (1 self)
Extension of quasi-Newton techniques from unconstrained to constrained optimization via Sequential Quadratic Programming (SQP) presents several difficulties. Among these are the possible inconsistency, away from the solution, of first order approximations to the constraints, resulting in infeasibility of the quadratic programs; and the task of selecting a suitable merit function, to induce global convergence. In the case of inequality constrained optimization, both of these difficulties disappear if the algorithm is forced to generate iterates that all satisfy the constraints, and that yield monotonically decreasing objective function values. (Feasibility of the successive iterates is in fact required in many contexts such as in real-time applications or when the objective function is not well defined outside the feasible set). It has been recently shown that this can be achieved while preserving local two-step superlinear convergence. In this note, the essential ingredients for an S...