Results 1–10 of 28
Sequential Quadratic Programming, 1995
Cited by 145 (4 self)
Abstract: "... this paper we examine the underlying ideas of the SQP method and the theory that establishes it as a framework from which effective algorithms can ..."
Tits, User’s Guide for CFSQP Version 2.5: A C Code for Solving (Large Scale) Constrained Nonlinear (Minimax) Optimization Problems, Generating Iterates Satisfying All Inequality Constraints, 1997
On the convergence of a sequential quadratic programming method with an augmented Lagrangian line search function
Math. Operationsforschung und Statistik, Ser. Optimization, 1983
Cited by 36 (1 self)
Abstract: "Sequential quadratic programming (SQP) methods are widely used for solving practical optimization problems, especially in structural mechanics. The general structure of SQP methods is briefly introduced, and it is shown how these methods can be adapted to distributed computing. However, SQP methods are sensitive to errors in function and gradient evaluations. Typically they break down with an error message reporting that the line search cannot be terminated successfully. In such cases, a new nonmonotone line search is activated. For noisy function values, a drastic improvement in performance is achieved compared to the version with a monotone line search. Numerical results are presented for a set of more than 300 standard test examples."
A nonmonotone line search technique and its application to unconstrained optimization
SIAM J. Optim., 2004
Cited by 35 (2 self)
Abstract: "A new nonmonotone line search algorithm is proposed and analyzed. In our scheme, we require that an average of the successive function values decreases, while the traditional nonmonotone approach of Grippo, Lampariello, and Lucidi [SIAM J. Numer. Anal., 23 (1986), pp. 707–716] requires that a maximum of recent function values decreases. We prove global convergence for nonconvex, smooth functions, and R-linear convergence for strongly convex functions. For the L-BFGS method and the unconstrained optimization problems in the CUTE library, the new nonmonotone line search algorithm used fewer function and gradient evaluations, on average, than either the monotone or the traditional nonmonotone scheme."
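The average-based acceptance rule this abstract describes is easy to state in code. The sketch below is illustrative Python, not the authors' implementation; the helper names, the pairing with steepest descent, and the quadratic test problem are all assumptions. The key difference from monotone Armijo is that the reference value `C` is a running weighted average of past function values rather than `f(x_k)` itself:

```python
import numpy as np

def nonmonotone_armijo(f, x, grad, d, C, delta=1e-4, shrink=0.5):
    """Backtrack until f(x + alpha*d) <= C + delta*alpha*grad.dot(d),
    where C is a running average of function values (not f(x) as in
    the monotone rule), so occasional increases in f are tolerated."""
    alpha, gTd = 1.0, grad.dot(d)
    while f(x + alpha * d) > C + delta * alpha * gTd:
        alpha *= shrink
    return alpha

def minimize(f, grad_f, x0, eta=0.85, tol=1e-8, max_iter=500):
    """Steepest descent with the average-based nonmonotone line search."""
    x = np.asarray(x0, dtype=float)
    C, Q = f(x), 1.0                       # C_0 = f(x_0), Q_0 = 1
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g                             # steepest-descent direction
        alpha = nonmonotone_armijo(f, x, g, d, C)
        x = x + alpha * d
        # update the weighted average of function values
        Q_next = eta * Q + 1.0
        C = (eta * Q * C + f(x)) / Q_next
        Q = Q_next
    return x
```

Setting `eta = 0` recovers the ordinary monotone Armijo rule, since then `C` is always the most recent function value.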
User’s Guide for FFSQP Version 3.7: A Fortran Code for Solving Optimization Programs, Possibly Minimax, with General Inequality Constraints and Linear Equality Constraints, Generating Feasible Iterates
Institute for Systems Research, University of Maryland, Technical Report SRCTR92107r5, 1997
Nonmonotone Line Search for Minimax Problems, 1993
Cited by 23 (2 self)
Abstract: "It was recently shown that, in the solution of smooth constrained optimization problems by sequential quadratic programming (SQP), the Maratos effect can be prevented by means of a certain nonmonotone (more precisely, three-step or four-step monotone) line search. Using a well-known transformation, this scheme can be readily extended to the case of minimax problems. It turns out, however, that due to the structure of these problems one can use a simpler scheme. Such a scheme is proposed and analyzed in this paper. Numerical experiments indicate a significant advantage of the proposed line search over the (monotone) Armijo search. Key words: minimax problems, SQP direction, Maratos effect, superlinear convergence."
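The "well-known transformation" this abstract refers to is the standard epigraph reformulation: the nonsmooth problem min_x max_i f_i(x) becomes the smooth constrained problem min_{x,t} t subject to f_i(x) ≤ t, which an SQP solver can handle directly. A minimal sketch, assuming SciPy is available and using illustrative component functions `f1`, `f2` that are not from the paper:

```python
import numpy as np
from scipy.optimize import minimize

# illustrative min-max problem: minimize max(f1(x), f2(x)) over scalar x
def f1(x): return (x[0] - 1.0)**2
def f2(x): return (x[0] + 1.0)**2

# epigraph form: variables z = (x, t); minimize t subject to f_i(x) <= t,
# written as the SciPy inequality g(z) = t - f_i(x) >= 0
res = minimize(
    lambda z: z[-1],                     # objective: the bound t
    x0=np.array([3.0, 10.0]),
    constraints=[{'type': 'ineq', 'fun': lambda z, f=f: z[-1] - f(z[:-1])}
                 for f in (f1, f2)],
    method='SLSQP')                      # SciPy's SQP implementation
```

For this example the minimax optimum is at x = 0 with value 1, where both constraints are active; active constraints at the solution are typical of minimax problems and are what the paper's specialized line search exploits.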
A New Technique for Inconsistent QP Problems in the SQP Method
University at Darmstadt, Department of Mathematics, preprint 1561, Darmstadt, 1993
Cited by 17 (2 self)
Abstract: "Successful treatment of inconsistent QP problems is of major importance in the SQP method, since such problems occur quite often even for well-behaved nonlinear programming problems. This paper presents a new technique for regularizing inconsistent QP problems, which compromises in its properties between the simple technique of Pantoja and Mayne [34] and the highly successful, but expensive, one of Tone [44]. Global convergence of a corresponding algorithm is shown under reasonably weak conditions. Numerical results are reported which show that this technique, combined with a special method for the case of regular subproblems, is quite competitive with highly regarded established ones. Key words: sequential quadratic programming, SQP method, nonlinear programming. AMS(MOS) subject classification: primary 90C30, secondary 65K05. Notation: superscripts on a vector denote elements of sequences; all vectors are column vectors; for a vector-valued function g, ∇g(x) denotes the transposed Jacobian eval ..."
Nonlinear Equality Constraints in Feasible Sequential Quadratic Programming
Optimization Methods and Software, 1996
Cited by 15 (3 self)
Abstract: "... this paper we investigate incorporating the Mayne and Polak scheme, modified along the lines of this second alternative, into the algorithm of [9]. The balance of this paper is organized as follows. In Section 2 we present the algorithm (a few of the details are deferred to Section 4 in order to avoid any loss of continuity). Section 3 is devoted to establishing convergence. In Section 4 we discuss an implementation and some numerical results. Finally, we offer some concluding remarks in Section 5. 2. Algorithm. Let Ω ..."
SPG: Software for Convex-Constrained Optimization, 2001
Cited by 13 (4 self)
Abstract: "... this paper we describe Fortran 77 software that implements the nonmonotone spectral projected gradient (SPG) algorithm. The SPG method applies to problems of the form min f(x) subject to x ∈ Ω, where Ω is a closed convex set in ℝⁿ. It is assumed that f is defined and has continuous partial derivatives on an open set that contains Ω. Users of the software must supply subroutines to compute the function f(x), the gradient ∇f(x), and projections of an arbitrary point x onto Ω. Information about the Hessian matrix is not required and the storage requirements are minimal. Therefore, the algorithm is appropriate for large-scale convex-constrained optimization problems with affordable projections onto the feasible set. Notice that the algorithm is also suitable for unconstrained optimization problems simply by setting Ω = ℝⁿ."
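The iteration this abstract describes can be sketched compactly. The following is an illustrative Python rendition of a nonmonotone SPG step (a Barzilai–Borwein spectral step combined with a max-of-recent-values Armijo test), not the Fortran 77 code the paper describes; the function names, constants, and box-constrained test problem are assumptions:

```python
import numpy as np

def spg(f, grad_f, project, x0, M=10, gamma=1e-4, tol=1e-6, max_iter=1000):
    """Sketch of the nonmonotone spectral projected gradient method:
    search direction d = P(x - lam*g) - x with a spectral step lam,
    accepted when f decreases relative to the max of the last M values."""
    x = project(np.asarray(x0, dtype=float))
    g = grad_f(x)
    lam = 1.0
    history = [f(x)]
    for _ in range(max_iter):
        d = project(x - lam * g) - x
        if np.linalg.norm(d) < tol:          # projected-gradient stationarity
            break
        alpha, fref, gTd = 1.0, max(history[-M:]), g.dot(d)
        while f(x + alpha * d) > fref + gamma * alpha * gTd:
            alpha *= 0.5                     # backtrack (nonmonotone Armijo)
        x_new = x + alpha * d
        g_new = grad_f(x_new)
        s, y = x_new - x, g_new - g
        sy = s.dot(y)
        lam = s.dot(s) / sy if sy > 0 else 1.0   # Barzilai-Borwein step
        lam = min(max(lam, 1e-10), 1e10)         # safeguard the step length
        x, g = x_new, g_new
        history.append(f(x))
    return x
```

As the abstract notes, only `f`, `grad_f`, and `project` are required; with `project` set to the identity the same code runs unconstrained. For a box Ω, the projection is just `np.clip`.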