Results 1 - 10 of 26
On the Implementation of an Interior-Point Filter Line-Search Algorithm for Large-Scale Nonlinear Programming
, 2004
"... We present a primaldual interiorpoint algorithm with a filter linesearch method for nonlinear programming. Local and global convergence properties of this method were analyzed in previous work. Here we provide a comprehensive description of the algorithm, including the feasibility restoration ph ..."
Abstract

Cited by 283 (6 self)
 Add to MetaCart
(Show Context)
We present a primal-dual interior-point algorithm with a filter line-search method for nonlinear programming. Local and global convergence properties of this method were analyzed in previous work. Here we provide a comprehensive description of the algorithm, including the feasibility restoration phase for the filter method, second-order corrections, and inertia correction of the KKT matrix. Heuristics are also considered that allow faster performance. This method has been implemented in the IPOPT code, which we demonstrate in a detailed numerical study based on 954 problems from the CUTEr test set. An evaluation is made of several line-search options, and a comparison is provided with two state-of-the-art interior-point codes for nonlinear programming.
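The filter line-search idea mentioned in this abstract can be illustrated with a small sketch: a trial point is acceptable only if it is not dominated by any (constraint violation, barrier objective) pair already stored in the filter. The function name, margins, and the exact acceptance test below are illustrative assumptions, not IPOPT's actual implementation.

    # Illustrative sketch of a filter acceptance test (not IPOPT's actual code).
    # A filter is a list of (theta, phi) pairs: constraint violation and barrier
    # objective values of previously visited points.

    def acceptable_to_filter(theta_trial, phi_trial, filter_pairs,
                             gamma_theta=1e-5, gamma_phi=1e-5):
        """Return True if the trial point is not dominated by any filter entry.

        A trial point is rejected if some stored pair (theta, phi) is better,
        within a small margin, in both constraint violation and objective.
        """
        for theta, phi in filter_pairs:
            if (theta_trial >= (1.0 - gamma_theta) * theta and
                    phi_trial >= phi - gamma_phi * theta):
                return False  # dominated by this filter entry
        return True

    # Example: a filter with one entry; the first trial halves the violation.
    filter_pairs = [(1.0, 10.0)]
    print(acceptable_to_filter(0.5, 10.5, filter_pairs))  # True
    print(acceptable_to_filter(1.2, 11.0, filter_pairs))  # False: worse in both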
CUTEr (and SifDec), a constrained and unconstrained testing environment, revisited
 ACM TRANSACTIONS ON MATHEMATICAL SOFTWARE
, 2001
"... The initial release of CUTE, a widely used testing environment for optimization software was described in [2]. The latest version, now known as CUTEr is presented. New features include reorganisation of the environment to allow simultaneous multiplatform installation, new tools for, and interface ..."
Abstract

Cited by 86 (8 self)
 Add to MetaCart
(Show Context)
The initial release of CUTE, a widely used testing environment for optimization software, was described in [2]. The latest version, now known as CUTEr, is presented. New features include reorganisation of the environment to allow simultaneous multi-platform installation, new tools for, and interfaces to, optimization packages, and a considerably simplified and entirely automated installation procedure for UNIX systems. The SIF decoder, which used to be a part of CUTE, has become a separate tool, easily callable by various packages. It features simple extensions to the SIF test problem format and the generation of files suited to automatic differentiation packages.
On Augmented Lagrangian methods with general lower-level constraints
, 2005
"... Augmented Lagrangian methods with general lowerlevel constraints are considered in the present research. These methods are useful when efficient algorithms exist for solving subproblems where the constraints are only of the lowerlevel type. Two methods of this class are introduced and analyzed. In ..."
Abstract

Cited by 80 (7 self)
 Add to MetaCart
(Show Context)
Augmented Lagrangian methods with general lower-level constraints are considered in the present research. These methods are useful when efficient algorithms exist for solving subproblems where the constraints are only of the lower-level type. Two methods of this class are introduced and analyzed. Inexact resolution of the lower-level constrained subproblems is considered. Global convergence is proved using the Constant Positive Linear Dependence constraint qualification. Conditions for boundedness of the penalty parameters are discussed. The reliability of the approach is tested by means of an exhaustive comparison against Lancelot. All the problems of the CUTE collection are used in this comparison. Moreover, the resolution of location problems in which many constraints of the lower-level set are nonlinear is addressed, employing the Spectral Projected Gradient method for solving the subproblems. Problems of this type with more than 3 × 10^6 variables and 14 × 10^6 constraints are solved in this way, using moderate computer time.
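A minimal sketch of the outer loop this abstract describes is given below, assuming a user-supplied solve_subproblem routine that minimizes the augmented Lagrangian subject only to the lower-level constraints (e.g. bounds, handled by a projected-gradient method). The update rules, tolerances, and names here are simplified illustrations, not the algorithm as specified in the paper.

    import numpy as np

    # Illustrative Powell-Hestenes-Rockafellar augmented Lagrangian outer loop.
    # Equality constraints h(x) = 0 are penalized; lower-level constraints are
    # assumed to be enforced inside solve_subproblem.

    def augmented_lagrangian(f, h, solve_subproblem, x0, lam0,
                             rho=10.0, tol=1e-6, max_outer=50):
        x, lam = x0, lam0
        viol_old = np.inf
        for _ in range(max_outer):
            def L_aug(x):
                # Augmented Lagrangian for current multipliers and penalty.
                hx = h(x)
                return f(x) + lam @ hx + 0.5 * rho * (hx @ hx)

            x = solve_subproblem(L_aug, x)      # inexact lower-level solve
            hx = h(x)
            viol = np.linalg.norm(hx, np.inf)
            if viol <= tol:
                break
            lam = lam + rho * hx                # first-order multiplier update
            if viol > 0.5 * viol_old:           # insufficient progress:
                rho *= 10.0                     # increase the penalty parameter
            viol_old = viol
        return x, lam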
On the solution of equality constrained quadratic programming problems arising . . .
, 1998
"... ..."
(Show Context)
Disciplined convex programming
 Global Optimization: From Theory to Implementation, Nonconvex Optimization and Its Applications Series
, 2006
"... ..."
Feasible Interior Methods Using Slacks for Nonlinear Optimization
 Computational Optimization and Applications
, 2002
"... A slackbased feasible interior point method is described which can be derived as a modification of infeasible methods. The modification is minor for most line search methods, but trust region methods require special attention. It is shown how the Cauchy point, which is often computed in trust regio ..."
Abstract

Cited by 20 (2 self)
 Add to MetaCart
(Show Context)
A slack-based feasible interior point method is described which can be derived as a modification of infeasible methods. The modification is minor for most line search methods, but trust region methods require special attention. It is shown how the Cauchy point, which is often computed in trust region methods, must be modified so that the feasible method is effective for problems containing both equality and inequality constraints. The relationship between slack-based methods and traditional feasible methods is discussed. Numerical results showing the relative performance of feasible versus infeasible interior point methods are presented.
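The slack reformulation underlying this abstract, and the kind of slack reset a feasible variant can use, can be sketched as follows; the specific reset rule shown is a simplified illustration and not necessarily the rule analyzed in the paper.

    import numpy as np

    # Sketch of the slack reformulation c(x) >= 0  ->  c(x) - s = 0, s >= 0.
    # A feasible variant may reset the slacks to the actual constraint values
    # wherever those are strictly positive, keeping the iterate feasible.

    def reset_slacks(c_val, s, eps=1e-12):
        """Illustrative reset: where c(x) > 0 set s = c(x); elsewhere keep s."""
        s_new = np.where(c_val > eps, c_val, s)
        return np.maximum(s_new, eps)   # keep slacks strictly positive

    # Example: two inequalities, one strictly satisfied, one violated.
    c_val = np.array([0.3, -0.1])
    s = np.array([0.5, 0.2])
    print(reset_slacks(c_val, s))       # [0.3, 0.2]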
GALAHAD, a library of thread-safe Fortran 90 Packages for Large-Scale Nonlinear Optimization
, 2002
"... In this paper, we describe the design of version 1.0 of GALAHAD, a library of Fortran 90 packages for largescale largescale nonlinear optimization. The library particularly addresses quadratic programming problems, containing both interior point and active set variants, as well as tools for prepro ..."
Abstract

Cited by 20 (4 self)
 Add to MetaCart
In this paper, we describe the design of version 1.0 of GALAHAD, a library of Fortran 90 packages for large-scale nonlinear optimization. The library particularly addresses quadratic programming problems, containing both interior point and active set variants, as well as tools for preprocessing such problems prior to solution. It also contains an updated version of the venerable nonlinear programming package, LANCELOT.
Superlinear Convergence of Primal-Dual Interior Point Algorithms for Nonlinear Programming
, 2000
"... The local convergence properties of a class of primaldual interior point methods are analyzed. These methods are designed to minimize a nonlinear, nonconvex, objective function subject to linear equality constraints and general inequalities. They involve an inner iteration in which the logbarrier ..."
Abstract

Cited by 19 (4 self)
 Add to MetaCart
The local convergence properties of a class of primal-dual interior point methods are analyzed. These methods are designed to minimize a nonlinear, nonconvex, objective function subject to linear equality constraints and general inequalities. They involve an inner iteration in which the log-barrier merit function is approximately minimized subject to satisfying the linear equality constraints, and an outer iteration that specifies both the decrease in the barrier parameter and the level of accuracy for the inner minimization. It is shown that, asymptotically, for each value of the barrier parameter, solving a single primal-dual linear system is enough to produce an iterate that already matches the barrier subproblem accuracy requirements. The asymptotic rate of convergence of the resulting algorithm is Q-superlinear and may be chosen arbitrarily close to quadratic. Furthermore, this rate applies componentwise. These results hold in particular for the method described by Conn, Gould, Orb...
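Schematically, the inner iteration described here approximately solves a log-barrier subproblem of the form below (written in generic notation, not the paper's exact formulation), while the outer iteration reduces the barrier parameter and tightens the inner tolerance.

    \min_{x}\; \phi_{\mu_k}(x) \;=\; f(x) \;-\; \mu_k \sum_{i} \log c_i(x)
    \quad \text{subject to} \quad A x = b,

    \text{outer loop: choose } \mu_{k+1} < \mu_k \text{ and } \epsilon_{k+1} < \epsilon_k,
    \text{ stopping the inner iteration once its KKT residual is below } \epsilon_k.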
A Second Derivative SQP Method: Local Convergence and Practical Issues
 SIAM Journal on Optimization
"... results for a secondderivative SQP method for minimizing the exact ℓ1merit function for a fixed value of the penalty parameter. To establish this result, we used the properties of the socalled Cauchy step, which was itself computed from the socalled predictor step. In addition, we allowed for th ..."
Abstract

Cited by 17 (6 self)
 Add to MetaCart
(Show Context)
results for a second-derivative SQP method for minimizing the exact ℓ1-merit function for a fixed value of the penalty parameter. To establish this result, we used the properties of the so-called Cauchy step, which was itself computed from the so-called predictor step. In addition, we allowed for the computation of a variety of (optional) SQP steps that were intended to improve the efficiency of the algorithm. Although we established global convergence of the algorithm, we did not discuss certain aspects that are critical when developing software capable of solving general optimization problems. In particular, we must have strategies for updating the penalty parameter and better techniques for defining the positive-definite matrix Bk used in computing the predictor step. In this paper we address both of these issues. We consider two techniques for defining the positive-definite matrix Bk: a simple diagonal approximation and a more sophisticated limited-memory BFGS update. We also analyze a strategy for updating the penalty parameter based on approximately minimizing the ℓ1-penalty function over a sequence of increasing values of the penalty parameter. Algorithms based on exact penalty functions have certain desirable properties. To be practical, however, these algorithms must be guaranteed to avoid the so-called Maratos effect. We show that a nonmonotone variant of our algorithm avoids this phenomenon and, therefore, results in asymptotically superlinear local convergence; this is verified by preliminary numerical results on the Hock and Schittkowski test set. Key words: nonlinear programming, nonlinear inequality constraints, sequential quadratic programming, ℓ1-penalty function, nonsmooth optimization. AMS subject classifications: 49J52, 49M37, 65F22, 65K05, 90C26, 90C30, 90C55.
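As a rough illustration of one of the two choices for Bk mentioned in this abstract, the sketch below applies a standard BFGS update with a simple curvature-based skipping rule to preserve positive definiteness; the limited-memory and diagonal variants actually studied in the paper involve additional details not shown here.

    import numpy as np

    # Illustrative BFGS update for a dense positive-definite approximation B_k,
    # with the usual skipping rule (not the paper's exact recipe).

    def bfgs_update(B, s, y, skip_tol=1e-8):
        """Update B with step s = x_{k+1} - x_k and gradient change y.

        The update is skipped when s'y is too small, which keeps B positive
        definite at the cost of reusing the previous approximation.
        """
        sy = s @ y
        if sy <= skip_tol * np.linalg.norm(s) * np.linalg.norm(y):
            return B                      # skip: insufficient curvature
        Bs = B @ s
        return (B
                - np.outer(Bs, Bs) / (s @ Bs)
                + np.outer(y, y) / sy)

    # Example: start from the identity and apply one (s, y) pair.
    B = np.eye(2)
    s = np.array([1.0, 0.0])
    y = np.array([2.0, 0.5])
    print(bfgs_update(B, s, y))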
Improving ultimate convergence of an Augmented Lagrangian method
, 2007
"... Optimization methods that employ the classical PowellHestenesRockafellar Augmented Lagrangian are useful tools for solving Nonlinear Programming problems. Their reputation decreased in the last ten years due to the comparative success of InteriorPoint Newtonian algorithms, which are asymptoticall ..."
Abstract

Cited by 14 (0 self)
 Add to MetaCart
Optimization methods that employ the classical Powell-Hestenes-Rockafellar Augmented Lagrangian are useful tools for solving Nonlinear Programming problems. Their reputation decreased in the last ten years due to the comparative success of Interior-Point Newtonian algorithms, which are asymptotically faster. In the present research a combination of both approaches is evaluated. The idea is to produce a competitive method, being more robust and efficient than its “pure” counterparts for critical problems. Moreover, an additional hybrid algorithm is defined, in which the Interior Point method is replaced by the Newtonian resolution of a KKT system identified by the Augmented Lagrangian algorithm. The software used in this work is freely available through the Tango Project web page:
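The "Newtonian resolution of a KKT system" used by the hybrid algorithm in this abstract can be sketched, for the equality-constrained case, as a plain Newton iteration on the first-order optimality conditions; the callables and variable names below are illustrative assumptions, not the paper's implementation.

    import numpy as np

    # Sketch of Newton's method applied to the KKT conditions of
    #   min f(x)  subject to  h(x) = 0,
    # started from a point (x, lam) produced by an augmented Lagrangian phase.
    # grad_f, hess_L, h, and jac_h are assumed user-supplied callables.

    def newton_kkt(grad_f, hess_L, h, jac_h, x, lam, tol=1e-10, max_iter=20):
        for _ in range(max_iter):
            J = jac_h(x)
            r = np.concatenate([grad_f(x) + J.T @ lam, h(x)])   # KKT residual
            if np.linalg.norm(r, np.inf) <= tol:
                break
            n, m = x.size, lam.size
            K = np.block([[hess_L(x, lam), J.T],
                          [J, np.zeros((m, m))]])                # KKT matrix
            step = np.linalg.solve(K, -r)
            x, lam = x + step[:n], lam + step[n:]
        return x, lam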