Results 1–10 of 13
On the implementation of an interior-point filter line-search algorithm for large-scale nonlinear programming
Mathematical Programming, 2006
Abstract

Cited by 109 (5 self)
We present a primal-dual interior-point algorithm with a filter line-search method for nonlinear programming. Local and global convergence properties of this method were analyzed in previous work. Here we provide a comprehensive description of the algorithm, including the feasibility restoration phase for the filter method, second-order corrections, and inertia correction of the KKT matrix. Heuristics are also considered that allow faster performance. This method has been implemented in the IPOPT code, which we demonstrate in a detailed numerical study based on 954 problems from the CUTEr test set. An evaluation is made of several line-search options, and a comparison is provided with two state-of-the-art interior-point codes for nonlinear programming.
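The filter mechanism this abstract refers to can be illustrated with a small sketch: a filter stores (constraint violation, objective) pairs, and a trial point is accepted only if it sufficiently improves at least one of the two measures relative to every stored pair. The function below is a hypothetical simplification for illustration only; names, the margin parameter, and the acceptance rule are assumptions, not IPOPT's actual test, which additionally involves sufficient-decrease and switching conditions.

```python
def acceptable(theta_trial, f_trial, filter_pairs, gamma=1e-5):
    """True if the trial pair is not dominated by any filter entry."""
    for theta, f in filter_pairs:
        # dominated: insufficient decrease in both the constraint
        # violation theta and the objective f relative to this entry
        if theta_trial >= (1.0 - gamma) * theta and f_trial >= f - gamma * theta:
            return False
    return True

filt = [(1.0, 10.0), (0.5, 12.0)]
print(acceptable(0.4, 11.0, filt))   # True: violation improves on every entry
print(acceptable(1.2, 10.5, filt))   # False: dominated by the entry (1.0, 10.0)
```

In a backtracking line search, the step length is reduced until `acceptable` returns True (or the feasibility restoration phase is triggered).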
On Augmented Lagrangian methods with general lower-level constraints
, 2005
Abstract

Cited by 59 (7 self)
Augmented Lagrangian methods with general lower-level constraints are considered in the present research. These methods are useful when efficient algorithms exist for solving subproblems where the constraints are only of the lower-level type. Two methods of this class are introduced and analyzed. Inexact resolution of the lower-level constrained subproblems is considered. Global convergence is proved using the Constant Positive Linear Dependence constraint qualification. Conditions for boundedness of the penalty parameters are discussed. The reliability of the approach is tested by means of an exhaustive comparison against Lancelot. All the problems of the CUTE collection are used in this comparison. Moreover, the resolution of location problems in which many constraints of the lower-level set are nonlinear is addressed, employing the Spectral Projected Gradient method for solving the subproblems. Problems of this type with more than 3 × 10^6 variables and 14 × 10^6 constraints are solved in this way, using moderate computer time.
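The outer loop of an augmented Lagrangian method, as summarized above, alternates between approximately minimizing the augmented Lagrangian and updating the multipliers and the penalty parameter. The toy sketch below shows this loop for a single equality constraint; the problem, the crude inner solver, and all tolerances are illustrative assumptions and do not reproduce the paper's algorithms, which handle general lower-level constraints and inexact subproblem solves.

```python
def f(x):
    """Toy objective: f(x) = x1^2 + x2^2."""
    return x[0] ** 2 + x[1] ** 2

def c(x):
    """Single equality constraint: c(x) = x1 + x2 - 1 = 0."""
    return x[0] + x[1] - 1.0

def grad_aug_lag(x, lam, rho):
    # gradient of the augmented Lagrangian f(x) + lam*c(x) + (rho/2)*c(x)^2
    cc = c(x)
    return [2.0 * x[0] + lam + rho * cc, 2.0 * x[1] + lam + rho * cc]

def solve_subproblem(x, lam, rho, iters=200):
    # crude gradient descent on the augmented Lagrangian; this step size is
    # safe for this quadratic (Hessian eigenvalues are 2 and 2 + 2*rho)
    step = 1.0 / (2.0 + 2.0 * rho)
    for _ in range(iters):
        g = grad_aug_lag(x, lam, rho)
        x = [x[0] - step * g[0], x[1] - step * g[1]]
    return x

x, lam, rho = [0.0, 0.0], 0.0, 10.0
for _ in range(10):                # outer iterations
    x = solve_subproblem(x, lam, rho)
    lam += rho * c(x)              # first-order multiplier update
    if abs(c(x)) > 1e-8:
        rho *= 2.0                 # increase the penalty while still infeasible

print(x)   # approaches the solution (0.5, 0.5)
```

The multiplier update and conditional penalty increase are the two ingredients whose boundedness and convergence properties the abstract discusses.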
An interior algorithm for nonlinear optimization that combines line search and trust region steps
Mathematical Programming 107, 2006
Abstract

Cited by 31 (11 self)
An interior-point method for nonlinear programming is presented. It enjoys the flexibility of switching between a line search method that computes steps by factoring the primal-dual equations and a trust region method that uses a conjugate gradient iteration. Steps computed by direct factorization are always tried first, but if they are deemed ineffective, a trust region iteration that guarantees progress toward stationarity is invoked. To demonstrate its effectiveness, the algorithm is implemented in the Knitro [6, 28] software package and is extensively tested on a wide selection of test problems.
GALAHAD, a library of thread-safe Fortran 90 Packages for Large-Scale Nonlinear Optimization
, 2002
Abstract

Cited by 12 (2 self)
In this paper, we describe the design of version 1.0 of GALAHAD, a library of Fortran 90 packages for large-scale nonlinear optimization. The library particularly addresses quadratic programming problems, containing both interior-point and active-set variants, as well as tools for preprocessing such problems prior to solution. It also contains an updated version of the venerable nonlinear programming package, LANCELOT.
Componentwise Fast Convergence in the Solution of Full-Rank Systems of Nonlinear Equations
, 2000
Abstract

Cited by 5 (1 self)
The asymptotic convergence of parameterized variants of Newton's method for the solution of nonlinear systems of equations is considered. The original system is perturbed by a term involving the variables and a scalar parameter which is driven to zero as the iteration proceeds. The exact local solutions to the perturbed systems then form a differentiable path leading to a solution of the original system, the scalar parameter determining the progress along the path. A homotopy-type algorithm, which involves an inner iteration in which the perturbed systems are approximately solved, is outlined. It is shown that asymptotically, a single linear system is solved per update of the scalar parameter. It turns out that a componentwise Q-superlinear rate may be attained under standard assumptions, and that this rate may be made arbitrarily close to quadratic. Numerical experiments illustrate the results and we discuss the relationships that this method shares with interior methods in constrained...
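The path-following scheme described above can be caricatured on a scalar system: solve the perturbed equation F(x) − mu = 0 with a single Newton step per value of mu, then drive mu to zero superlinearly. Everything here (the test function, the starting point, the mu schedule) is an illustrative assumption, not the paper's algorithm for general full-rank systems.

```python
import math

def F(x):
    """Toy system: find x with F(x) = 0, i.e. x = sqrt(2)."""
    return x * x - 2.0

def dF(x):
    return 2.0 * x

x, mu = 2.0, 0.5
while mu > 1e-14:
    # a single Newton step on the perturbed system F(x) - mu = 0
    x = x - (F(x) - mu) / dF(x)
    # drive the perturbation parameter to zero superlinearly
    mu = mu ** 1.5

print(abs(x - math.sqrt(2.0)))   # tiny: the iterates track the path to the root
```

Because mu shrinks superlinearly while each Newton step converges quadratically to the current path point, one linear solve per mu update suffices asymptotically, mirroring the abstract's claim.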
On second-order optimality conditions for nonlinear programming
 Optimization
Abstract

Cited by 4 (0 self)
A new second-order condition is given, which depends on a weak constant rank constraint requirement. We show that practical and publicly available algorithms (www.ime.usp.br/~egbirgin/tango) of Augmented Lagrangian type converge, after slight modifications, to stationary points defined by the new condition.
Some Reflections on the Current State of Active-Set and Interior-Point Methods for Constrained Optimization
, 2003
Abstract

Cited by 3 (1 self)
We reflect on the current state of active-set and interior-point methods for convex and nonconvex constrained optimization. We voice some concerns about current SQP methods.
Numerical Methods for Large-Scale Non-Convex Quadratic Programming
, 2001
Abstract

Cited by 2 (0 self)
We consider numerical methods for finding (weak) second-order critical points for large-scale nonconvex quadratic programming problems. We describe two new methods. The first is of the active-set variety. Although convergent from any starting point, it is intended primarily for the case where a good estimate of the optimal active set can be predicted. The second is of an interior-point trust-region type, and has proved capable of solving problems involving up to half a million unknowns and constraints. The solution of a key equality constrained subproblem, common to both methods, is described. The results of comparative tests on a large set of convex and nonconvex quadratic programming examples are given.
RAL-TR-2002-001: Preprocessing for quadratic programming
 Math. Programming
Abstract
Techniques for the preprocessing of (not necessarily convex) quadratic programs are discussed. Most of the procedures extend known ones from the linear to quadratic cases, but a few new preprocessing techniques are introduced. The implementation aspects are also discussed. Numerical results are finally presented to indicate the potential of the resulting code, both for linear and quadratic problems. The impact of insisting that bounds of the variables in the reduced problem be as tight as possible rather than allowing some slack in these bounds is also shown to be numerically significant.
Global Convergence of Primal-Dual Methods for Nonlinear Programming
, 2008
Abstract
We propose a new globalization strategy for primal-dual interior-point methods in nonlinear programming that relaxes the requirement of closely following the central path and lends itself to dynamic updates of the barrier parameter. The latter promote better synchronization between the barrier parameter and the optimality residual, and increase robustness. Global convergence is proved under mild assumptions. We show that the unit Newton step is asymptotically accepted and that linear or superlinear convergence occurs when the barrier parameter goes to zero linearly or superlinearly. Numerical experiments illustrate our results.
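A dynamic barrier update of the kind alluded to above might tie mu to the current KKT residual rather than to a fixed schedule, so the barrier parameter and the optimality residual shrink at matching rates. The rule and constants below are purely hypothetical placeholders for illustration, not the paper's update.

```python
def update_mu(mu, kkt_residual, theta=0.1, kappa=0.8):
    # shrink mu, but keep it proportional to the optimality residual so the
    # barrier parameter does not race ahead of (or lag behind) the iterates
    return min(kappa * mu, theta * kkt_residual)

mu = 1.0
for r in [5.0, 1.0, 0.3, 0.05, 0.004]:   # mock residuals from an outer loop
    mu = update_mu(mu, r)

print(mu)   # mu has tracked the shrinking residual down to about 4e-4
```

When the residual converges superlinearly, a residual-proportional mu inherits that rate, which is the mechanism behind the superlinear convergence claim in the abstract.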