Results 1 - 10 of 26
A Globally Convergent Primal-Dual Interior-Point Filter Method for Nonlinear Programming
, 2002
Abstract

Cited by 33 (4 self)
In this paper, the filter technique of Fletcher and Leyffer (1997) is used to globalize the primal-dual interior-point algorithm for nonlinear programming, avoiding the use of merit functions and the updating of penalty parameters. The new algorithm decomposes the primal-dual step obtained from the perturbed first-order necessary conditions into a normal and a tangential step, whose sizes are controlled by a trust-region-type parameter. Each entry in the filter is a pair of coordinates: one resulting from feasibility and centrality, and associated with the normal step; the other resulting from optimality (complementarity and duality), and related to the tangential step. Global convergence to first-order critical points is proved for the new primal-dual interior-point filter algorithm.
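The filter mechanism in the abstract above replaces a merit function with a list of non-dominated pairs. A minimal sketch of the acceptance and insertion rules, assuming generic feasibility/optimality coordinates (theta, phi) and an illustrative envelope margin gamma (the paper's actual coordinates combine feasibility with centrality, and optimality with complementarity and duality):

```python
def acceptable(filter_entries, theta_new, phi_new, gamma=1e-5):
    """Return True if (theta_new, phi_new) is not dominated by any filter entry.

    Against every stored pair (theta, phi), the candidate must improve
    either the feasibility measure theta or the optimality measure phi
    by a small margin gamma (the filter "envelope").
    """
    return all(
        theta_new <= (1.0 - gamma) * theta or phi_new <= phi - gamma * theta
        for (theta, phi) in filter_entries
    )

def add_to_filter(filter_entries, theta_new, phi_new):
    """Insert a pair into the filter, discarding entries it dominates."""
    kept = [(t, p) for (t, p) in filter_entries
            if t < theta_new or p < phi_new]
    kept.append((theta_new, phi_new))
    return kept
```

A point that reduces infeasibility (smaller theta) is acceptable even if the optimality measure worsens, which is exactly what lets the method avoid penalty-parameter updates.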
A new active set algorithm for box constrained optimization
 SIAM Journal on Optimization
, 2006
Abstract

Cited by 26 (6 self)
Abstract. An active set algorithm (ASA) for box constrained optimization is developed. The algorithm consists of a nonmonotone gradient projection step, an unconstrained optimization step, and a set of rules for branching between the two steps. Global convergence to a stationary point is established. For a nondegenerate stationary point, the algorithm eventually reduces to unconstrained optimization without restarts. Similarly, for a degenerate stationary point where the strong second-order sufficient optimality condition holds, the algorithm eventually reduces to unconstrained optimization without restarts. A specific implementation of the ASA is given which exploits the recently developed cyclic Barzilai–Borwein (CBB) algorithm for the gradient projection step and the recently developed conjugate gradient algorithm CG_DESCENT for unconstrained optimization. Numerical experiments are presented using box constrained problems in the CUTEr and MINPACK-2 test problem libraries. Key words. nonmonotone gradient projection, box constrained optimization, active set algorithm
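The gradient projection step that ASA branches to can be sketched for a box constraint; the fixed steplength alpha below is a placeholder for the cyclic Barzilai–Borwein choice used in the actual implementation:

```python
import numpy as np

def project_box(x, lo, hi):
    """Componentwise projection of x onto the box [lo, hi]."""
    return np.minimum(np.maximum(x, lo), hi)

def gradient_projection_step(x, grad, alpha, lo, hi):
    """One projected-gradient step: x+ = P_[lo,hi](x - alpha * grad(x)).

    ASA wraps steps like this in a nonmonotone line search with a
    cyclic Barzilai-Borwein steplength; here alpha is a fixed scalar.
    """
    return project_box(x - alpha * grad(x), lo, hi)
```

Components that hit a bound after the projection are the algorithm's estimate of the active set; once that estimate settles, ASA switches to unconstrained optimization on the free variables.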
On the local behavior of an interior point method for nonlinear programming
 Numerical Analysis 1997
, 1997
Abstract

Cited by 25 (4 self)
Jorge Nocedal
We study the local convergence of a primal-dual interior point method for nonlinear programming. A linearly convergent version of this algorithm has been shown in [2] to be capable of solving large and difficult nonconvex problems. But for the algorithm to reach its full potential, it must converge rapidly to the solution. In this paper we describe how to design the algorithm so that it converges superlinearly on regular problems. Key words: constrained optimization, interior point method, large-scale optimization, nonlinear programming, primal method, primal-dual method, successive quadratic programming.
Feasible Interior Methods Using Slacks for Nonlinear Optimization
 Computational Optimization and Applications
, 2002
Abstract

Cited by 15 (2 self)
A slack-based feasible interior point method is described which can be derived as a modification of infeasible methods. The modification is minor for most line search methods, but trust region methods require special attention. It is shown how the Cauchy point, which is often computed in trust region methods, must be modified so that the feasible method is effective for problems containing both equality and inequality constraints. The relationship between slack-based methods and traditional feasible methods is discussed. Numerical results showing the relative performance of feasible versus infeasible interior point methods are presented.
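The slack reformulation underlying such methods is simple to state: each inequality g(x) >= 0 becomes an equality g(x) - s = 0 with a positive slack variable, and the log-barrier acts on s rather than on g(x) directly. A minimal sketch under these assumptions (function names are illustrative):

```python
import numpy as np

def to_slack_form(g_ineq):
    """Rewrite inequalities g_ineq(x) >= 0 as equalities
    c(x, s) = g_ineq(x) - s = 0 with slack variables s > 0."""
    def residual(x, s):
        return g_ineq(x) - s
    return residual

def barrier_merit(f, x, s, mu):
    """Log-barrier merit used by slack-based interior methods: the
    barrier penalizes the slacks, which the method keeps strictly
    positive, so it stays finite even if g_ineq(x) changes sign."""
    return f(x) - mu * np.sum(np.log(s))
```

Keeping the barrier on s (not on g(x)) is what lets infeasible methods take steps through regions where the original inequalities are violated; a feasible variant additionally maintains g(x) = s > 0.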
A Computational Study of the Homogeneous Algorithm for Large-Scale Convex Optimization
, 1997
Abstract

Cited by 14 (1 self)
Recently the authors have proposed a homogeneous and self-dual algorithm for solving the monotone complementarity problem (MCP) [5]. The algorithm is a single-phase interior-point type method; nevertheless, it yields either an approximate optimal solution or detects a possible infeasibility of the problem. In this paper we specialize the algorithm to the solution of general smooth convex optimization problems that also possess nonlinear inequality constraints and free variables. We discuss an implementation of the algorithm for large-scale sparse convex optimization. Moreover, we present computational results for solving quadratically constrained quadratic programming and geometric programming problems, where some of the problems contain more than 100,000 constraints and variables. The results indicate that the proposed algorithm is also practically efficient.
A feasible BFGS interior point algorithm for solving strongly convex minimization problems
 SIAM J. OPTIM
, 2000
Abstract

Cited by 13 (1 self)
We propose a BFGS primal-dual interior point method for minimizing a convex function on a convex set defined by equality and inequality constraints. The algorithm generates feasible iterates and consists of computing approximate solutions of the optimality conditions perturbed by a sequence of positive parameters µ converging to zero. We prove that it converges q-superlinearly for each fixed µ. We also show that it is globally convergent to the analytic center of the primal-dual optimal set when µ tends to 0 and strict complementarity holds.
Superlinear and Quadratic Convergence of Affine-Scaling Interior-Point Newton Methods for Problems with Simple Bounds without Strict Complementarity Assumption
, 1998
Abstract

Cited by 12 (3 self)
A class of affine-scaling interior-point methods for bound constrained optimization problems is introduced which are locally q-superlinearly or q-quadratically convergent. It is assumed that the strong...
Superlinear Convergence of Primal-Dual Interior Point Algorithms for Nonlinear Programming
, 2000
Abstract

Cited by 11 (3 self)
The local convergence properties of a class of primal-dual interior point methods are analyzed. These methods are designed to minimize a nonlinear, nonconvex objective function subject to linear equality constraints and general inequalities. They involve an inner iteration in which the log-barrier merit function is approximately minimized subject to satisfying the linear equality constraints, and an outer iteration that specifies both the decrease in the barrier parameter and the level of accuracy for the inner minimization. It is shown that, asymptotically, for each value of the barrier parameter, solving a single primal-dual linear system is enough to produce an iterate that already matches the barrier subproblem accuracy requirements. The asymptotic rate of convergence of the resulting algorithm is Q-superlinear and may be chosen arbitrarily close to quadratic. Furthermore, this rate applies componentwise. These results hold in particular for the method described by Conn, Gould, Orb...
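The inner/outer structure described above can be illustrated on a one-dimensional bound-constrained problem. This sketch uses an inner Newton iteration stopped at an accuracy tied to mu and a superlinear barrier-parameter decrease mu <- mu**theta; both rules are assumptions standing in for the paper's precise conditions:

```python
def barrier_path(f1, f2, x0, mu0=0.1, tol=1e-10, theta=1.5):
    """1-D sketch of barrier path-following for min f(x) s.t. x > 0.

    Inner loop: Newton's method on the barrier stationarity condition
    f'(x) - mu/x = 0, solved only to an accuracy proportional to mu.
    Outer loop: shrink mu superlinearly (mu <- mu**theta), mimicking
    the regime in which one Newton step per barrier value eventually
    meets the subproblem accuracy requirement.
    f1, f2 are the first and second derivatives of f.
    """
    x, mu = x0, mu0
    while mu > tol:
        for _ in range(50):
            g = f1(x) - mu / x
            h = f2(x) + mu / x ** 2
            x_new = x - g / h
            if x_new <= 0.0:                  # keep the iterate strictly feasible
                x_new = 0.5 * x
            x = x_new
            if abs(f1(x) - mu / x) <= mu:     # inner accuracy tied to mu
                break
        mu = mu ** theta
    return x
```

For f(x) = (x - 2)^2 the barrier minimizers trace a path x(mu) -> 2 as mu -> 0, and after the first outer iteration each subproblem is solved by a single Newton step.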
A Convergent Infeasible Interior-Point Trust-Region Method For Constrained Minimization
 SIAM Journal on Optimization
, 1999
Abstract

Cited by 11 (0 self)
We study an infeasible interior-point trust-region method for constrained minimization. This method uses a logarithmic-barrier function for the slack variables and updates the slack variables using second-order correction. We show that if a certain set containing the iterates is bounded and the origin is not in the convex hull of the nearly active constraint gradients everywhere on this set, then any cluster point of the iterates is a first-order stationary point. If the cluster point satisfies an additional assumption (which holds when the constraints are linear or when the cluster point satisfies strict complementarity and a local error bound holds), then it is a second-order stationary point. Key words. Nonlinear program, logarithmic-barrier function, interior-point method, trust-region strategy, first- and second-order stationary points, semidefinite programming.
Local Convergence Of A Primal-Dual Method For Degenerate Nonlinear Programming
 MATHEMATICS AND COMPUTER SCIENCE DIVISION, ARGONNE NATIONAL LABORATORY, ARGONNE
, 2000
Abstract

Cited by 10 (4 self)
In recent work, the local convergence behavior of path-following interior-point methods and sequential quadratic programming methods for nonlinear programming has been investigated for the case in which the active constraint gradients may fail to be linearly independent at the solution, but the Mangasarian-Fromovitz constraint qualification is satisfied. In this paper, we describe a stabilization of the primal-dual interior-point approach that ensures rapid local convergence under these conditions without enforcing the usual centrality condition associated with path-following methods. The stabilization takes the form of perturbations to the coefficient matrix in the step equations that vanish as the iterates converge to the solution.
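As a rough illustration of perturbing the coefficient matrix in the step equations, the sketch below adds a regularization block -delta*I that keeps a saddle-point system nonsingular when the constraint gradient rows are linearly dependent. The exact form and update rule of the perturbation in the paper differ, so treat delta and the block placement as illustrative:

```python
import numpy as np

def perturbed_step_matrix(H, A, delta):
    """Assemble an illustrative perturbed step-equation matrix

        [ H        A^T     ]
        [ A    -delta * I  ]

    The -delta*I block regularizes the system when the rows of A
    (active constraint gradients) are linearly dependent; the
    perturbation is driven to zero as the iterates converge.
    """
    m = A.shape[0]
    top = np.hstack([H, A.T])
    bottom = np.hstack([A, -delta * np.eye(m)])
    return np.vstack([top, bottom])
```

With H = I and A having two identical rows, the unperturbed matrix is singular, while any delta > 0 restores nonsingularity without the strict centrality requirement of path-following methods.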