On the implementation of an interior-point filter line-search algorithm for large-scale nonlinear programming
Mathematical Programming, 2006
"... We present a primaldual interiorpoint algorithm with a filter linesearch method for nonlinear programming. Local and global convergence properties of this method were analyzed in previous work. Here we provide a comprehensive description of the algorithm, including the feasibility restoration pha ..."
Abstract

Cited by 144 (5 self)
 Add to MetaCart
(Show Context)
We present a primal-dual interior-point algorithm with a filter line-search method for nonlinear programming. Local and global convergence properties of this method were analyzed in previous work. Here we provide a comprehensive description of the algorithm, including the feasibility restoration phase for the filter method, second-order corrections, and inertia correction of the KKT matrix. Heuristics are also considered that allow faster performance. This method has been implemented in the IPOPT code, which we demonstrate in a detailed numerical study based on 954 problems from the CUTEr test set. An evaluation is made of several line-search options, and a comparison is provided with two state-of-the-art interior-point codes for nonlinear programming.
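The paper's filter line-search algorithm is elaborate; as a rough illustration of the barrier idea it builds on, here is a minimal sketch (not the paper's algorithm) that minimizes a one-variable function subject to x >= 0 by applying Newton's method to the log-barrier function for a decreasing sequence of barrier parameters mu. The fraction-to-boundary rule that keeps iterates strictly interior is one genuine interior-point ingredient; all function names are illustrative.

```python
def barrier_newton(fp, fpp, x, mu, tau=0.995, iters=50):
    """Newton's method on the log-barrier function phi(x) = f(x) - mu*log(x)
    for the toy problem min f(x) s.t. x >= 0 (all names illustrative)."""
    for _ in range(iters):
        g = fp(x) - mu / x          # gradient of the barrier function
        h = fpp(x) + mu / x**2      # Hessian of the barrier function
        dx = -g / h
        # Fraction-to-boundary rule: never close more than a fraction tau
        # of the distance to the boundary x = 0, so iterates stay interior.
        alpha = 1.0
        if dx < 0:
            alpha = min(1.0, -tau * x / dx)
        x += alpha * dx
    return x

# Toy problem: min (x + 1)^2 s.t. x >= 0, whose solution is x = 0.
fp  = lambda x: 2.0 * (x + 1.0)   # f'(x)
fpp = lambda x: 2.0               # f''(x)

x = 1.0
for mu in [1.0, 1e-1, 1e-2, 1e-4, 1e-6]:
    x = barrier_newton(fp, fpp, x, mu)   # warm-start at the previous solution
print(x)   # tends to 0 as mu -> 0
```

Each barrier subproblem is warm-started at the previous solution, mimicking the continuation over mu that interior-point codes perform.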
Sequential Quadratic Programming
1995
"... this paper we examine the underlying ideas of the SQP method and the theory that establishes it as a framework from which effective algorithms can ..."
Abstract

Cited by 121 (3 self)
 Add to MetaCart
In this paper we examine the underlying ideas of the SQP method and the theory that establishes it as a framework from which effective algorithms can ...
An interior point algorithm for large-scale nonlinear programming
SIAM Journal on Optimization, 1999
"... The design and implementation of a new algorithm for solving large nonlinear programming problems is described. It follows a barrier approach that employs sequential quadratic programming and trust regions to solve the subproblems occurring in the iteration. Both primal and primaldual versions of t ..."
Abstract

Cited by 78 (18 self)
 Add to MetaCart
The design and implementation of a new algorithm for solving large nonlinear programming problems is described. It follows a barrier approach that employs sequential quadratic programming and trust regions to solve the subproblems occurring in the iteration. Both primal and primal-dual versions of the algorithm are developed, and their performance is illustrated in a set of numerical tests. Key words: constrained optimization, interior point method, large-scale optimization, nonlinear programming, primal method, primal-dual method, successive quadratic programming, trust region method.
Algorithms For Complementarity Problems And Generalized Equations
1995
"... Recent improvements in the capabilities of complementarity solvers have led to an increased interest in using the complementarity problem framework to address practical problems arising in mathematical programming, economics, engineering, and the sciences. As a result, increasingly more difficult pr ..."
Abstract

Cited by 45 (5 self)
 Add to MetaCart
Recent improvements in the capabilities of complementarity solvers have led to an increased interest in using the complementarity problem framework to address practical problems arising in mathematical programming, economics, engineering, and the sciences. As a result, increasingly more difficult problems are being proposed that exceed the capabilities of even the best algorithms currently available. There is, therefore, an immediate need to improve the capabilities of complementarity solvers. This thesis addresses this need in two significant ways. First, the thesis proposes and develops a proximal perturbation strategy that enhances the robustness of Newton-based complementarity solvers. This strategy enables algorithms to reliably find solutions even for problems whose natural merit functions have strict local minima that are not solutions. Based upon this strategy, three new algorithms are proposed for solving nonlinear mixed complementarity problems that represent a significant improvement in robustness over previous algorithms. These algorithms have local Q-quadratic convergence behavior, yet depend only on a pseudomonotonicity assumption to achieve global convergence from arbitrary starting points. Using the MCPLIB and GAMSLIB test libraries, we perform extensive computational tests that demonstrate the effectiveness of these algorithms on realistic problems. Second, the thesis extends some previously existing algorithms to solve more general problem classes. Specifically, the NE/SQP method of Pang & Gabriel (1993), the semismooth equations approach of De Luca, Facchinei & Kanz...
A reduced Hessian method for large-scale constrained optimization
SIAM Journal on Optimization, 1995
"... ..."
Trust Region Algorithms For Constrained Optimization
Math. Prog., 1990
"... We review the main techniques used in trust region algorithms for nonlinear constrained optimization. 1. Trust Region Idea Constrained optimization is to minimize a function subject to finitely many algebraic equation and inequality conditions. It has the following form min x2! n f(x) (1.1) subj ..."
Abstract

Cited by 25 (6 self)
 Add to MetaCart
We review the main techniques used in trust region algorithms for nonlinear constrained optimization. 1. Trust Region Idea. Constrained optimization is to minimize a function subject to finitely many algebraic equation and inequality conditions. It has the following form:

    min_{x in R^n} f(x)                                (1.1)
    subject to c_i(x) =  0,  i = 1, 2, ..., m_e,       (1.2)
               c_i(x) >= 0,  i = m_e + 1, ..., m,      (1.3)

where f(x) and c_i(x) (i = 1, ..., m) are real functions defined on R^n, and m >= m_e are two nonnegative integers. Numerical methods for nonlinear optimization problems can be grouped into two types: line search methods and trust region algorithms. Line search algorithms at each iteration use a direction to carry out a line search. This direction, called the search direction, is normally computed by solving a subproblem that approximates the original problem near the current iterate. A line search means to search for a new point along the search direction. For example, ...
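The line-search idea contrasted with trust regions above can be sketched in a few lines. Below is a minimal backtracking (Armijo) line search: a generic illustration, not code from the paper, with all names made up. Starting from a unit step, alpha is halved until the sufficient-decrease condition holds.

```python
def armijo_line_search(f, grad, x, d, c=1e-4, rho=0.5, max_backtracks=50):
    """Backtracking line search: shrink alpha until the Armijo
    sufficient-decrease test f(x + alpha*d) <= f(x) + c*alpha*(g . d) holds."""
    fx = f(x)
    slope = sum(gi * di for gi, di in zip(grad(x), d))   # directional derivative
    assert slope < 0, "d must be a descent direction"
    alpha = 1.0
    for _ in range(max_backtracks):
        x_new = [xi + alpha * di for xi, di in zip(x, d)]
        if f(x_new) <= fx + c * alpha * slope:
            return alpha, x_new
        alpha *= rho
    return alpha, x_new

# Toy use: f(x) = x1^2 + x2^2, steepest-descent direction from (1, 1).
f = lambda x: x[0]**2 + x[1]**2
grad = lambda x: [2.0 * x[0], 2.0 * x[1]]
x0 = [1.0, 1.0]
d = [-g for g in grad(x0)]                    # d = (-2, -2)
alpha, x1 = armijo_line_search(f, grad, x0, d)
print(alpha, x1)   # 0.5 [0.0, 0.0]
```

The full step alpha = 1 overshoots the minimizer here, so one halving is taken; trust-region methods would instead constrain the subproblem itself.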
Complementarity Problems in GAMS and the PATH Solver
Journal of Economic Dynamics and Control, 1998
"... A fundamental mathematical problem is to find a solution to a square system of nonlinear equations. There are many methods to approach this problem, the most famous of which is Newton's method. In this paper, we describe a generalization of this problem, the complementarity problem. We show how ..."
Abstract

Cited by 22 (6 self)
 Add to MetaCart
(Show Context)
A fundamental mathematical problem is to find a solution to a square system of nonlinear equations. There are many methods to approach this problem, the most famous of which is Newton's method. In this paper, we describe a generalization of this problem, the complementarity problem. We show how such problems are modeled within the GAMS modeling language and provide details about the PATH solver, a generalization of Newton's method, for finding a solution. While the modeling format is applicable in many disciplines, we draw the examples in this paper from an economic background. Finally, some extensions of the modeling format and the solver are described. Keywords: Complementarity problems, variational inequalities, algorithms AMS Classification: 90C33,65K10 This paper is an extended version of a talk presented at CEFES '98 (Computation in Economics, Finance and Engineering: Economic Systems) in Cambridge, England in July 1998 This material is based on research supported by Nationa...
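The complementarity problem described here asks for z with z >= 0, F(z) >= 0, and z_i F_i(z) = 0 componentwise. One standard way to check these conditions as a residual (using the Fischer-Burmeister function, a different reformulation from PATH's normal map, shown purely for illustration) is:

```python
import math

def fischer_burmeister(a, b):
    """phi(a, b) = sqrt(a^2 + b^2) - a - b: zero exactly when
    a >= 0, b >= 0 and a*b = 0 (the componentwise conditions)."""
    return math.hypot(a, b) - a - b

def ncp_residual(z, F):
    """Max-norm residual of the complementarity problem
    z >= 0, F(z) >= 0, z_i * F_i(z) = 0 for all i."""
    return max(abs(fischer_burmeister(zi, fi)) for zi, fi in zip(z, F(z)))

# Toy problem with F(z) = (z1 - 1, z2 + 1); its solution is z = (1, 0):
# the first component has F = 0 with z > 0, the second z = 0 with F > 0.
F = lambda z: [z[0] - 1.0, z[1] + 1.0]
r_sol = ncp_residual([1.0, 0.0], F)
r_off = ncp_residual([0.5, 0.5], F)
print(r_sol, r_off)   # 0.0 at the solution, positive away from it
```

Setting F(z) = f'(z) recovers the first-order conditions of min f(z) s.t. z >= 0, which is how complementarity generalizes square nonlinear systems.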
A Pathsearch Damped Newton Method for Computing General Equilibria
Annals of Operations Research, 1994
"... Computable general equilibrium models and other types of variational inequalities play a key role in computational economics. This paper describes the design and implementation of a pathsearchdamped Newton method for solving such problems. Our algorithm improves on the typical Newton method (which ..."
Abstract

Cited by 20 (10 self)
 Add to MetaCart
Computable general equilibrium models and other types of variational inequalities play a key role in computational economics. This paper describes the design and implementation of a pathsearch-damped Newton method for solving such problems. Our algorithm improves on the typical Newton method (which generates and solves a sequence of LCPs) in both speed and robustness. The underlying complementarity problem is reformulated as a normal map so that standard algorithmic enhancements of Newton's method for solving nonlinear equations can be easily applied. The solver is implemented as a GAMS subsystem, using an interface library developed for this purpose. Computational results obtained from a number of test problems arising in economics are given.
The Semismooth Algorithm for Large Scale Complementarity Problems
1999
"... Complementarity solvers are continually being challenged by modelers demanding improved reliability and scalability. Building upon a strong theoretical background, the semismooth algorithm has the potential to meet both of these requirements. We briefly discuss relevant theory associated with th ..."
Abstract

Cited by 18 (6 self)
 Add to MetaCart
Complementarity solvers are continually being challenged by modelers demanding improved reliability and scalability. Building upon a strong theoretical background, the semismooth algorithm has the potential to meet both of these requirements. We briefly discuss relevant theory associated with the algorithm and describe a sophisticated implementation in detail. Particular emphasis is given to robust methods for dealing with singularities in the linear system and to large-scale issues. Results on the MCPLIB test suite indicate that the code is robust and has the potential to solve very large problems.
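As a toy illustration of the semismooth idea (a heavily simplified sketch, not the implementation this paper describes), a scalar complementarity problem can be solved by applying Newton's method to the Fischer-Burmeister reformulation Phi(z) = sqrt(z^2 + F(z)^2) - z - F(z), which vanishes exactly at solutions and is smooth everywhere except the origin:

```python
import math

def semismooth_newton(F, Fp, z, tol=1e-12, max_iter=50):
    """Newton's method on Phi(z) = sqrt(z^2 + F(z)^2) - z - F(z),
    the Fischer-Burmeister reformulation of the scalar problem
    z >= 0, F(z) >= 0, z*F(z) = 0."""
    for _ in range(max_iter):
        a, b = z, F(z)
        r = math.hypot(a, b)
        phi = r - a - b
        if abs(phi) < tol:
            break
        if r > 1e-14:
            dphi = (a + b * Fp(z)) / r - 1.0 - Fp(z)   # smooth point
        else:
            dphi = -1.0 - Fp(z)   # at the kink, pick one generalized derivative
        z -= phi / dphi
    return z

# Toy problem F(z) = z - 2: the solution is z = 2 (F = 0, z > 0).
z_star = semismooth_newton(lambda z: z - 2.0, lambda z: 1.0, 1.0)
print(z_star)   # ~2.0
```

The branch on r is where semismoothness enters: at the kink of the Fischer-Burmeister function one selects an element of the generalized Jacobian instead of a classical derivative.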
Nonmonotone Line Search for Minimax Problems
1993
"... . It was recently shown that, in the solution of smooth constrained optimization problems by sequential quadratic programming (SQP), the Maratos effect can be prevented by means of a certain nonmonotone (more precisely, threestep or fourstep monotone) line search. Using a well known transformation ..."
Abstract

Cited by 12 (2 self)
 Add to MetaCart
It was recently shown that, in the solution of smooth constrained optimization problems by sequential quadratic programming (SQP), the Maratos effect can be prevented by means of a certain nonmonotone (more precisely, three-step or four-step monotone) line search. Using a well known transformation, this scheme can be readily extended to the case of minimax problems. It turns out however that, due to the structure of these problems, one can use a simpler scheme. Such a scheme is proposed and analyzed in this paper. Numerical experiments indicate a significant advantage of the proposed line search over the (monotone) Armijo search. Key words: Minimax problems, SQP direction, Maratos effect, Superlinear convergence. 1. Introduction. Consider the "m...
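The contrast with the monotone Armijo rule can be shown with a small acceptance test. This follows the generic nonmonotone pattern of comparing against the maximum of the last few objective values (in the spirit of Grippo-Lampariello-Lucidi), not necessarily this paper's exact three-step or four-step scheme; all numbers are made up for illustration.

```python
def nonmonotone_accept(f_new, f_history, slope, alpha, c=1e-4, memory=4):
    """Nonmonotone Armijo test: require sufficient decrease relative to the
    MAX of the last `memory` objective values, not the current value alone."""
    f_ref = max(f_history[-memory:])
    return f_new <= f_ref + c * alpha * slope

# A trial point that slightly increases f over the current value but stays
# well below the recent maximum: rejected by monotone Armijo, accepted by
# the nonmonotone test.  Accepting such steps is what defeats the Maratos effect.
history = [10.0, 7.0, 5.0, 4.0]   # recent objective values; current f = 4.0
slope = -1.0                      # directional derivative g . d < 0
alpha = 1.0
f_trial = 4.5

monotone_ok = f_trial <= history[-1] + 1e-4 * alpha * slope
nonmonotone_ok = nonmonotone_accept(f_trial, history, slope, alpha)
print(monotone_ok, nonmonotone_ok)   # False True
```

With memory = 1 the test reduces to the ordinary monotone Armijo condition, so the monotone rule is the special case of this scheme.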