Results 1–10 of 54
Large-Scale Active-Set Box-Constrained Optimization Method with Spectral Projected Gradients
Computational Optimization and Applications, 2001
"... A new activeset method for smooth boxconstrained minimization is introduced. The algorithm combines an unconstrained method, including a new linesearch which aims to add many constraints to the working set at a single iteration, with a recently introduced technique (spectral projected gradien ..."
Abstract

Cited by 59 (9 self)
 Add to MetaCart
A new active-set method for smooth box-constrained minimization is introduced. The algorithm combines an unconstrained method, including a new line search which aims to add many constraints to the working set at a single iteration, with a recently introduced technique (spectral projected gradient) for dropping constraints from the working set. Global convergence is proved. A computer implementation is fully described and a numerical comparison assesses the reliability of the new algorithm. Keywords: box-constrained minimization, numerical methods, active-set strategies, spectral projected gradient.
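To make the key ingredient concrete, here is a minimal Python sketch of a spectral projected gradient direction: a Barzilai-Borwein steplength combined with projection onto the box. The names and simplified safeguards are illustrative, not the paper's implementation.

```python
import numpy as np

def project_box(x, lo, hi):
    """Project x onto the box {lo <= x <= hi}."""
    return np.clip(x, lo, hi)

def spg_direction(x, g, x_prev, g_prev, lo, hi, lam_min=1e-10, lam_max=1e10):
    """One spectral projected gradient direction (illustrative sketch).

    The spectral (Barzilai-Borwein) steplength lam = s's / s'y acts as a
    scalar inverse-Hessian approximation along the previous step; the
    projected point then yields a feasible descent direction.
    """
    s, y = x - x_prev, g - g_prev
    sty = s @ y
    lam = np.clip((s @ s) / sty, lam_min, lam_max) if sty > 0 else lam_max
    return project_box(x - lam * g, lo, hi) - x
```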
An Algorithm for Nonlinear Optimization Using Linear Programming and Equality Constrained Subproblems
2003
"... This paper describes an activeset algorithm for largescale nonlinear programming based on the successive linear programming method proposed by Fletcher and Sainz de la Maza [10]. The step computation is performed in two stages. In the first stage a linear program is solved to estimate the activ ..."
Abstract

Cited by 41 (12 self)
 Add to MetaCart
This paper describes an active-set algorithm for large-scale nonlinear programming based on the successive linear programming method proposed by Fletcher and Sainz de la Maza [10]. The step computation is performed in two stages. In the first stage a linear program is solved to estimate the active set at the solution. The linear program is obtained by making a linear approximation to the ℓ1 penalty function inside a trust region. In the second stage, an equality constrained quadratic program (EQP) is solved involving only those constraints that are active at the solution of the linear program.
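As a rough illustration of the two-stage step, the sketch below solves an elastic-variable LP for the active-set estimate and then a KKT system for the EQP, using dense NumPy/SciPy. The trust-region management and penalty update of the actual algorithm are omitted, and all names are illustrative.

```python
import numpy as np
from scipy.optimize import linprog

def slp_eqp_step(g, A, c, H, delta, nu=10.0, tol=1e-8):
    """One SLP-EQP step (sketch of the two-stage idea).

    Stage 1: solve the elastic LP
        min_d  g'd + nu * sum(r+ + r-)
        s.t.   A d + c = r+ - r-,  r+, r- >= 0,  |d|_inf <= delta
    and read the active set off the LP solution.
    Stage 2: solve the EQP (a KKT linear system) on that estimate.
    g, c are gradient and constraint values, A the constraint Jacobian,
    H a Lagrangian Hessian approximation; all illustrative.
    """
    m, n = A.shape
    # LP variables: [d, r_plus, r_minus]
    cost = np.concatenate([g, nu * np.ones(2 * m)])
    A_eq = np.hstack([A, -np.eye(m), np.eye(m)])
    bounds = [(-delta, delta)] * n + [(0, None)] * (2 * m)
    lp = linprog(cost, A_eq=A_eq, b_eq=-c, bounds=bounds, method="highs")
    d_lp = lp.x[:n]
    # Constraints satisfied by the LP step form the active-set estimate.
    active = np.abs(A @ d_lp + c) <= tol
    Aa, ca = A[active], c[active]
    # EQP: min 0.5 d'Hd + g'd  s.t.  Aa d + ca = 0.
    k = Aa.shape[0]
    K = np.block([[H, Aa.T], [Aa, np.zeros((k, k))]])
    rhs = -np.concatenate([g, ca])
    return np.linalg.solve(K, rhs)[:n]
```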
PETSc Users Manual
Tech. Rep. ANL-95/11, Revision 2.1.5, Argonne National Laboratory, 2004
"... This work was supported by the Mathematical, Information, and Computational Sciences ..."
Abstract

Cited by 26 (8 self)
 Add to MetaCart
This work was supported by the Mathematical, Information, and Computational Sciences...
A new active set algorithm for box constrained optimization
SIAM Journal on Optimization, 2006
"... Abstract. An active set algorithm (ASA) for box constrained optimization is developed. The algorithm consists of a nonmonotone gradient projection step, an unconstrained optimization step, and a set of rules for branching between the two steps. Global convergence to a stationary point is established ..."
Abstract

Cited by 26 (6 self)
 Add to MetaCart
An active set algorithm (ASA) for box constrained optimization is developed. The algorithm consists of a nonmonotone gradient projection step, an unconstrained optimization step, and a set of rules for branching between the two steps. Global convergence to a stationary point is established. For a nondegenerate stationary point, the algorithm eventually reduces to unconstrained optimization without restarts. Similarly, for a degenerate stationary point where the strong second-order sufficient optimality condition holds, the algorithm eventually reduces to unconstrained optimization without restarts. A specific implementation of the ASA is given which exploits the recently developed cyclic Barzilai–Borwein (CBB) algorithm for the gradient projection step and the recently developed conjugate gradient algorithm CG_DESCENT for unconstrained optimization. Numerical experiments are presented using box constrained problems in the CUTEr and MINPACK-2 test problem libraries. Key words: nonmonotone gradient projection, box constrained optimization, active set algorithm, ...
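A minimal sketch of the gradient projection phase with a cyclic BB steplength, assuming a simple box and omitting the nonmonotone line search and the branching rules into the unconstrained phase:

```python
import numpy as np

def cbb_projected_gradient(grad, x, lo, hi, cycle=4, max_iter=100, tol=1e-6):
    """Gradient projection with a cyclic Barzilai-Borwein steplength.

    Illustrative sketch: the CBB rule reuses one BB steplength for
    `cycle` consecutive iterations before refreshing it. The ASA's
    nonmonotone line search and its switch to the unconstrained
    CG_DESCENT phase are omitted.
    """
    g, lam = grad(x), 1.0
    for k in range(max_iter):
        x_new = np.clip(x - lam * g, lo, hi)      # projected trial point
        if np.linalg.norm(x_new - x) <= tol:      # stationarity measure
            break
        g_new = grad(x_new)
        if k % cycle == 0:                        # refresh the BB steplength
            s, y = x_new - x, g_new - g
            sty = s @ y
            lam = (s @ s) / sty if sty > 0 else 1.0
        x, g = x_new, g_new
    return x
```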
Optimality measures for performance profiles
Preprint ANL/MCS-P1155-0504, Mathematics and Computer Science Division, Argonne National Laboratory, 2004
"... We examine the influence of optimality measures on the benchmarking process, and show that scaling requirements lead to a convergence test for nonlinearly constrained solvers that uses a mixture of absolute and relative error measures. We show that this convergence test is well behaved at any point ..."
Abstract

Cited by 15 (0 self)
 Add to MetaCart
We examine the influence of optimality measures on the benchmarking process, and show that scaling requirements lead to a convergence test for nonlinearly constrained solvers that uses a mixture of absolute and relative error measures. We show that this convergence test is well behaved at any point where the constraints satisfy the Mangasarian–Fromovitz constraint qualification and also avoids the explicit use of a complementarity measure. Our computational experiments explore the impact of this convergence test on the benchmarking process with performance profiles.
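A schematic version of such a mixed test, with illustrative names and a scaling that only approximates the paper's:

```python
import numpy as np

def converged(grad_lag, c_viol, grad0, c0, tau=1e-6):
    """Mixed absolute/relative convergence test (illustrative sketch).

    Stationarity and feasibility are each measured against an absolute
    tolerance plus a relative term scaled by the corresponding quantity
    at the starting point (grad0, c0); the paper's precise scaling and
    its treatment of multipliers differ.
    """
    stat_ok = np.linalg.norm(grad_lag, np.inf) <= tau * (1.0 + np.linalg.norm(grad0, np.inf))
    feas_ok = np.linalg.norm(c_viol, np.inf) <= tau * (1.0 + np.linalg.norm(c0, np.inf))
    return stat_ok and feas_ok
```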
Non-Monotone Trust-Region Methods for Bound-Constrained Semismooth Equations with Applications to Nonlinear Mixed Complementarity Problems
1999
"... We develop and analyze a class of trustregion methods for boundconstrained semismooth systems of equations. The algorithm is based on a simply constrained differentiable minimization reformulation. Our global convergence results are developed in a very general setting that allows for nonmonotoni ..."
Abstract

Cited by 14 (4 self)
 Add to MetaCart
We develop and analyze a class of trust-region methods for bound-constrained semismooth systems of equations. The algorithm is based on a simply constrained differentiable minimization reformulation. Our global convergence results are developed in a very general setting that allows for nonmonotonicity of the function values at subsequent iterates. We propose a way of computing trial steps by a semismooth Newton-like method that is augmented by a projection onto the feasible set. Under a Dennis–Moré-type condition we prove that close to a BD-regular solution the trust-region algorithm turns into this projected Newton method, which is shown to converge locally q-superlinearly or quadratically, respectively, depending on the quality of the approximate BD-subdifferentials used. As an important application we discuss in detail how the developed algorithm can be used to solve nonlinear mixed complementarity problems (MCPs). Hereby, the MCP is converted into a bound-constrained semismooth...
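The projected Newton-like trial step can be sketched as follows; the trust-region globalization and the BD-subdifferential machinery are omitted, and the names are illustrative:

```python
import numpy as np

def projected_newton_step(x, F, J, lo, hi):
    """One projected Newton-like trial step (illustrative sketch).

    J(x) stands in for an element of the (approximate) B-subdifferential
    of the semismooth function F; the Newton point is projected back
    onto the box. The nonmonotone trust-region test that decides whether
    to accept the step is omitted.
    """
    d = np.linalg.solve(J(x), -F(x))   # semismooth Newton direction
    return np.clip(x + d, lo, hi)      # projection onto [lo, hi]
```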
Mesh Shape-Quality Optimization Using the Inverse Mean-Ratio Metric
Preprint ANL/MCS-P1136-0304, Argonne National Laboratory, Argonne, 2004
"... Meshes containing elements with bad quality can result in poorly conditioned systems of equations that must be solved when using a discretization method, such as the finiteelement method, for solving a partial differential equation. Moreover, such meshes can lead to poor accuracy in the approximate ..."
Abstract

Cited by 14 (4 self)
 Add to MetaCart
Meshes containing elements with bad quality can result in poorly conditioned systems of equations that must be solved when using a discretization method, such as the finite-element method, for solving a partial differential equation. Moreover, such meshes can lead to poor accuracy in the approximate solution computed. In this paper, we present a nonlinear fractional program that relocates the vertices of a given mesh to optimize the average element shape quality as measured by the inverse mean-ratio metric. To solve the resulting large-scale optimization problems, we apply an efficient implementation of an inexact Newton algorithm using the conjugate gradient method with a block Jacobi preconditioner to compute the direction. We show that the block Jacobi preconditioner is positive definite by proving a general theorem concerning the convexity of fractional functions, applying this result to components of the inverse mean-ratio metric, and showing that each block in the preconditioner is invertible. Numerical results obtained with this special-purpose code on several test meshes are presented and used to quantify the impact on solution time and memory requirements of using a modeling language and general-purpose algorithm to solve these problems.
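A hedged sketch of the inner solve, using SciPy's CG with a block Jacobi preconditioner built from dense diagonal blocks (in the paper the blocks group each vertex's coordinates; names here are illustrative):

```python
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

def block_jacobi_newton_direction(H, g, block_size):
    """Inexact Newton direction: CG on H d = -g with block Jacobi
    preconditioning (illustrative sketch, dense H for simplicity).

    Each diagonal block of H is inverted once and the block inverses
    are applied as the preconditioner; invertibility of the blocks is
    what the paper's convexity theorem guarantees.
    """
    n = H.shape[0]
    inv_blocks = [np.linalg.inv(H[i:i + block_size, i:i + block_size])
                  for i in range(0, n, block_size)]

    def apply_prec(r):
        z = np.empty_like(r)
        for k, i in enumerate(range(0, n, block_size)):
            z[i:i + block_size] = inv_blocks[k] @ r[i:i + block_size]
        return z

    M = LinearOperator((n, n), matvec=apply_prec)
    d, _ = cg(H, -g, M=M)              # inexact inner solve
    return d
```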
GALAHAD, a Library of Thread-Safe Fortran 90 Packages for Large-Scale Nonlinear Optimization
2002
"... In this paper, we describe the design of version 1.0 of GALAHAD, a library of Fortran 90 packages for largescale largescale nonlinear optimization. The library particularly addresses quadratic programming problems, containing both interior point and active set variants, as well as tools for prepro ..."
Abstract

Cited by 12 (2 self)
 Add to MetaCart
In this paper, we describe the design of version 1.0 of GALAHAD, a library of Fortran 90 packages for large-scale nonlinear optimization. The library particularly addresses quadratic programming problems, containing both interior-point and active-set variants, as well as tools for preprocessing such problems prior to solution. It also contains an updated version of the venerable nonlinear programming package, LANCELOT.
Superlinear and Quadratic Convergence of Affine-Scaling Interior-Point Newton Methods for Problems with Simple Bounds without Strict Complementarity Assumption
1998
"... A class of affinescaling interiorpoint methods for bound constrained optimization problems is introduced which are locally qsuperlinear or qquadratic convergent. It is assumed that the strong... ..."
Abstract

Cited by 12 (3 self)
 Add to MetaCart
A class of affine-scaling interior-point methods for bound constrained optimization problems is introduced which are locally q-superlinearly or q-quadratically convergent. It is assumed that the strong...
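A heavily hedged sketch of one common affine-scaling construction (a Coleman-Li-style diagonal scaling); the paper's exact scaling and local analysis are not reproduced here:

```python
import numpy as np

def affine_scaling_direction(x, g, H, lo, hi):
    """Affine-scaling interior-point Newton direction (rough sketch).

    d holds a Coleman-Li-style distance to the bound that each gradient
    component pushes toward, so steps are damped near active bounds.
    The scaled system below is one common variant, shown only for
    illustration.
    """
    d = np.where(g < 0, hi - x, x - lo)        # distance to the relevant bound
    D = np.diag(d)
    M = D @ H + np.diag(np.abs(g))             # scaled Newton matrix
    return np.linalg.solve(M, -D @ g)
```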
Enriched Methods for Large-Scale Unconstrained Optimization
Computational Optimization and Applications, 2000
"... This paper describes a class of optimization methods that interlace iterations of the limited memory BFGS method (LBFGS) and a Hessianfree Newton method (HFN) in such a way that the information collected by one type of iteration improves the performance of the other. Curvature information about ..."
Abstract

Cited by 10 (0 self)
 Add to MetaCart
This paper describes a class of optimization methods that interlace iterations of the limited memory BFGS method (L-BFGS) and a Hessian-free Newton method (HFN) in such a way that the information collected by one type of iteration improves the performance of the other. Curvature information about the objective function is stored in the form of a limited memory matrix, and plays the dual role of preconditioning the inner conjugate gradient iteration in the HFN method and of providing an initial matrix for L-BFGS iterations. The lengths of the L-BFGS and HFN cycles are adjusted dynamically during the course of the optimization. Numerical experiments indicate that the new algorithms are both effective and not sensitive to the choice of parameters. Key words: limited memory method, Hessian-free Newton method, truncated Newton method, L-BFGS, conjugate gradient method, quasi-Newton preconditioning. Departamento de Matemáticas, Instituto Tecnológico Autónomo de México, Río Hon...
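The interlacing idea can be sketched as follows: the same (s, y) curvature pairs feed an L-BFGS two-loop recursion that serves both as a direction generator and as the preconditioner for the truncated CG inside the Hessian-free Newton step. This is a simplified illustration (fixed cycle lengths, finite-difference Hessian-vector products), not the paper's code:

```python
import numpy as np

def two_loop(g, S, Y):
    """L-BFGS two-loop recursion: apply the limited-memory inverse
    Hessian built from step/gradient-difference pairs (S, Y) to g."""
    q, alphas = g.copy(), []
    for s, y in zip(reversed(S), reversed(Y)):
        a = (s @ q) / (y @ s)
        alphas.append(a)
        q -= a * y
    gamma = (S[-1] @ Y[-1]) / (Y[-1] @ Y[-1]) if S else 1.0
    r = gamma * q
    for (s, y), a in zip(zip(S, Y), reversed(alphas)):
        r += (a - (y @ r) / (y @ s)) * s
    return r

def hfn_direction(x, g, grad, S, Y, cg_iters=10, eps=1e-7):
    """Hessian-free Newton direction: truncated CG on H d = -g, with
    Hessian-vector products by finite differences of the gradient and
    the L-BFGS matrix as preconditioner. Dynamic cycle-length
    adjustment is omitted from this sketch."""
    hv = lambda v: (grad(x + eps * v) - g) / eps   # finite-difference H @ v
    d, r = np.zeros_like(g), -g.copy()
    z = two_loop(r, S, Y)                          # preconditioned residual
    p, rz = z.copy(), r @ z
    for _ in range(cg_iters):
        Hp = hv(p)
        if p @ Hp <= 0:                            # negative curvature: stop
            break
        a = rz / (p @ Hp)
        d += a * p
        r -= a * Hp
        if np.linalg.norm(r) < 1e-8:
            break
        z = two_loop(r, S, Y)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return d if d.any() else z                     # fall back to scaled steepest descent
```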