Results 1–10 of 54
Nonmonotone spectral projected gradient methods on convex sets
 SIAM Journal on Optimization
, 2000
Cited by 147 (28 self)
Abstract. Nonmonotone projected gradient techniques are considered for the minimization of differentiable functions on closed convex sets. The classical projected gradient schemes are extended to include a nonmonotone steplength strategy that is based on the Grippo–Lampariello–Lucidi nonmonotone line search. In particular, the nonmonotone strategy is combined with the spectral gradient choice of steplength to accelerate the convergence process. In addition to the classical projected gradient nonlinear path, the feasible spectral projected gradient is used as a search direction to avoid additional trial projections during the one-dimensional search process. Convergence properties and extensive numerical results are presented.
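The nonmonotone SPG iteration described in this abstract can be sketched as follows. This is a minimal illustration assuming a simple box as the convex set (the paper treats general closed convex sets and uses a safeguarded interpolation line search; plain backtracking is used here), with all identifiers illustrative rather than the paper's notation:

```python
import numpy as np

def spg_box(f, grad, x0, lo, hi, max_iter=200, M=10, gamma=1e-4, tol=1e-8):
    """Nonmonotone spectral projected gradient on the box [lo, hi].

    The line search accepts a step once f falls sufficiently below the
    maximum of the last M function values (GLL nonmonotone rule).
    """
    proj = lambda z: np.clip(z, lo, hi)
    x = proj(np.asarray(x0, dtype=float))
    g = grad(x)
    alpha = 1.0
    history = [f(x)]
    for _ in range(max_iter):
        d = proj(x - alpha * g) - x           # feasible spectral direction
        if np.linalg.norm(d, np.inf) < tol:
            break
        f_ref = max(history[-M:])             # nonmonotone reference value
        lam = 1.0
        while lam > 1e-12 and f(x + lam * d) > f_ref + gamma * lam * g.dot(d):
            lam *= 0.5                        # backtracking keeps x + lam*d feasible
        x_new = x + lam * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s.dot(y)
        alpha = s.dot(s) / sy if sy > 0 else 1.0   # spectral (Barzilai-Borwein) step
        alpha = min(max(alpha, 1e-10), 1e10)       # safeguard the steplength
        x, g = x_new, g_new
        history.append(f(x))
    return x
```

Because the search direction is already feasible, the backtracking loop needs no additional projections, which is the point the abstract makes about the "feasible spectral projected gradient" path.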
A Trust Region Framework For Managing The Use Of Approximation Models In Optimization
 STRUCTURAL OPTIMIZATION
, 1998
Cited by 100 (9 self)
This paper presents an analytically robust, globally convergent approach to managing the use of approximation models of various fidelity in optimization. By robust global behavior we mean the mathematical assurance that the iterates produced by the optimization algorithm, started at an arbitrary initial iterate, will converge to a stationary point or local optimizer for the original problem. The approach we present is based on the trust region idea from nonlinear programming and is shown to be provably convergent to a solution of the original high-fidelity problem. The proposed method for managing approximations in engineering optimization suggests ways to decide when the fidelity, and thus the cost, of the approximations might be fruitfully increased or decreased in the course of the optimization iterations. The approach is quite general. We make no assumptions on the structure of the original problem, in particular, no assumptions of convexity and separability, and place only mild ...
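The trust-region machinery the abstract builds on rests on comparing actual and predicted reduction. A minimal sketch of that acceptance/radius test, with illustrative thresholds (the paper's rules for switching model fidelity are more elaborate):

```python
def trust_region_update(f, model, x, step, radius, eta=0.1):
    """One trust-region acceptance test: ratio of actual to predicted reduction.

    A ratio near 1 means the (possibly low-fidelity) model predicted the true
    objective well over this step; thresholds 0.1 and 0.75 are illustrative.
    """
    actual = f(x) - f(x + step)
    predicted = model(x) - model(x + step)
    rho = actual / predicted if predicted > 0 else -1.0
    if rho < eta:
        return x, 0.5 * radius            # reject: model too crude here, shrink region
    if rho > 0.75:
        return x + step, 2.0 * radius     # model trustworthy: accept and expand
    return x + step, radius               # accept, keep the radius
```

In the model-management setting, a persistently poor ratio is also the signal that a higher-fidelity (more expensive) approximation may be warranted.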
Newton's Method For Large Bound-Constrained Optimization Problems
 SIAM JOURNAL ON OPTIMIZATION
, 1998
Cited by 82 (4 self)
We analyze a trust region version of Newton's method for bound-constrained problems. Our approach relies on the geometry of the feasible set, not on the particular representation in terms of constraints. The convergence theory holds for linearly constrained problems, and yields global and superlinear convergence without assuming either strict complementarity or linear independence of the active constraints. We also show that the convergence theory leads to an efficient implementation for large bound-constrained problems.
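A sketch of one projected-Newton-type trial step for bound constraints, in the spirit of the method analyzed here; the active-set identification rule and the final projection are illustrative simplifications of the paper's trust-region machinery:

```python
import numpy as np

def projected_newton_step(x, g, H, lo, hi):
    """One projected Newton trial step for min f(x) subject to lo <= x <= hi.

    Variables at a bound whose gradient pushes outward are frozen; the Newton
    system is solved on the free variables only, and the trial point is
    projected back onto the box.
    """
    binding = ((x <= lo) & (g > 0)) | ((x >= hi) & (g < 0))
    free = ~binding
    d = np.zeros_like(x)
    d[free] = np.linalg.solve(H[np.ix_(free, free)], -g[free])
    return np.clip(x + d, lo, hi)
```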
Large-Scale Active-Set Box-Constrained Optimization Method with Spectral Projected Gradients
 Computational Optimization and Applications
, 2001
Cited by 55 (10 self)
A new active-set method for smooth box-constrained minimization is introduced. The algorithm combines an unconstrained method, including a new line search which aims to add many constraints to the working set at a single iteration, with a recently introduced technique (spectral projected gradient) for dropping constraints from the working set. Global convergence is proved. A computer implementation is fully described and a numerical comparison assesses the reliability of the new algorithm.
Keywords: box-constrained minimization, numerical methods, active-set strategies, spectral projected gradient.
Fast sweeping methods for static Hamilton-Jacobi equations
 Society for Industrial and Applied Mathematics
, 2005
Cited by 43 (4 self)
Abstract. We propose a new sweeping algorithm which discretizes the Legendre transform of the numerical Hamiltonian using an explicit formula. This formula yields the numerical solution at a grid point using only its immediate neighboring grid values and is easy to implement numerically. The minimization that is related to the Legendre transform in our sweeping scheme can be solved either analytically or numerically. We illustrate the efficiency and accuracy of our approach with several numerical examples in two and three dimensions.
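For intuition, a fast sweeping iteration of this kind can be sketched for the simplest static Hamilton-Jacobi equation, the eikonal equation |∇u| = 1, using the standard Godunov upwind update and four alternating Gauss-Seidel orderings (the paper's scheme, built on the Legendre transform of a general Hamiltonian, reduces to an update of this form in the eikonal case):

```python
import numpy as np

def fast_sweep_eikonal(u, h, n_sweeps=4):
    """Fast sweeping for |grad u| = 1 on a 2D grid with spacing h.

    `u` starts as a large value everywhere except at source points (u = 0);
    each of the four orderings propagates information along one family of
    characteristics, and values only ever decrease (causality).
    """
    n, m = u.shape
    orderings = [(range(n), range(m)),
                 (range(n - 1, -1, -1), range(m)),
                 (range(n), range(m - 1, -1, -1)),
                 (range(n - 1, -1, -1), range(m - 1, -1, -1))]
    for _ in range(n_sweeps):
        for rows, cols in orderings:
            for i in rows:
                for j in cols:
                    a = min(u[i - 1, j] if i > 0 else np.inf,
                            u[i + 1, j] if i < n - 1 else np.inf)
                    b = min(u[i, j - 1] if j > 0 else np.inf,
                            u[i, j + 1] if j < m - 1 else np.inf)
                    if abs(a - b) >= h:                 # one-sided (axis) update
                        cand = min(a, b) + h
                    else:                               # two-sided quadratic update
                        cand = (a + b + np.sqrt(2 * h * h - (a - b) ** 2)) / 2
                    u[i, j] = min(u[i, j], cand)
    return u
```

A handful of sweeps suffice because each ordering resolves the characteristics pointing into its quadrant, which is the source of the method's efficiency.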
A Box-Constrained Optimization Algorithm With Negative Curvature Directions and Spectral Projected Gradients
, 2001
Cited by 26 (5 self)
A practical algorithm for box-constrained optimization is introduced. The algorithm combines an active-set strategy with spectral projected gradient iterations. In the interior of each face a strategy that deals efficiently with negative curvature is employed. Global convergence results are given. Numerical results are presented.
Keywords: box-constrained minimization, active-set methods, spectral projected gradients, dogleg path methods. AMS Subject Classification: 49M07, 49M10, 65K, 90C06, 90C20.
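One standard way to exploit negative curvature inside a face, sketched via an eigendecomposition of the Hessian (the paper uses a dogleg-path strategy; this only illustrates the underlying idea):

```python
import numpy as np

def descent_direction(H, g):
    """Pick a search direction inside a face, using negative curvature if any.

    If the smallest eigenvalue of H is negative, return the corresponding
    eigenvector signed so that it is non-ascent; otherwise the Newton step.
    """
    lam, V = np.linalg.eigh(H)            # eigenvalues in ascending order
    if lam[0] < 0:
        d = V[:, 0]                       # direction of most negative curvature
        return -d if g @ d > 0 else d     # ensure g.T d <= 0
    return np.linalg.solve(H, -g)
```

Along such a direction the quadratic model decreases without bound, so the iterate moves until it hits the boundary of the face or the line search stops it.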
A new active set algorithm for box constrained optimization
 SIAM Journal on Optimization
An Adaptive Algorithm for Bound Constrained Quadratic Minimization
, 1997
Cited by 20 (9 self)
A general algorithm for minimizing a quadratic function with bounds on the variables is presented. The new algorithm can use different unconstrained minimization techniques on different faces. At every face, the minimization technique can be chosen according to the structure of the Hessian and the dimension of the face. The strategy for leaving the face is based on a simple scheme that exploits the properties of the "chopped gradient" introduced by Friedlander and Martínez in 1989. This strategy guarantees global convergence even in the presence of dual degeneracy, and finite identification in the nondegenerate case. A slight modification of the algorithm satisfies, in addition, an identification property in the case of dual degeneracy. Numerical experiments combining this new strategy with conjugate gradients, gradient with retards and direct solvers are presented.
Key words: Quadratic programming, conjugate gradients, gradient with retards, active set methods, sparse Cholesky factor...
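As an example of an inner solver such an algorithm can apply within a face, here is the standard conjugate gradient method for an SPD quadratic q(x) = ½xᵀAx − bᵀx; the face and chopped-gradient logic is omitted, and all names are illustrative:

```python
import numpy as np

def cg_quadratic(A, b, x, tol=1e-10, max_iter=None):
    """Conjugate gradient minimization of q(x) = 0.5 x^T A x - b^T x, A SPD.

    Equivalent to solving A x = b; the residual r = b - A x is the negative
    gradient of q.  Terminates in at most n steps in exact arithmetic.
    """
    r = b - A @ x
    if np.sqrt(r @ r) < tol:
        return x
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter or len(b)):
        Ap = A @ p
        alpha = rs / (p @ Ap)             # exact line minimization along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p         # A-conjugate direction update
        rs = rs_new
    return x
```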
Non-Monotone Trust-Region Methods for Bound-Constrained Semismooth Equations with Applications to Nonlinear Mixed Complementarity Problems
, 1999
Cited by 17 (4 self)
We develop and analyze a class of trust-region methods for bound-constrained semismooth systems of equations. The algorithm is based on a simply constrained differentiable minimization reformulation. Our global convergence results are developed in a very general setting that allows for nonmonotonicity of the function values at subsequent iterates. We propose a way of computing trial steps by a semismooth Newton-like method that is augmented by a projection onto the feasible set. Under a Dennis-Moré-type condition we prove that close to a BD-regular solution the trust-region algorithm turns into this projected Newton method, which is shown to converge locally q-superlinearly or quadratically, respectively, depending on the quality of the approximate BD-subdifferentials used. As an important application we discuss in detail how the developed algorithm can be used to solve nonlinear mixed complementarity problems (MCPs). Hereby, the MCP is converted into a bound-constrained semismooth...
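A standard min-based reformulation of a complementarity problem can be sketched with a plain semismooth Newton iteration on Φ(x) = min(x, F(x)) = 0, choosing one element of the B-subdifferential row by row (the paper additionally globalizes such an iteration with a trust region and a projection onto the feasible set, both omitted here):

```python
import numpy as np

def semismooth_newton_ncp(F, JF, x, max_iter=50, tol=1e-10):
    """Semismooth Newton for the NCP reformulation Phi(x) = min(x, F(x)) = 0.

    A generalized Jacobian is assembled row-wise: the identity row where
    x_i < F_i(x) (the min selects x_i), the Jacobian row of F otherwise.
    """
    n = len(x)
    for _ in range(max_iter):
        Fx = F(x)
        phi = np.minimum(x, Fx)
        if np.linalg.norm(phi) < tol:
            break
        J = JF(x)
        B = np.where((x < Fx)[:, None], np.eye(n), J)   # element of the B-subdifferential
        x = x - np.linalg.solve(B, phi)                 # Newton step on Phi
    return x
```

A root of Φ satisfies x ≥ 0, F(x) ≥ 0, and xᵢFᵢ(x) = 0 componentwise, i.e. the nonlinear complementarity conditions.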
Practical active-set Euclidian trust-region method with spectral projected gradients for bound-constrained minimization, Optimization 54
 SIAM Journal on Optimization
, 2005
Cited by 14 (2 self)
A practical active-set method for bound-constrained minimization is introduced. Within the current face the classical Euclidian trust-region method is employed. Spectral projected gradient directions are used to abandon faces. Numerical results are presented.
Key words: Bound-constrained optimization, projected gradient, spectral gradient, trust regions.