Results 1 – 9 of 9
Large-Scale Active-Set Box-Constrained Optimization Method with Spectral Projected Gradients
 Computational Optimization and Applications
, 2001
Abstract

Cited by 59 (9 self)
A new active-set method for smooth box-constrained minimization is introduced. The algorithm combines an unconstrained method, including a new line search which aims to add many constraints to the working set at a single iteration, with a recently introduced technique (spectral projected gradient) for dropping constraints from the working set. Global convergence is proved. A computer implementation is fully described and a numerical comparison assesses the reliability of the new algorithm. Keywords: Box-constrained minimization, numerical methods, active-set strategies, Spectral Projected Gradient.
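The spectral projected gradient building block that this method uses for dropping constraints can be sketched in a few lines. The helper name `spg_step`, the quadratic test objective, the box bounds, and the spectral step value are all illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def spg_step(x, g, lo, hi, alpha):
    """One spectral projected gradient trial point on the box [lo, hi].

    x, g  : current iterate and gradient (hypothetical smooth objective)
    alpha : spectral (Barzilai-Borwein) step length
    """
    # Take the gradient step, then project back onto the box constraints.
    return np.clip(x - alpha * g, lo, hi)

# Example on f(x) = 0.5 * ||x||^2 (gradient g = x), box [0, 1]^2.
x = np.array([0.9, 0.2])
g = x.copy()
trial = spg_step(x, g, lo=np.zeros(2), hi=np.ones(2), alpha=2.0)
# Unconstrained step would give [-0.9, -0.2]; projection clips both to 0.
```

A single projection like this is what lets one iteration leave several faces of the box at once, which is the constraint-dropping role the abstract describes.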
Gradient Method With Retards And Generalizations
 SIAM Journal on Numerical Analysis
, 1999
Abstract

Cited by 29 (5 self)
A generalization of the steepest descent and other methods for solving a large scale symmetric positive definite system Ax = b is presented. Given a positive integer m, the new iteration is given by x^{k+1} = x^k − α(x^{ν(k)})(Ax^k − b), where α(x^{ν(k)}) is the steepest descent step at a previous iteration ν(k) ∈ {k, k−1, . . . , max{0, k−m}}. The global convergence to the solution of the p...
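A runnable sketch of this retarded iteration follows. The rule used here for picking ν(k), uniformly at random within the allowed window, and the small diagonal test system are illustrative assumptions:

```python
import numpy as np

def gradient_with_retards(A, b, x0, m=3, iters=200, seed=0):
    """Sketch of the gradient method with retards for an SPD system Ax = b.

    Each step reuses the exact steepest-descent step length computed at an
    earlier iterate nu(k) in {k, k-1, ..., max(0, k-m)}; choosing nu(k)
    uniformly at random in that window is an illustrative choice.
    """
    rng = np.random.default_rng(seed)
    x = x0.astype(float)
    steps = []                              # history of steepest-descent steps
    for k in range(iters):
        r = A @ x - b                       # gradient of 0.5 x^T A x - b^T x
        if not r.any():                     # already at the exact solution
            break
        steps.append((r @ r) / (r @ (A @ r)))
        nu = rng.integers(max(0, k - m), k + 1)  # retarded index nu(k)
        x = x - steps[nu] * r
    return x

A = np.diag([1.0, 2.0])
b = np.array([1.0, 1.0])
x = gradient_with_retards(A, b, np.zeros(2))
# x approaches the solution [1.0, 0.5] despite nonmonotone residuals.
```

The point of the retard is that reused step lengths break the slow zig-zag of classical steepest descent, while the abstract's cited result guarantees convergence for SPD systems.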
A Box-Constrained Optimization Algorithm With Negative Curvature Directions and Spectral Projected Gradients
, 2001
Abstract

Cited by 28 (5 self)
A practical algorithm for box-constrained optimization is introduced. The algorithm combines an active-set strategy with spectral projected gradient iterations. In the interior of each face a strategy that deals efficiently with negative curvature is employed. Global convergence results are given. Numerical results are presented. Keywords: box-constrained minimization, active-set methods, spectral projected gradients, dogleg path methods. AMS Subject Classification: 49M07, 49M10, 65K, 90C06, 90C20.
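The negative-curvature ingredient can be illustrated with a small sketch. The helper `negative_curvature_direction` is hypothetical (the paper's within-face strategy is not reproduced here); it just shows what such a direction is and why it is useful:

```python
import numpy as np

def negative_curvature_direction(H):
    """Return a direction of most negative curvature of H, or None.

    Along a direction d with d^T H d < 0, the quadratic model decreases
    without bound, so inside a face the algorithm can move along d until
    a boundary of the box is reached (illustrative helper only).
    """
    w, V = np.linalg.eigh(H)        # eigenvalues in ascending order
    return V[:, 0] if w[0] < 0 else None

H = np.array([[2.0, 0.0], [0.0, -1.0]])  # indefinite Hessian of a face
d = negative_curvature_direction(H)
# d is (up to sign) the eigenvector [0, 1], with d^T H d = -1 < 0.
```

An eigen-decomposition is the simplest way to exhibit such a direction; large-scale codes would detect negative curvature more cheaply, e.g. inside a conjugate-gradient loop.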
On the Barzilai-Borwein method
, 2001
Abstract

Cited by 21 (1 self)
A review is given of the underlying theory and recent developments in regard to the Barzilai-Borwein steepest descent method for large scale unconstrained optimization. One aim is to assess why the method seems to be comparable in practical efficiency to conjugate gradient methods. The importance of using a nonmonotone line search is stressed, and some suggestions are made as to why the modification proposed by Raydan [22] often does not perform well for an ill-conditioned problem. Extensions for box constraints are discussed. A number of interesting open questions are put forward. Keywords: Barzilai-Borwein method, steepest descent, elliptic systems, unconstrained optimization.
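For concreteness, the two classical Barzilai-Borwein step lengths can be computed from one pair of differences; the function name and the small quadratic example are illustrative:

```python
import numpy as np

def bb_step(s, y):
    """Barzilai-Borwein step lengths from s = x_k - x_{k-1}, y = g_k - g_{k-1}.

    Either value can drive the iteration x_{k+1} = x_k - alpha * g_k;
    the nonmonotone line search discussed in the review is omitted here.
    """
    bb1 = (s @ s) / (s @ y)   # "long" BB step
    bb2 = (s @ y) / (y @ y)   # "short" BB step
    return bb1, bb2

# For a quadratic with Hessian A we have y = A s, so each step is the
# reciprocal of a Rayleigh quotient of A, hence between 1/lambda_max
# and 1/lambda_min.
A = np.diag([1.0, 4.0])
s = np.array([1.0, 1.0])
y = A @ s
bb1, bb2 = bb_step(s, y)
# bb1 = 2/5 = 0.4, bb2 = 5/17 ≈ 0.294
```

Because these steps are not monotone-decrease steps, pairing them with a nonmonotone line search, as the review stresses, is what makes them globally usable.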
Nonmonotone Strategy for Minimization of Quadratics With Simple Constraints
, 1999
Abstract

Cited by 9 (2 self)
An algorithm for quadratic minimization with simple bounds is introduced, combining, as many well-known methods do, active-set strategies and projection steps. The novelty is that here the criterion for acceptance of a projected trial point is weaker than the usual ones, which are based on monotone decrease of the objective function. It is proved that convergence follows as in the monotone case. Numerical experiments with bound-constrained quadratic problems from the CUTE collection show that the modified method is slightly more efficient, in practice, than its monotone counterpart and performs better than the well-known code LANCELOT on this class of problems. Key words: quadratic programming, conjugate gradients, active-set methods. AMS subject classifications: 65K10, 49M07, 65F15, 90C20. 1 Introduction The problem of minimizing a quadratic function f subject to bounds on the variables has many practical applications. Many times, physical and ... Institute of Mathematics, S...
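A weaker, nonmonotone acceptance criterion of this kind can be sketched as follows. The function name, the window length M, and the simplification of dropping the sufficient-decrease slope term are assumptions for illustration, not the paper's exact test:

```python
def nonmonotone_accept(f_trial, f_history, M=10, gamma=1e-4, slope=0.0):
    """Nonmonotone acceptance test in the spirit of Grippo-Lampariello-Lucidi.

    The trial value is compared against the maximum of the last M objective
    values instead of only the current one; gamma and slope are the usual
    sufficient-decrease ingredients (this sketch uses slope=0 for brevity).
    """
    f_ref = max(f_history[-M:])
    return f_trial <= f_ref + gamma * slope

# A trial point may increase f relative to the last iterate yet still be
# accepted, provided it improves on the worst recent value.
history = [10.0, 4.0, 3.0]
print(nonmonotone_accept(3.5, history))  # True: 3.5 <= max(history) = 10.0
```

With M = 1 this reduces to the usual monotone test, which is why convergence can be proved "as in the monotone case".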
BOXQUACAN and the implementation of Augmented Lagrangian algorithms for minimization with inequality constraints
, 1998
Abstract

Cited by 7 (2 self)
BOXQUACAN is a trust-region box-constraint optimization software developed at the Applied Mathematics Department of the University of Campinas. During the last five years, it has been used for solving many practical and academic problems with box constraints and it has been incorporated as a subalgorithm of Augmented Lagrangian methods for minimization with equality constraints and bounds. In this paper its use in connection with Augmented Lagrangian algorithms, where inequality constraints are handled without the addition of slack variables, is described. Numerical experiments comparing a modified exponential Lagrangian method and the most classical Augmented Lagrangian method are presented. Institute of Mathematics, University of Campinas, CP 6065, 13081-970 Campinas SP, Brazil. This work was supported by PRONEX, FAPESP (grant 9037246), FINEP, CNPq, FAEP-UNICAMP. 1 Introduction Box-constrained optimization is a well developed area of numerical analysis. It consists of the m...
Spectral Gradient Methods for Linearly Constrained
Abstract
Linearly constrained optimization problems with simple bounds are considered in the present work. First, a preconditioned spectral gradient method is defined for the case in which no simple bounds are present. This algorithm can be viewed as a quasi-Newton method in which the approximate Hessians satisfy a weak secant equation.
A nonmonotonic method for large-scale nonnegative least squares
, 2010
Abstract
We present a new algorithm for nonnegative least-squares (NNLS). Our algorithm extends the unconstrained quadratic optimization algorithm of Barzilai and Borwein (BB) (J. Barzilai and J. M. Borwein; Two-Point Step Size Gradient Methods. IMA J. Numerical Analysis; 1988.) to handle nonnegativity constraints. Our extension differs in several basic aspects from other constrained BB variants. The most notable difference is our modified computation of the BB step size that takes into account the nonnegativity constraints. We further refine the step-size computation by introducing a step-size scaling strategy that, in combination with orthogonal projections onto the nonnegative quadrant, yields an efficient NNLS algorithm. We compare our algorithm with both established convex solvers and a specialized NNLS method: on several synthetic and real-world datasets we observe highly competitive empirical performance.
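The generic combination that this paper refines, a BB step followed by projection onto the nonnegative quadrant, can be sketched as below. The function name and test problem are illustrative, and the plain BB1 step stands in for the paper's modified, constraint-aware step-size computation:

```python
import numpy as np

def nnls_projected_bb(A, b, iters=500):
    """Sketch of a projected BB iteration for min ||Ax - b||^2 s.t. x >= 0.

    Illustrative only: the paper's constraint-aware step size and scaling
    strategy are replaced here by the plain BB1 step.
    """
    x = np.zeros(A.shape[1])
    g = A.T @ (A @ x - b)          # gradient of 0.5 * ||Ax - b||^2
    alpha = 1.0
    for _ in range(iters):
        x_new = np.maximum(x - alpha * g, 0.0)   # project onto x >= 0
        g_new = A.T @ (A @ x_new - b)
        s, y = x_new - x, g_new - g
        if s @ y > 0:
            alpha = (s @ s) / (s @ y)            # plain BB1 step
        x, g = x_new, g_new
    return x

A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([1.0, -1.0, 0.5])
x = nnls_projected_bb(A, b)
# The unconstrained least-squares solution has a negative second
# component; the NNLS solution pins it to the bound instead.
```

The paper's contribution is precisely that the step size itself accounts for the active nonnegativity constraints, rather than being computed as if the problem were unconstrained.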