Results 1–9 of 9
Nonmonotone spectral projected gradient methods on convex sets
 SIAM Journal on Optimization
, 2000
Cited by 133 (25 self)
Abstract. Nonmonotone projected gradient techniques are considered for the minimization of differentiable functions on closed convex sets. The classical projected gradient schemes are extended to include a nonmonotone steplength strategy that is based on the Grippo–Lampariello–Lucidi nonmonotone line search. In particular, the nonmonotone strategy is combined with the spectral gradient choice of steplength to accelerate the convergence process. In addition to the classical projected gradient nonlinear path, the feasible spectral projected gradient is used as a search direction to avoid additional trial projections during the one-dimensional search process. Convergence properties and extensive numerical results are presented.
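The iteration the abstract describes (a Barzilai–Borwein "spectral" steplength combined with the Grippo–Lampariello–Lucidi nonmonotone test, searching along the feasible SPG direction so no extra projections are needed during backtracking) can be sketched as follows. This is an illustrative Python sketch, not the authors' implementation; the defaults and safeguards are assumptions.

```python
import numpy as np

def spg(f, grad, project, x0, max_iter=1000, M=10, gamma=1e-4, tol=1e-8):
    """Sketch of the nonmonotone spectral projected gradient (SPG) method.

    f, grad  : objective and its gradient
    project  : Euclidean projection onto the feasible convex set
    M        : memory of the GLL nonmonotone line search
    """
    x = project(np.asarray(x0, dtype=float))
    g = grad(x)
    lam = 1.0                                  # initial spectral steplength
    fvals = [f(x)]                             # recent values for the GLL test
    for _ in range(max_iter):
        d = project(x - lam * g) - x           # feasible SPG search direction
        if np.linalg.norm(d) < tol:            # stationary for the projected step
            break
        fmax = max(fvals[-M:])                 # nonmonotone reference value
        alpha, gtd = 1.0, g @ d
        # Backtrack along d: x + alpha*d stays feasible, so no trial projections.
        while f(x + alpha * d) > fmax + gamma * alpha * gtd and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        lam = (s @ s) / sy if sy > 0 else 1.0  # Barzilai-Borwein (spectral) steplength
        lam = min(max(lam, 1e-10), 1e10)       # safeguard the steplength
        x, g = x_new, g_new
        fvals.append(f(x))
    return x
```

For example, minimizing ||x - c||^2 over the box [0,1]^2 with c = (2, -1) converges to the clipped point (1, 0).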
A Box-Constrained Optimization Algorithm With Negative Curvature Directions and Spectral Projected Gradients
, 2001
Cited by 28 (5 self)
A practical algorithm for box-constrained optimization is introduced. The algorithm combines an active-set strategy with spectral projected gradient iterations. In the interior of each face a strategy that deals efficiently with negative curvature is employed. Global convergence results are given. Numerical results are presented. Keywords: box-constrained minimization, active-set methods, spectral projected gradients, dogleg path methods. AMS Subject Classification: 49M07, 49M10, 65K, 90C06, 90C20.
A New Active Set Algorithm for Box Constrained Optimization
 SIAM Journal on Optimization
, 2006
Cited by 26 (6 self)
Abstract. An active set algorithm (ASA) for box constrained optimization is developed. The algorithm consists of a nonmonotone gradient projection step, an unconstrained optimization step, and a set of rules for branching between the two steps. Global convergence to a stationary point is established. For a nondegenerate stationary point, the algorithm eventually reduces to unconstrained optimization without restarts. Similarly, for a degenerate stationary point, where the strong second-order sufficient optimality condition holds, the algorithm eventually reduces to unconstrained optimization without restarts. A specific implementation of the ASA is given which exploits the recently developed cyclic Barzilai–Borwein (CBB) algorithm for the gradient projection step and the recently developed conjugate gradient algorithm CG_DESCENT for unconstrained optimization. Numerical experiments are presented using box constrained problems in the CUTEr and MINPACK-2 test problem libraries. Key words. nonmonotone gradient projection, box constrained optimization, active set algorithm,
SPG: Software for Convex-Constrained Optimization
, 2001
Cited by 12 (4 self)
In this paper we describe Fortran 77 software that implements the nonmonotone spectral projected gradient (SPG) algorithm. The SPG method applies to problems of the form min f(x) subject to x ∈ Ω, where Ω is a closed convex set in R^n. It is assumed that f is defined and has continuous partial derivatives on an open set that contains Ω. Users of the software must supply subroutines to compute the function f(x), the gradient ∇f(x), and projections of an arbitrary point x onto Ω. Information about the Hessian matrix is not required and the storage requirements are minimal. Therefore, the algorithm is appropriate for large-scale convex-constrained optimization problems with affordable projections onto the feasible set. Notice that the algorithm is also suitable for unconstrained optimization problems simply by setting Ω = R^n.
Gradient method with dynamical retards for large-scale optimization problems
 Electronic Transactions on Numerical Analysis (ETNA)
, 2003
Cited by 3 (1 self)
Abstract. We consider a generalization of the gradient method with retards for the solution of large-scale unconstrained optimization problems. Recently, the gradient method with retards was introduced to find global minimizers of large-scale quadratic functions. The most interesting feature of this method is that it does not require a decrease in the objective function, which allows fast local convergence. On the other hand, nonmonotone globalization strategies that preserve local behavior for the nonquadratic case have proved to be very effective when associated with low-storage methods. In this work, the gradient method with retards is generalized and combined in a dynamical way with nonmonotone globalization strategies to obtain a new method for minimizing nonquadratic functions that can deal efficiently with large problems. Encouraging numerical experiments on well-known test problems are presented. Key words. spectral gradient method, nonmonotone line search, Barzilai–Borwein method, Polak–Ribière method, Rayleigh quotient.
Minimization Subproblems and Heuristics for an Applied Clustering Problem
, 2001
Cited by 2 (1 self)
A practical problem that requires the classification of a set of points of R^n using a criterion not sensitive to bounded outliers is studied in this paper. A fixed-point (k-means) algorithm is defined that uses an arbitrary distance function. Finite convergence is proved. A robust distance defined by Boente, Fraiman and Yohai is selected for applications. Smooth approximations of this distance are defined and suitable heuristics are introduced to enhance the probability of finding global optimizers. A real-life example is presented and discussed.
The cyclic Barzilai–Borwein method for unconstrained optimization
, 2005
In the cyclic Barzilai–Borwein (CBB) method, the same Barzilai–Borwein (BB) stepsize is reused for m consecutive iterations. It is proved that CBB is locally linearly convergent at a local minimizer with positive definite Hessian. Numerical evidence indicates that when m > n/2 ≥ 3, where n is the problem dimension, CBB is locally superlinearly convergent. In the special case m = 3 and n = 2, it is proved that the convergence rate is no better than linear, in general. An implementation of the CBB method, called adaptive cyclic Barzilai–Borwein (ACBB), combines a nonmonotone line search and an adaptive choice for the cycle length m. In numerical experiments using the CUTEr test problem library, ACBB performs better than the existing BB gradient algorithm, while it is competitive with the well-known PRP+ conjugate gradient algorithm.
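The stepsize-reuse idea can be sketched minimally as follows. This is not the ACBB implementation (which adds a nonmonotone line search and an adaptive cycle length); it is a bare cyclic BB gradient iteration with assumed defaults, so it is only reliable on well-conditioned strongly convex quadratics.

```python
import numpy as np

def cbb(grad, x0, m=4, max_iter=500, tol=1e-8):
    """Sketch of the cyclic Barzilai-Borwein (CBB) gradient method:
    the BB stepsize is recomputed only once per cycle of m iterations
    and reused in between."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    lam = 1.0                                  # initial stepsize (assumption)
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - lam * g                    # plain gradient step, reused stepsize
        g_new = grad(x_new)
        if k % m == m - 1:                     # end of cycle: refresh the stepsize
            s, y = x_new - x, g_new - g
            sy = s @ y
            if sy > 0:
                lam = (s @ s) / sy             # BB1 stepsize
        x, g = x_new, g_new
    return x
```

On the quadratic f(x) = 0.5 x^T A x with A = diag(0.5, 1), gradient A x, the iterates from (1, 1) reach the minimizer at the origin within a few cycles.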