Results 1–10 of 10
Nonmonotone spectral projected gradient methods on convex sets
SIAM Journal on Optimization, 2000
Cited by 133 (25 self)
Abstract. Nonmonotone projected gradient techniques are considered for the minimization of differentiable functions on closed convex sets. The classical projected gradient schemes are extended to include a nonmonotone steplength strategy that is based on the Grippo–Lampariello–Lucidi nonmonotone line search. In particular, the nonmonotone strategy is combined with the spectral gradient choice of steplength to accelerate the convergence process. In addition to the classical projected gradient nonlinear path, the feasible spectral projected gradient is used as a search direction to avoid additional trial projections during the one-dimensional search process. Convergence properties and extensive numerical results are presented.
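The scheme described in this abstract can be sketched in a few lines of Python. The rendering below is illustrative only: the parameter names, safeguards, and backtracking factor are my own choices, not the paper's, and `f`, `grad`, and `proj` are user-supplied callables.

```python
import numpy as np

def spg(f, grad, proj, x0, max_iter=1000, M=10, gamma=1e-4, tol=1e-8):
    """Illustrative sketch of a nonmonotone spectral projected gradient method.

    f: objective, grad: its gradient, proj: projection onto the closed
    convex feasible set.  Parameter names are illustrative, not canonical.
    """
    x = proj(np.asarray(x0, dtype=float))
    g = grad(x)
    alpha = 1.0                      # initial spectral steplength
    recent_f = [f(x)]                # last M values: the GLL reference set
    for _ in range(max_iter):
        d = proj(x - alpha * g) - x  # feasible spectral projected gradient direction
        if np.linalg.norm(d, np.inf) < tol:
            break
        f_ref = max(recent_f)        # nonmonotone (GLL) reference value
        lam, gTd = 1.0, g @ d
        while f(x + lam * d) > f_ref + gamma * lam * gTd and lam > 1e-12:
            lam *= 0.5               # backtrack along the feasible direction
        x_new = x + lam * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        alpha = (s @ s) / sy if sy > 0 else 1.0   # spectral (Barzilai-Borwein) step
        alpha = min(max(alpha, 1e-10), 1e10)      # keep the step in a safe interval
        x, g = x_new, g_new
        recent_f.append(f(x))
        if len(recent_f) > M:
            recent_f.pop(0)
    return x
```

For example, minimizing a shifted quadratic over the box [0, 1]^2 drives the iterate to the projection of the unconstrained minimizer onto the box.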
Large-Scale Active-Set Box-Constrained Optimization Method with Spectral Projected Gradients
Computational Optimization and Applications, 2001
Cited by 59 (9 self)
A new active-set method for smooth box-constrained minimization is introduced. The algorithm combines an unconstrained method, including a new line search which aims to add many constraints to the working set at a single iteration, with a recently introduced technique (spectral projected gradient) for dropping constraints from the working set. Global convergence is proved. A computer implementation is fully described and a numerical comparison assesses the reliability of the new algorithm. Keywords: box-constrained minimization, numerical methods, active-set strategies, spectral projected gradient.
A Box-Constrained Optimization Algorithm with Negative Curvature Directions and Spectral Projected Gradients
2001
Cited by 28 (5 self)
A practical algorithm for box-constrained optimization is introduced. The algorithm combines an active-set strategy with spectral projected gradient iterations. In the interior of each face a strategy that deals efficiently with negative curvature is employed. Global convergence results are given. Numerical results are presented. Keywords: box-constrained minimization, active-set methods, spectral projected gradients, dogleg path methods. AMS Subject Classification: 49M07, 49M10, 65K, 90C06, 90C20.
A new active set algorithm for box constrained optimization
SIAM Journal on Optimization, 2006
Cited by 26 (6 self)
Abstract. An active set algorithm (ASA) for box constrained optimization is developed. The algorithm consists of a nonmonotone gradient projection step, an unconstrained optimization step, and a set of rules for branching between the two steps. Global convergence to a stationary point is established. For a nondegenerate stationary point, the algorithm eventually reduces to unconstrained optimization without restarts. Similarly, for a degenerate stationary point, where the strong second-order sufficient optimality condition holds, the algorithm eventually reduces to unconstrained optimization without restarts. A specific implementation of the ASA is given which exploits the recently developed cyclic Barzilai–Borwein (CBB) algorithm for the gradient projection step and the recently developed conjugate gradient algorithm CG_DESCENT for unconstrained optimization. Numerical experiments are presented using box constrained problems in the CUTEr and MINPACK-2 test problem libraries. Key words. nonmonotone gradient projection, box constrained optimization, active set algorithm,
Optimizing the Packing of Cylinders into a Rectangular Container: A Nonlinear Approach
2003
Cited by 19 (1 self)
The container loading problem has important industrial and commercial applications. An increase in the number of items in a container leads to a decrease in cost. For this reason the related optimization problem is of economic importance. In this work, a procedure based on a nonlinear decision problem to solve the cylinder packing problem with identical diameters is presented. This formulation is based on the fact that the centers of the cylinders have to be inside the rectangular box defined by the base of the container (a radius away from the frontier) and at least one diameter away from each other. With this basic premise the procedure tries to find the maximum number of cylinder centers that satisfy these restrictions. The continuous nature of the problem is one of the reasons that motivated this study. A comparative study with other methods of the literature is presented and better results are achieved.
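The two constraint families of the packing model stated above (each center at least a radius away from the container walls, and centers at least one diameter apart) can be verified with a short feasibility check. The function and its names below are illustrative, not part of the paper's procedure:

```python
import numpy as np

def feasible_packing(centers, width, height, r):
    """Check the two constraint families of the cylinder packing model
    (a sketch with illustrative names, not the paper's algorithm):
    centers lie a radius r away from the container walls, and all
    pairwise center distances are at least one diameter (2 * r)."""
    c = np.asarray(centers, dtype=float)
    inside = np.all((c[:, 0] >= r) & (c[:, 0] <= width - r) &
                    (c[:, 1] >= r) & (c[:, 1] <= height - r))
    diffs = c[:, None, :] - c[None, :, :]          # all pairwise differences
    dist = np.sqrt((diffs ** 2).sum(axis=-1))
    n = len(c)
    separated = np.all(dist[np.triu_indices(n, k=1)] >= 2 * r)
    return bool(inside and separated)
```

For instance, two unit-radius cylinders centered at (1, 1) and (3, 1) fit a 4 x 2 base, while moving the second center to (2, 1) violates the separation constraint.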
SPG: Software for Convex-Constrained Optimization
2001
Cited by 12 (4 self)
In this paper we describe Fortran 77 software that implements the nonmonotone spectral projected gradient (SPG) algorithm. The SPG method applies to problems of the form min f(x) subject to x ∈ Ω, where Ω is a closed convex set in R^n. It is assumed that f is defined and has continuous partial derivatives on an open set that contains Ω. Users of the software must supply subroutines to compute the function f(x), the gradient ∇f(x), and projections of an arbitrary point x onto Ω. Information about the Hessian matrix is not required and the storage requirements are minimal. Therefore, the algorithm is appropriate for large-scale convex-constrained optimization problems with affordable projections onto the feasible set. Notice that the algorithm is also suitable for unconstrained optimization problems simply by setting Ω = R^n.
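The user-supplied projection routine mentioned in the abstract is cheap in closed form for many sets Ω; the Euclidean ball is a standard example. A minimal sketch (function and argument names are illustrative):

```python
import numpy as np

def project_onto_ball(x, center, radius):
    """Euclidean projection onto {z : ||z - center||_2 <= radius},
    the kind of user-supplied projection routine SPG-style software
    expects (illustrative names, not the Fortran 77 interface)."""
    d = x - center
    nd = np.linalg.norm(d)
    if nd <= radius:
        return x.copy()          # already feasible: the point is its own projection
    return center + (radius / nd) * d   # scale back onto the sphere
```

For example, projecting (3, 4) onto the unit ball at the origin yields (0.6, 0.8).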
Augmented Lagrangian algorithms based on the spectral projected gradient method for solving nonlinear programming problems
Cited by 12 (3 self)
The Spectral Projected Gradient method (SPG) is an algorithm for large-scale bound-constrained optimization introduced recently by Birgin, Martínez and Raydan. It is based on Raydan's unconstrained generalization of the Barzilai-Borwein method for quadratics. The SPG algorithm turned out to be surprisingly effective for solving many large-scale minimization problems with box constraints. Therefore, it is natural to test its performance for solving the subproblems that appear in nonlinear programming methods based on augmented Lagrangians. In this work, augmented Lagrangian methods which use SPG as the underlying convex-constraint solver are introduced (ALSPG), and the methods are tested in two sets of problems. First, a meaningful subset of large-scale nonlinearly constrained problems of the CUTE collection is solved and compared with the performance of LANCELOT. Second, a family of location problems in the minimax formulation is solved against the package FFSQP.
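The outer/inner structure described above (an augmented Lagrangian loop whose bound-constrained subproblems go to a projected-gradient solver) can be sketched as follows. This is an illustrative simplification, not the ALSPG implementation: a plain backtracking projected-gradient loop stands in for SPG, and all names and tolerances are my own.

```python
import numpy as np

def aug_lag(f, gf, c, jc, proj, x, rho=10.0, outer=15, inner=200):
    """Illustrative augmented Lagrangian sketch for equality constraints
    c(x) = 0 over a convex set given by proj.  A plain projected-gradient
    loop replaces SPG as the inner solver (not the ALSPG code)."""
    x = proj(np.asarray(x, dtype=float))
    lam = np.zeros(len(c(x)))                 # multiplier estimates
    for _ in range(outer):
        def L(z):                             # augmented Lagrangian value
            cz = c(z)
            return f(z) + lam @ cz + 0.5 * rho * cz @ cz
        def gL(z):                            # its gradient
            return gf(z) + jc(z).T @ (lam + rho * c(z))
        for _ in range(inner):                # inner projected-gradient solve
            g = gL(x)
            step = 1.0
            x_try = proj(x - step * g)
            while L(x_try) > L(x) - 1e-4 * g @ (x - x_try) and step > 1e-12:
                step *= 0.5                   # backtrack along the projection arc
                x_try = proj(x - step * g)
            if np.linalg.norm(x_try - x) < 1e-10:
                break
            x = x_try
        lam = lam + rho * c(x)                # first-order multiplier update
    return x
```

On min x1^2 + x2^2 subject to x1 + x2 = 1 over the box [0, 2]^2, the iterates approach (0.5, 0.5).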
Partial Spectral Projected Gradient Method with Active-Set Strategy for Linearly Constrained Optimization
2009
Cited by 4 (0 self)
A method for linearly constrained optimization which modifies and generalizes recent box-constraint optimization algorithms is introduced. The new algorithm is based on a relaxed form of Spectral Projected Gradient iterations. Intercalated with these projected steps, internal iterations restricted to faces of the polytope are performed, which enhance the efficiency of the algorithm. Convergence proofs are given and numerical experiments are included and commented. Software supporting this paper is available through the Tango
Minimization Subproblems and Heuristics for an Applied Clustering Problem
2001
Cited by 2 (1 self)
A practical problem that requires the classification of a set of points of R^n using a criterion not sensitive to bounded outliers is studied in this paper. A fixed-point (k-means) algorithm is defined that uses an arbitrary distance function. Finite convergence is proved. A robust distance defined by Boente, Fraiman and Yohai is selected for applications. Smooth approximations of this distance are defined and suitable heuristics are introduced to enhance the probability of finding global optimizers. A real-life example is presented and commented.
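A fixed-point k-means-style scheme with an arbitrary distance, as described above, alternates assignment and center updates until the centers stop moving. The sketch below is illustrative only: the robust Boente-Fraiman-Yohai distance is not reproduced (the usage example plugs in a plain Euclidean distance), and the function and argument names are my own.

```python
import numpy as np

def kmeans_fixed_point(points, k, dist, center_fn, iters=100, seed=0):
    """Fixed-point clustering with a user-chosen distance (a sketch):
    dist(p, c) -> scalar; center_fn(cluster_points) -> new center."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # assignment step: each point goes to its nearest center
        labels = np.array([min(range(k), key=lambda j: dist(p, centers[j]))
                           for p in points])
        # update step: recompute each center from its cluster
        new = np.array([center_fn(points[labels == j]) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):    # fixed point reached
            break
        centers = new
    return centers, labels
```

With the Euclidean distance and the mean as `center_fn`, this reduces to ordinary k-means; the robust variant would swap in a bounded-influence distance and a matching center update.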
Spectral Projected Gradient methods: Review and Perspectives
Over the last two decades, it has been observed that using the gradient vector as a search direction in large-scale optimization may lead to efficient algorithms. The effectiveness relies on choosing the step lengths according to novel ideas that are related to the spectrum of the underlying local Hessian rather than to the standard decrease in the objective function. A review of these so-called spectral projected gradient methods for convex constrained optimization is presented. To illustrate the performance of these low-cost schemes, an optimization problem on the set of positive definite matrices is described.
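The link between the steplength and the Hessian spectrum mentioned above is easiest to see on a quadratic f(x) = (1/2) x^T A x, where the gradient difference satisfies y = A s exactly, so the reciprocal of the spectral (Barzilai-Borwein) step s^T s / s^T y is a Rayleigh quotient of A and therefore lies between the extreme eigenvalues. A small illustrative demo (all names are my own):

```python
import numpy as np

# For a quadratic f(x) = 0.5 x^T A x, the gradient difference is y = A s,
# so 1/alpha = (s^T y)/(s^T s) is a Rayleigh quotient of the Hessian A
# and must lie in [lambda_min, lambda_max].  Illustrative demo.
rng = np.random.default_rng(0)
A = rng.normal(size=(5, 5))
A = A @ A.T + np.eye(5)              # symmetric positive definite Hessian
x0 = rng.normal(size=5)
x1 = x0 - 0.1 * (A @ x0)             # one plain gradient step
s = x1 - x0                          # iterate difference
y = A @ x1 - A @ x0                  # gradient difference (equals A @ s)
alpha = (s @ s) / (s @ y)            # spectral (Barzilai-Borwein) steplength
evals = np.linalg.eigvalsh(A)        # eigenvalues in ascending order
assert evals[0] <= 1.0 / alpha <= evals[-1]
```

This is why such steps are called "spectral": they implicitly sample the local Hessian's spectrum instead of monitoring objective decrease.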