Results 1 – 3 of 3
Improving traditional subgradient scheme for Lagrangean relaxation: an application to location problems
 International Journal of Mathematical Algorithms
, 1999
"... Lagrangean relaxation is largely used to solve combinatorial optimization problems. A known problem for Lagrangean relaxation application is the definition of convenient step size control in subgradient like methods. Even preserving theoretical convergence properties, a wrong defined control can ref ..."
Abstract

Cited by 2 (2 self)
Lagrangean relaxation is widely used to solve combinatorial optimization problems. A known difficulty in applying Lagrangean relaxation is the definition of a convenient step size control in subgradient-like methods. Even when theoretical convergence properties are preserved, a poorly defined control can degrade performance and increase computational times, a critical point for large-scale instances. We show in this work how to accelerate a classical subgradient method, using the local information of the surrogate constraints relaxed in the Lagrangean relaxation. This results in a one-dimensional search that corrects the step size and is independent of the step size control used. The application to Capacitated and Uncapacitated Facility Location problems is shown. Several computational tests confirm the superiority of this scheme. Key words: Location problems, Lagrangean relaxation, Subgradient method. 1. Introduction Facility location is the problem of locating a number of facilities from a s...
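The step size control the abstract refers to is typically the classical Polyak-style rule. The sketch below is a generic textbook subgradient update for Lagrangean multipliers, not the accelerated one-dimensional search proposed in the paper; the function name and the fixed parameter theta are illustrative assumptions.

```python
def subgradient_step(lam, g, lower_bound, upper_bound, theta=2.0):
    """One classical subgradient update for Lagrangean multipliers.

    Illustrative sketch of the textbook step size rule (not the paper's
    accelerated scheme): step = theta * (UB - LB) / ||g||^2, where g is
    a subgradient of the Lagrangean dual at lam, LB is the current dual
    bound and UB the best known primal bound. Multipliers are projected
    onto the nonnegative orthant, as is usual when the relaxed
    constraints are inequalities.
    """
    norm_sq = sum(gi * gi for gi in g)
    if norm_sq == 0.0:
        return list(lam)  # zero subgradient: current multipliers are dual optimal
    step = theta * (upper_bound - lower_bound) / norm_sq
    return [max(li + step * gi, 0.0) for li, gi in zip(lam, g)]
```

In practice theta is halved whenever the dual bound fails to improve for a number of iterations; a poorly tuned schedule of this kind is exactly the performance problem the paper addresses.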
An Inexact Modified Subgradient Algorithm for Nonconvex Optimization
, 2008
"... We propose and analyze an inexact version of the modified subgradient (MSG) algorithm, which we call the IMSG algorithm, for nonsmooth and nonconvex optimization over a compact set. We prove that under an approximate, i.e. inexact, minimization of the sharp augmented Lagrangian, the main convergence ..."
Abstract

Cited by 1 (0 self)
We propose and analyze an inexact version of the modified subgradient (MSG) algorithm, which we call the IMSG algorithm, for nonsmooth and nonconvex optimization over a compact set. We prove that under an approximate, i.e. inexact, minimization of the sharp augmented Lagrangian, the main convergence properties of the MSG algorithm are preserved for the IMSG algorithm. Inexact minimization may allow problems to be solved with less computational effort. We illustrate this through test problems, including an optimal bang–bang control problem, under several different inexactness schemes.
The volume algorithm: producing . . .
 MATH. PROGRAM., SER. A 87: 385–399 (2000)
, 2000
"... We present an extension to the subgradient algorithm to produce primal as well as dual solutions. It can be seen as a fast way to carry out an approximation of DantzigWolfe decomposition. This gives a fast method for producing approximations for large scale linear programs. It is based on a new the ..."
Abstract
We present an extension to the subgradient algorithm that produces primal as well as dual solutions. It can be seen as a fast way to carry out an approximation of Dantzig-Wolfe decomposition. This gives a fast method for producing approximations for large-scale linear programs. It is based on a new theorem in linear programming duality. We present successful experience with linear programs coming from set partitioning, set covering, max-cut and plant location. Key words. subgradient algorithm – Dantzig-Wolfe decomposition – large scale linear programming
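The primal side of this extension rests on averaging the solutions of successive Lagrangean subproblems. The sketch below shows only that averaging idea in isolation; the function name and the fixed weight alpha are hypothetical, and the actual algorithm chooses the weight adaptively alongside the dual update.

```python
def primal_average(x_bar, x_new, alpha=0.1):
    """Convex combination of primal subproblem solutions.

    Sketch of the primal-averaging idea: x_bar is the running primal
    estimate and x_new the solution of the latest Lagrangean
    subproblem. Taking x_bar <- alpha * x_new + (1 - alpha) * x_bar
    yields an approximate primal solution even though each individual
    subproblem solution may be far from feasible. alpha here is a
    fixed illustrative weight, not the algorithm's adaptive choice.
    """
    return [alpha * xn + (1.0 - alpha) * xb for xb, xn in zip(x_bar, x_new)]
```

Iterating this update alongside a subgradient step on the dual multipliers gives the pair of approximate primal and dual solutions the abstract describes.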