Results 1 - 10 of 22
An effective implementation of the Lin-Kernighan traveling salesman heuristic
 European Journal of Operational Research
, 2000
Abstract

Cited by 188 (1 self)
This report describes an implementation of the Lin-Kernighan heuristic, one of the most successful methods for generating optimal or near-optimal solutions for the symmetric traveling salesman problem. Computational tests show that the implementation is highly effective. It has found optimal solutions for all solved problem instances we have been able to obtain, including a 7397-city problem (the largest nontrivial problem instance solved to optimality today). Furthermore, the algorithm has improved the best known solutions for a series of large-scale problems with unknown optima, among these an 85,900-city problem.
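Lin-Kernighan generalizes simple k-opt exchange moves with a variable-depth search. As a rough, much-simplified illustration of the same family (not the implementation described above), a plain 2-opt local search repeatedly reverses a tour segment whenever doing so shortens the tour:

```python
import math

def tour_length(tour, dist):
    """Total length of a closed tour given a distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def two_opt(tour, dist):
    """Plain 2-opt: whenever swapping two edges shortens the tour, reverse the
    segment between them; repeat until no improving move exists."""
    n, improved = len(tour), True
    while improved:
        improved = False
        for i in range(n - 1):
            for j in range(i + 2, n):
                if i == 0 and j == n - 1:
                    continue  # these two edges share a city; skip
                a, b = tour[i], tour[i + 1]
                c, d = tour[j], tour[(j + 1) % n]
                # replace edges (a,b) and (c,d) by (a,c) and (b,d)?
                if dist[a][c] + dist[b][d] < dist[a][b] + dist[c][d] - 1e-12:
                    tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                    improved = True
    return tour
```

Lin-Kernighan's strength comes from chaining many such edge exchanges into one compound move of variable depth, which is what makes it effective at the problem sizes mentioned above.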
On a modified subgradient algorithm for dual problems via sharp augmented Lagrangian
 Journal of Global Optimization
, 2006
Abstract

Cited by 8 (2 self)
We study convergence properties of a modified subgradient algorithm, applied to the dual problem defined by the sharp augmented Lagrangian. The primal problem we consider is nonconvex and nondifferentiable, with equality constraints. We obtain primal and dual convergence results, as well as a condition for the existence of a dual solution. Using a practical selection of the step-size parameters, we demonstrate the algorithm and its advantages on test problems, including an integer programming and an optimal control problem. Key words: Nonconvex programming; nonsmooth optimization; augmented Lagrangian; sharp Lagrangian; subgradient optimization.
Convergence of a Simple Subgradient Level Method
 Math. Programming
, 1998
Abstract

Cited by 6 (0 self)
We study the subgradient projection method for convex optimization with Brännlund's level control for estimating the optimal value. We establish global convergence in objective values without additional assumptions employed in the literature. Key words: Nondifferentiable optimization, subgradient optimization.

1 Introduction
We consider a method for the minimization problem $f^* = \inf_S f$ under the following assumptions: $S$ is a nonempty closed convex set in $\mathbb{R}^n$; $f : \mathbb{R}^n \to \mathbb{R}$ is a convex function; for each $x \in S$ we can compute $f(x)$ and a subgradient $g_f(x) \in \partial f(x)$ of $f$ at $x$; and for each $x \in \mathbb{R}^n$ we can find $P_S x = \arg\min_{y \in S} |x - y|$, its orthogonal projection on $S$, where $|\cdot|$ denotes the Euclidean norm. The optimal set $\operatorname{Arg\,min}_S f$ may be empty. Given the $k$th iterate $x^k \in S$ and a target level $f^k_{\mathrm{lev}}$ that estimates $f^*$, we may use

$$H_k = \{\, x : f(x^k) + \langle g^k, x - x^k \rangle \le f^k_{\mathrm{lev}} \,\} \quad \text{with } g^k = g_f(x^k) \in \partial f(x^k) \tag{1.1}$$

to approximate t...
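As a concrete, heavily simplified sketch of this setting: a projected subgradient iteration with a Polyak-type step. Here the target level is taken to be the known optimal value rather than a dynamically adjusted estimate, and the feasible set is a box so the projection is trivial; both are illustrative assumptions, not the level-control method analyzed in the paper.

```python
import numpy as np

def projected_subgradient(f, subgrad, x0, lo, hi, f_lev, iters=100):
    """Minimize a convex f over the box S = [lo, hi]^n.
    Update: x_{k+1} = P_S(x_k - t_k g_k) with t_k = (f(x_k) - f_lev) / |g_k|^2,
    where f_lev is a target level estimating the optimal value."""
    x = np.clip(np.asarray(x0, dtype=float), lo, hi)  # P_S is just clipping
    best = f(x)
    for _ in range(iters):
        g = np.asarray(subgrad(x), dtype=float)
        gg = float(np.dot(g, g))
        if gg == 0.0:
            break  # 0 is a subgradient: x is optimal
        t = max(f(x) - f_lev, 0.0) / gg
        x = np.clip(x - t * g, lo, hi)
        best = min(best, f(x))
    return best
```

With f(x) the l1-norm on the box [-1, 2]^2 and f_lev = 0 (the true optimum), the iteration reaches the optimal value immediately; in general only the best value found so far is guaranteed to converge.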
Subgradient optimization methods in integer programming with an application to a radiation therapy problem
, 2003
Abstract

Cited by 5 (0 self)
Acknowledgments. I would like to express my sincere gratitude to my supervisor Prof. Dr. Horst W. Hamacher for making this research work possible. His guidance, valuable suggestions and great support have been a good influence in bringing this work to a successful conclusion. I also want to express my thanks to Prof. Dr. Francesco Maffioli for his good will and his effort in reading and evaluating my thesis. I am also indebted to Prof. Dr. Helmut Neunzert, who not only gave me the chance to come to Germany but was also genuinely and paternally concerned about my personal life in general and my academic work in particular. I would also like to thank Prof. Dr. Dietmar Schweigert for his support during the beginning of my Ph.D. studies. Many thanks go to all my colleagues in the Optimization Group, AG Hamacher, at the University of Kaiserslautern for the good working atmosphere as well as for their friendly and unreserved support, which have always made me feel like being in my family at
Improving traditional subgradient scheme for Lagrangean relaxation: an application to location problems
 International Journal of Mathematical Algorithms
, 1999
Abstract

Cited by 5 (2 self)
Lagrangean relaxation is widely used to solve combinatorial optimization problems. A known difficulty in applying Lagrangean relaxation is the definition of a convenient step-size control in subgradient-like methods. Even when theoretical convergence properties are preserved, a poorly defined control can degrade performance and increase computational times, a critical point for large-scale instances. In this work we show how to accelerate a classical subgradient method using the local information of the surrogate constraints relaxed in the Lagrangean relaxation. This results in a one-dimensional search that corrects the step size and is independent of the step-size control used. The application to Capacitated and Uncapacitated Facility Location problems is shown. Several computational tests confirm the superiority of this scheme. Key words: Location problems, Lagrangean relaxation, Subgradient method. 1. Introduction Facility location is the problem of locating a number of facilities from a s...
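For orientation, the classical scheme such work starts from can be sketched on a toy Lagrangean dual: a single knapsack capacity constraint is relaxed and the multiplier is updated with a standard Polyak-type step using a known primal bound. The instance, names, and step rule are illustrative, not the paper's accelerated method.

```python
def lagrangian_dual_knapsack(values, weights, cap, lb, theta=1.0, iters=100):
    """Relax the capacity constraint of max{sum v_i x_i : sum w_i x_i <= cap,
    x binary}.  The dual function L(lam) = sum_i max(0, v_i - lam*w_i) + lam*cap
    upper-bounds the optimum; minimize it over lam >= 0 by subgradient descent
    with the classical step t = theta * (L(lam) - lb) / g^2, lb a primal bound."""
    lam, best = 0.0, float("inf")
    for _ in range(iters):
        take = [i for i in range(len(values)) if values[i] - lam * weights[i] > 0]
        L = sum(values[i] - lam * weights[i] for i in take) + lam * cap
        best = min(best, L)
        g = cap - sum(weights[i] for i in take)  # a subgradient of L at lam
        if g == 0:
            break
        lam = max(0.0, lam - theta * (L - lb) / (g * g) * g)
    return best
```

On a two-item instance (values 10 and 7, weights 5 and 4, capacity 5, best feasible value 10), the dual bound closes to 10 within two iterations; with a poorly tuned step-size control the multiplier can instead oscillate for many iterations, which is exactly the cost such acceleration schemes aim to avoid.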
An Inexact Modified Subgradient Algorithm for Nonconvex Optimization ∗
, 2008
Abstract

Cited by 4 (1 self)
We propose and analyze an inexact version of the modified subgradient (MSG) algorithm, which we call the IMSG algorithm, for nonsmooth and nonconvex optimization over a compact set. We prove that under an approximate, i.e. inexact, minimization of the sharp augmented Lagrangian, the main convergence properties of the MSG algorithm are preserved for the IMSG algorithm. Inexact minimization may allow problems to be solved with less computational effort. We illustrate this through test problems, including an optimal bang-bang control problem, under several different inexactness schemes.
An Infeasible-Point Subgradient Method Using Adaptive Approximate Projections
Abstract

Cited by 2 (1 self)
We propose a new subgradient method for the minimization of nonsmooth convex functions over a convex set. To speed up computations we use adaptive approximate projections, which only require moving within a certain distance of the exact projections (a distance that decreases in the course of the algorithm). In particular, the iterates in our method can be infeasible throughout the whole procedure. Nevertheless, we provide conditions which ensure convergence to an optimal feasible point under suitable assumptions. One convergence result deals with step-size sequences that are fixed a priori. Two other results handle dynamic Polyak-type step sizes depending on a lower or upper estimate of the optimal objective function value, respectively. Additionally, we briefly sketch two applications: optimization with convex chance constraints, and finding the minimum ℓ1-norm solution to an underdetermined linear system, an important problem in Compressed Sensing.
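A minimal sketch of iterating with approximate projections, using a Euclidean ball as the feasible set, a priori steps 1/k, and a projection accuracy of 1/k^2 per iteration. The accuracy schedule and the problem are illustrative assumptions, not the paper's adaptive criterion.

```python
import numpy as np

def project_ball(x, c, r):
    """Exact Euclidean projection onto the ball B(c, r)."""
    d = x - c
    n = np.linalg.norm(d)
    return x if n <= r else c + (r / n) * d

def approx_project_ball(x, c, r, eps):
    """Return any point within eps of the exact projection -- here, the point
    on the segment toward it -- so the iterate may stay slightly infeasible."""
    p = project_ball(x, c, r)
    d = x - p
    n = np.linalg.norm(d)
    return x if n <= eps else p + (eps / n) * d

def infeasible_subgradient(c, r, x0, iters=100):
    """Minimize ||x||_1 over B(c, r) with a priori step sizes 1/k and
    projection accuracies 1/k^2, both fixed before the run."""
    x = np.asarray(x0, dtype=float)
    for k in range(1, iters + 1):
        g = np.sign(x)  # a subgradient of the l1-norm
        x = approx_project_ball(x - g / k, c, r, 1.0 / k**2)
    return project_ball(x, c, r)  # one exact projection at the end
```

The iterates hover just outside the ball while the allowed projection error shrinks, so the sequence still approaches an optimal feasible point even though no iterate before the last is feasible.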
Fast and Low Complexity Blind Equalization via Subgradient Projections
, 2005
Abstract

Cited by 1 (0 self)
We propose a novel blind equalization method based on subgradient search over a convex cost surface. This is an alternative to existing iterative blind equalization approaches such as the Constant Modulus Algorithm (CMA), which often suffer from convergence problems caused by their nonconvex cost functions. The proposed method is an iterative algorithm, called the SubGradient-based Blind Algorithm (SGBA), for both real and complex constellations, with a very simple update rule. It is based on the minimization of the ℓ∞-norm of the equalizer output under a linear constraint on the equalizer coefficients, using subgradient iterations. The algorithm has good convergence behavior, attributed to the convex ℓ∞ cost surface as well as the step-size selection rules associated with the subgradient search. We illustrate the performance of the algorithm using examples with both complex and real constellations, where we show that the proposed algorithm's convergence is less sensitive to the initial point selection, and fast convergence can be achieved with a judicious selection of step sizes. Furthermore, the amount of data required for training the equalizer is significantly lower than for most existing schemes.
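The core iteration, minimizing an ℓ∞-norm under a linear constraint by subgradient steps, can be sketched generically; the matrix, the constraint, and the diminishing 1/k steps below are illustrative stand-ins, not the paper's equalizer model or its step-size rules.

```python
import numpy as np

def linf_subgradient(A, c, w0, iters=100):
    """Minimize f(w) = ||A w||_inf subject to c.w = 1 by projected subgradient
    descent: follow a subgradient of f, then project exactly back onto the
    constraint hyperplane."""
    w = np.array(w0, dtype=float)
    cc = float(np.dot(c, c))
    w += (1.0 - np.dot(c, w)) / cc * c       # make the start feasible
    best = float(np.abs(A @ w).max())
    for k in range(1, iters + 1):
        y = A @ w
        j = int(np.argmax(np.abs(y)))
        g = np.sign(y[j]) * A[j]             # a subgradient of ||A w||_inf
        w = w - g / k                        # diminishing step 1/k
        w += (1.0 - np.dot(c, w)) / cc * c   # projection onto {w : c.w = 1}
        best = min(best, float(np.abs(A @ w).max()))
    return best
```

Because the cost is convex and the constraint set is a hyperplane with a closed-form projection, the best value found converges to the constrained minimum; this is the structural property the abstract credits for the method's robustness to initialization.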
SHARP: A Scalable Framework for Dynamic Joint Replica Placement and Request Routing Scheduling
Abstract

Cited by 1 (1 self)
PRR) scheduling in content delivery networks. After grouping similar proxies and modeling them by a single section, we propose a hierarchical scheduling framework to greatly reduce the dimensions of the mathematical formulation. In every phase the obtained shaped formulation has an easily solvable form, and the complete optimization process is highly scalable. To verify the scalability and effectiveness of our approach, SHARP is evaluated with comprehensive experimental settings derived from realistic data and topology of an operational commercial CDN.
Exact and Heuristic Approaches for Assignment in MultipleContainer Packing
, 1997
Abstract

Cited by 1 (0 self)
This paper deals with cutting/packing problems in which there is a set of pieces to be allocated and arranged in a set of "containers." In an apparel manufacturing application, the containers might be unused areas of the fabric after large pieces have been placed, and the pieces of interest might be the smaller pieces. In a sheet metal application, the containers could be the sheets themselves, and the pieces the entire set of pieces to be arranged. The specific problem addressed takes as input a set of groups (of pieces), and mappings from pieces to groups and groups to containers. The method by which the groups are generated and the particular geometric constraints (e.g., translation only, or translation plus rotation) are not critical for the methods developed here. This paper presents an integer programming formulation of the multiple-container group assignment problem (MCGAP). Based on long and/or highly variable solution times for some problem instances, a Lagrangian heuristic pro...