Results 1–10 of 100
The traveling salesman problem
, 1994
Abstract

Cited by 130 (5 self)
This paper presents a self-contained introduction to algorithmic and computational aspects of the traveling salesman problem and of related problems, along with their theoretical prerequisites, as seen from the point of view of an operations researcher who wants to solve practical problem instances. Extensive computational results are reported for most of the algorithms described. Optimal solutions are reported for instances with sizes up to several thousand nodes, as well as heuristic solutions of provably very high quality for larger instances. This is a preliminary version of one of the chapters of the volume “Networks”.
2-Layer Straight-Line Crossing Minimization: Performance of Exact and Heuristic Algorithms
, 1997
Abstract

Cited by 76 (6 self)
We present algorithms for the two-layer straight-line crossing minimization problem that are able to compute exact optima. Our computational results lead us to the conclusion that there is no need for heuristics if one layer is fixed, even though the problem is NP-hard, and that for the general problem with two variable layers, true optima can be computed for sparse instances in which the smaller layer contains up to 15 nodes. For bigger instances, the iterated barycenter method turns out to be the method of choice among several popular heuristics whose performance we could assess by comparing their results to optimum solutions.
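The barycenter method mentioned above can be sketched in a few lines: with one layer fixed, each free node is placed according to the average position of its neighbors. This is a minimal illustrative sketch, not code from the paper; all names are hypothetical.

```python
def barycenter_order(free_layer, fixed_pos, adj):
    """Reorder the free layer by the mean position of each node's
    neighbors in the fixed layer (nodes without neighbors sort first)."""
    def barycenter(v):
        neighbors = adj.get(v, [])
        if not neighbors:
            return 0.0
        return sum(fixed_pos[u] for u in neighbors) / len(neighbors)
    return sorted(free_layer, key=barycenter)
```

The "iterated" variant alternates which of the two layers is held fixed until the crossing count stops improving.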
An Experimental Evaluation of a Scatter Search for the Linear Ordering Problem
, 1999
Abstract

Cited by 54 (19 self)
Scatter search is a population-based method that has recently been shown to yield promising outcomes for solving combinatorial and nonlinear optimization problems. Based on formulations originally proposed in the 1960s for combining decision rules and problem constraints, such as the surrogate constraint method, scatter search uses strategies for combining solution vectors that have proved effective in a variety of problem settings. In this paper, we present a scatter search implementation designed to find high-quality solutions for the NP-hard linear ordering problem, which has a significant number of applications in practice. The LOP, for example, is equivalent to the so-called triangulation problem for input–output tables in economics. Our implementation goes beyond a simple exercise in applying scatter search, by incorporating innovative mechanisms to combine solutions and to create a balance between quality and diversification in the reference set. We also use a tracking process that generates solution statistics disclosing the nature of combinations and the ranks of antecedent solutions that produced the best final solutions. Our extensive computational experiments with more than 300 instances establish the effectiveness of our procedure in relation to those previously identified to be best.
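The LOP referred to above asks for a simultaneous permutation of the rows and columns of a matrix that maximizes the sum of the entries above the diagonal. A minimal sketch of that objective, with a brute-force solver usable only for very small instances (the problem is NP-hard, which is why heuristics such as scatter search are needed); the function names are illustrative, not from the paper:

```python
from itertools import permutations

def lop_value(matrix, order):
    """Sum of entries that land above the diagonal when rows and
    columns are both arranged according to `order`."""
    n = len(order)
    return sum(matrix[order[i]][order[j]]
               for i in range(n) for j in range(i + 1, n))

def lop_brute_force(matrix):
    """Exhaustive search over all permutations; exponential in n."""
    return max(permutations(range(len(matrix))),
               key=lambda order: lop_value(matrix, order))
```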
Intensification and diversification with elite tabu search solutions for the linear ordering problem
 Computers and Operations Research
, 1999
Abstract

Cited by 46 (12 self)
In this paper, we develop a new heuristic procedure for the linear ordering problem (LOP). This NP-hard problem has a significant number of applications in practice. The LOP, for example, is equivalent to the so-called triangulation problem for input–output tables in economics. In this paper we concentrate on matrices that arise in the context of this real-world application. The proposed algorithm is based on the tabu search methodology and incorporates strategies for search intensification and diversification. For search intensification, we experiment with path relinking, a strategy proposed several years ago in connection with tabu search, which has rarely been used in actual implementations. Extensive computational experiments with input–output tables show that the proposed procedure outperforms the best heuristics reported in the literature. Furthermore, the experiments also show the merit of achieving a balance between intensification and diversification in the search.
Integrality gaps for Sherali–Adams relaxations
 In Proceedings of the Forty-First Annual ACM Symposium on Theory of Computing
, 2009
Abstract

Cited by 39 (2 self)
We prove strong lower bounds on integrality gaps of Sherali–Adams relaxations for MAX CUT, Vertex Cover, Sparsest Cut and other problems. Our constructions show gaps for Sherali–Adams relaxations that survive n^δ rounds of lift and project. For MAX CUT and Vertex Cover, these show that even n^δ rounds of Sherali–Adams do not yield a better than 2 − ε approximation. The main combinatorial challenge in constructing these gap examples is the construction of a fractional solution that is far from an integer solution, but yet admits consistent distributions of local solutions for all small subsets of variables. Satisfying this consistency requirement is one of the major hurdles to constructing Sherali–Adams gap examples. We present a modular recipe for achieving this, building on previous work on metrics with a local–global structure. We develop a conceptually simple geometric approach to constructing Sherali–Adams gap examples via constructions of consistent local SDP solutions. This geometric approach is surprisingly versatile. We construct Sherali–Adams gap examples for Unique Games based on our construction for MAX CUT together with a parallel-repetition-like procedure. This in turn allows us to obtain Sherali–Adams gap examples for any problem that has a Unique Games based hardness result (with some additional conditions on the reduction from Unique Games). Using this, we construct 2 − ε gap examples for Maximum Acyclic Subgraph that rule out any family of linear constraints with support at most n^δ.
Pueblo: A hybrid pseudo-Boolean SAT solver
 Journal on Satisfiability, Boolean Modeling and Computation
, 2006
Abstract

Cited by 36 (0 self)
This paper introduces a new hybrid method for efficiently integrating pseudo-Boolean (PB) constraints into generic SAT solvers in order to solve PB satisfiability and optimization problems. To achieve this, we adopt the cutting-plane technique to draw inferences among PB constraints and combine it with generic implication graph analysis for conflict-induced learning. Novel features of our approach include a lightweight and efficient hybrid learning and backjumping strategy for analyzing PB constraints and CNF clauses in order to simultaneously learn both a CNF clause and a PB constraint with minimum overhead and use both to determine the backtrack level. Several techniques for handling the original and learned PB constraints are introduced. Overall, our method benefits significantly from the pruning power of the learned PB constraints, while keeping the overhead of adding them into the problem low. In this paper, we also address two other methods for solving PB problems, namely Integer Linear Programming (ILP) and preprocessing to CNF SAT, and present a thorough comparison between them and our hybrid method. An experimental comparison of our method against other hybrid approaches is also presented. Additionally, we provide details of the MiniSAT-based implementation of our solver Pueblo to enable the reader to construct a similar one.
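The cutting-plane inferences referred to above operate on normalized PB constraints of the form sum(a_i · lit_i) ≥ b with non-negative coefficients. A hedged sketch of the two basic rules, addition and division with rounding up (this is a textbook illustration, not the solver's implementation; cancellation of opposite literals is omitted):

```python
import math

def add_pb(c1, b1, c2, b2):
    """Termwise addition of two normalized PB constraints
    sum(a * lit) >= b  (the cutting-plane 'addition' rule)."""
    coeffs = dict(c1)
    for lit, a in c2.items():
        coeffs[lit] = coeffs.get(lit, 0) + a
    return coeffs, b1 + b2

def divide_pb(coeffs, b, d):
    """Divide by d and round up; sound when all coefficients are
    non-negative (the cutting-plane 'division' rule)."""
    return ({lit: math.ceil(a / d) for lit, a in coeffs.items()},
            math.ceil(b / d))
```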
Maximum Planar Subgraphs and Nice Embeddings: Practical Layout Tools
 ALGORITHMICA
, 1996
Progress in linear programming-based algorithms for integer programming: An exposition
 INFORMS JOURNAL ON COMPUTING
, 2000
Abstract

Cited by 30 (0 self)
This paper is about modeling and solving mixed integer programming (MIP) problems. In the last decade, the use of mixed integer programming models has increased dramatically. Fifteen years ago, mainframe computers were required to solve problems with a hundred integer variables. Now it is possible to solve problems with thousands of integer variables on a personal computer and obtain provably good approximate solutions to problems such as set partitioning with millions of binary variables. These advances have been made possible by developments in modeling, algorithms, software, and hardware. This paper focuses on effective modeling, preprocessing, and the methodologies of branch-and-cut and branch-and-price, which are the techniques that make it possible to treat problems with either a very large number of constraints or a very large number of variables. We show how these techniques are useful
Solving Real-World Linear Ordering Problems . . .
, 1995
Abstract

Cited by 30 (8 self)
Cutting plane methods require the solution of a sequence of linear programs, where the solution to one provides a warm start to the next. A cutting plane algorithm for solving the linear ordering problem is described. This algorithm uses the primal-dual interior point method to solve the linear programming relaxations. A point which is a good warm start for a simplex-based cutting plane algorithm is generally not a good starting point for an interior point method. Techniques used to improve the warm start include attempting to identify cutting planes early and storing an old feasible point, which is used to help recenter when cutting planes are added. Computational results are described for some real-world problems; the algorithm appears to be competitive with a simplex-based cutting plane algorithm.
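The separation step inside such a cutting plane loop is often simple enumeration. For the linear ordering polytope, a standard class of cuts is the 3-dicycle inequalities x_ij + x_jk + x_ki ≤ 2; a hedged sketch of separating them against a fractional point (illustrative only, not the paper's code; `x` is assumed to be a dict keyed by ordered pairs):

```python
def separate_dicycles(x, n, tol=1e-6):
    """Enumerate the 3-dicycle inequalities x[i,j] + x[j,k] + x[k,i] <= 2
    and return the triples violated by the fractional point x."""
    cuts = []
    for i in range(n):
        for j in range(n):
            for k in range(n):
                if len({i, j, k}) == 3 and \
                        x[(i, j)] + x[(j, k)] + x[(k, i)] > 2 + tol:
                    cuts.append((i, j, k))
    return cuts
```

Each violated triple yields a cut to add before re-solving the relaxation from the warm start described above.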