Results 1 - 10 of 16
A Minimal Algorithm for the Multiple-Choice Knapsack Problem.
 European Journal of Operational Research
, 1994
"... The MultipleChoice Knapsack Problem is defined as a 01 Knapsack Problem with the addition of disjoined multiplechoice constraints. As for other knapsack problems most of the computational effort in the solution of these problems is used for sorting and reduction. But although O(n) algorithms whic ..."
Abstract

Cited by 43 (4 self)
 Add to MetaCart
The Multiple-Choice Knapsack Problem is defined as a 0-1 Knapsack Problem with the addition of disjoined multiple-choice constraints. As for other knapsack problems, most of the computational effort in the solution of these problems is used for sorting and reduction. But although O(n) algorithms which solve the linear Multiple-Choice Knapsack Problem without sorting have been known for more than a decade, such techniques have not been used in enumerative algorithms.
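For illustration of the problem definition above (one item chosen per class, total weight at most c, total profit maximized), here is a minimal dynamic-programming reference solver. This is not the paper's minimal algorithm, which avoids sorting via O(n) linear-relaxation techniques; it is only a sketch of what MCKP asks for.

```python
# Illustrative dynamic program for the Multiple-Choice Knapsack Problem:
# pick exactly one item from each class so that total weight <= c and
# total profit is maximized.  NOT the paper's algorithm; a toy reference.

def mckp_dp(classes, c):
    """classes: list of classes, each a list of (profit, weight) pairs."""
    NEG = float("-inf")
    best = [NEG] * (c + 1)
    best[0] = 0                      # no class processed yet, zero weight
    for cls in classes:              # exactly one item taken per class
        nxt = [NEG] * (c + 1)
        for w in range(c + 1):
            if best[w] == NEG:
                continue
            for p_j, w_j in cls:
                if w + w_j <= c:
                    nxt[w + w_j] = max(nxt[w + w_j], best[w] + p_j)
        best = nxt
    return max(best)

# Two classes; the optimum picks (6, 3) from the first and (4, 2) from the second.
print(mckp_dp([[(6, 3), (5, 2)], [(4, 2), (7, 5)]], 5))  # -> 10
```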
A minimal algorithm for the 0-1 Knapsack Problem.
 Operations Research
, 1994
"... Although several large sized 01 Knapsack Problems (KP) may be easily solved, it is often the case that most of the computational eort is used for preprocessing, i.e. sorting and reduction. In order to avoid this problem it has been proposed to solve the socalled core of the problem: A Knapsack ..."
Abstract

Cited by 41 (10 self)
 Add to MetaCart
Although several large-sized 0-1 Knapsack Problems (KP) may be easily solved, it is often the case that most of the computational effort is used for preprocessing, i.e. sorting and reduction. In order to avoid this problem it has been proposed to solve the so-called core of the problem: a Knapsack Problem defined on a small subset of the variables. But the exact core cannot be identified without solving KP, so until now approximate core sizes had to be used.
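The core idea the excerpt describes can be sketched as follows: sort items by profit/weight efficiency, find the break item (the first item that no longer fits greedily), and restrict attention to a small window around it. The window size `delta` here is an assumed illustration parameter, echoing the excerpt's point that only approximate core sizes can be chosen in advance.

```python
# Sketch of the approximate-core idea for the 0-1 Knapsack Problem.
# The exact core is unknowable without solving KP, so a fixed-width
# window of items around the break item is used instead.

def approximate_core(profits, weights, c, delta=2):
    # Sort item indices by decreasing efficiency p_j / w_j.
    order = sorted(range(len(profits)),
                   key=lambda j: profits[j] / weights[j], reverse=True)
    residual = c
    b = len(order)                   # position of the break item
    for pos, j in enumerate(order):
        if weights[j] > residual:    # first item the greedy fill rejects
            b = pos
            break
        residual -= weights[j]
    lo, hi = max(0, b - delta), min(len(order), b + delta + 1)
    return [order[k] for k in range(lo, hi)]   # indices of core items

p = [10, 7, 6, 4, 3]
w = [5, 4, 4, 3, 3]
print(approximate_core(p, w, c=9, delta=1))  # -> [1, 2, 3]
```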
New Trends in Exact Algorithms for the 0-1 Knapsack Problem
, 1997
"... While the 1980s were focused on the solution of large sized "easy" knapsack problems, this decade has brought several new algorithms, which are able to solve "hard" large sized instances. We will give an overview of the recent techniques for solving hard knapsack problems, with special emphasis on t ..."
Abstract

Cited by 30 (0 self)
 Add to MetaCart
While the 1980s were focused on the solution of large-sized "easy" knapsack problems, this decade has brought several new algorithms which are able to solve "hard" large-sized instances. We will give an overview of the recent techniques for solving hard knapsack problems, with special emphasis on the addition of cardinality constraints, dynamic programming, and rudimentary divisibility. Computational results, comparing all recent algorithms, are presented.

1 Introduction
We consider the classical 0-1 Knapsack Problem (KP), where a subset of n given items has to be packed in a knapsack of capacity c. Each item has a profit p_j and a weight w_j, and the problem is to select a subset of the items whose total weight does not exceed c and whose total profit is a maximum. We assume, without loss of generality, that all input data are positive integers. Introducing the binary decision variables x_j, with x_j = 1 if item j is selected and x_j = 0 otherwise, we get the ILP model: maximize z =...
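The ILP model cut off at the end of the excerpt is the classical 0-1 Knapsack formulation; in standard notation, consistent with the symbols defined above, it reads:

```latex
\begin{align*}
\text{maximize}\quad & z = \sum_{j=1}^{n} p_j x_j \\
\text{subject to}\quad & \sum_{j=1}^{n} w_j x_j \le c, \\
& x_j \in \{0,1\}, \qquad j = 1,\dots,n.
\end{align*}
```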
Core problems in Knapsack Algorithms.
 Operations Research
, 1994
"... Since Balas and Zemel a dozen years ago introduced the socalled core problem as an efficient way of solving the Knapsack Problem, all the most successful algorithms have been based on this idea. Balas and Zemel proved, that there is a high probability for finding an optimal solution in the core, th ..."
Abstract

Cited by 12 (1 self)
 Add to MetaCart
Since Balas and Zemel a dozen years ago introduced the so-called core problem as an efficient way of solving the Knapsack Problem, all the most successful algorithms have been based on this idea. Balas and Zemel proved that there is a high probability of finding an optimal solution in the core, thus avoiding consideration of the remaining items. However, this paper demonstrates that even for randomly generated data instances the core problem may degenerate, making it difficult to obtain a reasonable solution. This behavior has not been noticed before due to inadequate testing, since the capacity is usually chosen such that the core problem becomes as easy as possible. A model for the expected hardness of a core problem as a function of the capacity is presented, and it is demonstrated that the hitherto applied test instances are among the easiest possible. As a consequence we propose a series of new randomly generated test instances, and show how recent algorithms behave when applied to these problems.
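Since the excerpt argues that hardness varies with the capacity, test series typically sweep the capacity across the total weight. A minimal sketch, assuming one common convention (test h of S uses c = h/(S+1) times the weight sum; the paper's exact scheme may differ):

```python
# Sketch: generate a series of S capacities spread over the weight sum,
# so a solver is exercised at many capacity ratios rather than the single
# "easiest possible" capacity.  The h/(S+1) convention is an assumption.

def capacity_series(weights, S):
    total = sum(weights)
    return [h * total // (S + 1) for h in range(1, S + 1)]

print(capacity_series([10, 10, 10, 10], 3))  # -> [10, 20, 30]
```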
The Core Concept for the Multidimensional Knapsack Problem
 In Evolutionary Computation in Combinatorial Optimization - EvoCOP 2006
, 2006
"... We present the newly developed core concept for the Multidimensional Knapsack Problem (MKP) which is an extension of the classical concept for the onedimensional case. The core for the multidimensional problem is defined in dependence of a chosen efficiency function of the items, since no singl ..."
Abstract

Cited by 10 (5 self)
 Add to MetaCart
We present the newly developed core concept for the Multidimensional Knapsack Problem (MKP), which is an extension of the classical concept for the one-dimensional case. The core for the multidimensional problem is defined in dependence on a chosen efficiency function of the items, since no single obvious efficiency measure is available for the MKP. An empirical study on the cores of widely-used benchmark instances is presented, as well as experiments with different approximate core sizes. Furthermore, we describe a memetic algorithm and a relaxation guided variable neighborhood search for the MKP, which are applied to the original and to the core problems. The experimental results show that, given a fixed runtime, the different metaheuristics as well as a general-purpose integer linear programming solver yield better solutions when applied to approximate core problems of fixed size.
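To illustrate why "no single obvious efficiency measure" exists for the MKP: each item j has one profit p_j but m weights w_ij, one per constraint. One simple candidate measure (an illustration, not necessarily the efficiency function the paper adopts) scales each weight by its constraint's capacity:

```python
# One possible MKP efficiency measure: profit divided by the sum of
# capacity-scaled weights.  Chosen here purely for illustration; many
# alternatives (e.g. dual-value weightings) exist.

def scaled_efficiency(p, W, c):
    """p: profits (length n); W: m x n weight matrix; c: m capacities."""
    m, n = len(W), len(p)
    return [p[j] / sum(W[i][j] / c[i] for i in range(m)) for j in range(n)]

p = [10, 6]
W = [[5, 2],   # weights in constraint 1
     [4, 4]]   # weights in constraint 2
c = [10, 8]
print(scaled_efficiency(p, W, c))  # item 0: 10 / (0.5 + 0.5) = 10.0
```

A core for the MKP can then be built around the break item of whichever efficiency ordering is chosen, mirroring the one-dimensional case.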
The Multidimensional Knapsack Problem: Structure and Algorithms
, 2007
"... We study the multidimensional knapsack problem, present some theoretical and empirical results about its structure, and evaluate different Integer Linear Programming (ILP) based, metaheuristic, and collaborative approaches for it. We start by considering the distances between optimal solutions to th ..."
Abstract

Cited by 8 (1 self)
 Add to MetaCart
We study the multidimensional knapsack problem (MKP), present some theoretical and empirical results about its structure, and evaluate different Integer Linear Programming (ILP) based, metaheuristic, and collaborative approaches for it. We start by considering the distances between optimal solutions to the LP relaxation and to the original problem, and then introduce a new core concept for the MKP, which we study extensively. The empirical analysis is then used to develop new concepts for solving the MKP using ILP-based and memetic algorithms. Different collaborative combinations of the presented methods are discussed and evaluated. Further computational experiments with longer runtimes are also performed in order to compare the solutions of our approaches to the best known solutions of another, so far leading, approach for common MKP benchmark instances. The extensive computational experiments show the effectiveness of the proposed methods, which yield highly competitive results in significantly shorter runtimes than previously described approaches.
Systematic Integration of Parameterized Local Search into Evolutionary Algorithms
 IEEE Transactions on Evolutionary Computation
, 2004
"... Applicationspecific, parameterized local search algorithms (PLSAs), in which optimization accuracy can be traded off with run time, arise naturally in many optimization contexts. We introduce a novel approach, called simulated heating, for systematically integrating parameterized local search into ..."
Abstract

Cited by 8 (0 self)
 Add to MetaCart
Application-specific, parameterized local search algorithms (PLSAs), in which optimization accuracy can be traded off against run time, arise naturally in many optimization contexts. We introduce a novel approach, called simulated heating, for systematically integrating parameterized local search into evolutionary algorithms (EAs). Using the framework of simulated heating, we investigate both static and dynamic strategies for systematically managing the trade-off between PLSA accuracy and optimization effort. Our goal is to achieve maximum solution quality within a fixed optimization time budget. We show that the simulated heating technique better utilizes the given optimization time resources than standard hybrid methods that employ fixed parameters, and that the technique is less sensitive to these parameter settings. We apply this framework to three different optimization problems, compare our results to the standard hybrid methods, and show quantitatively that careful management of this trade-off is necessary to achieve the full potential of an EA/PLSA combination.
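A minimal sketch of the heating idea described above, under illustrative assumptions (a toy continuous objective, iteration count as the PLSA accuracy parameter, and a linear heating schedule; none of these details are from the paper): the local search is cheap and coarse early in the EA run and is given more effort as the run progresses.

```python
# Toy sketch of "simulated heating": a hybrid EA whose embedded local
# search gets an increasing iteration budget over the generations.

import random

def local_search(x, objective, iterations):
    best = x
    for _ in range(iterations):          # 'iterations' = PLSA accuracy knob
        cand = [g + random.uniform(-0.1, 0.1) for g in best]
        if objective(cand) > objective(best):
            best = cand
    return best

def heated_ea(objective, dim=2, pop=8, generations=20, seed=0):
    random.seed(seed)
    population = [[random.uniform(-1, 1) for _ in range(dim)]
                  for _ in range(pop)]
    for g in range(generations):
        budget = 1 + g                   # dynamic heating schedule
        population = [local_search(x, objective, budget) for x in population]
        population.sort(key=objective, reverse=True)
        population = population[:pop // 2] * 2   # trivial truncation selection
    return max(population, key=objective)

# Maximize -|x|^2, so the optimum is the origin.
best = heated_ea(lambda x: -sum(v * v for v in x))
```

A static strategy would instead fix `budget` for the whole run; the excerpt's point is that managing this knob over time uses a fixed time budget better.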
Strongly Correlated Knapsack Problems Are Trivial to Solve
 Proceedings CO96, Imperial College of Science, Technology and Medicine, London 27-29
, 1996
"... We consider a variant of the 01 Knapsack Problem, where the pro t of each item corresponds to its weight plus a xed constant. These socalled Strongly Correlated Knapsack Problems have attained much interest due to their apparent hardness and wide applicability in several xedcharge problems. ..."
Abstract

Cited by 4 (2 self)
 Add to MetaCart
We consider a variant of the 0-1 Knapsack Problem where the profit of each item corresponds to its weight plus a fixed constant. These so-called Strongly Correlated Knapsack Problems have attracted much interest due to their apparent hardness and wide applicability in several fixed-charge problems.
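The definition above (profit = weight + fixed constant) makes such instances trivial to generate; a minimal sketch, with the constant K and the weight range chosen here purely for illustration:

```python
# Generate a Strongly Correlated Knapsack instance: p_j = w_j + K.

import random

def strongly_correlated_instance(n, K=10, w_max=100, seed=0):
    rng = random.Random(seed)
    weights = [rng.randint(1, w_max) for _ in range(n)]
    profits = [w + K for w in weights]     # profit = weight + fixed constant
    return profits, weights

p, w = strongly_correlated_instance(5, K=10)
assert all(pj == wj + 10 for pj, wj in zip(p, w))
```

The correlation is what makes these instances hard for core-based methods: all items have nearly the same efficiency, so sorting by efficiency barely discriminates.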
Combining (integer) linear programming techniques and metaheuristics for combinatorial optimization
 In Studies in Computational Intelligence
, 2008
"... Summary. Several different ways exist for approaching hard optimization problems. Mathematical programming techniques, including (integer) linear programming based methods, and metaheuristic approaches are two highly successful streams for combinatorial problems. These two have been established by d ..."
Abstract

Cited by 3 (2 self)
 Add to MetaCart
Summary. Several different ways exist for approaching hard optimization problems. Mathematical programming techniques, including (integer) linear programming based methods, and metaheuristic approaches are two highly successful streams for combinatorial problems. These two have been established by different communities more or less in isolation from each other. Only in recent years have a larger number of researchers recognized the advantages and huge potential of building hybrids of mathematical programming methods and metaheuristics. In fact, many problems can be solved much better in practice by exploiting synergies between these different approaches than by "pure" traditional algorithms. The crucial issue is how mathematical programming methods and metaheuristics should be combined to achieve those benefits. Many approaches have been proposed in the last few years. After giving a brief introduction to the basics of integer linear programming, this chapter surveys existing techniques for such combinations and classifies them into ten methodological categories.
An O(nr) algorithm for the Subset-sum Problem - and other balanced knapsack algorithms
, 1995
"... A new technique called balancing is presented for the solution of Knapsack Problems. ..."
Abstract

Cited by 2 (1 self)
 Add to MetaCart
A new technique called balancing is presented for the solution of Knapsack Problems.
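The balancing technique itself is not described in this excerpt. For context, a minimal sketch of the classical O(nc) dynamic program for the Subset-sum Problem, the baseline that balanced algorithms improve upon: find the largest total weight not exceeding the capacity c.

```python
# Classical O(nc) dynamic program for the Subset-sum Problem.
# reachable[s] becomes True once some subset of the items sums to s.

def subset_sum(weights, c):
    reachable = [True] + [False] * c
    for w in weights:
        for s in range(c, w - 1, -1):    # backwards: each item used once
            if reachable[s - w]:
                reachable[s] = True
    return max(s for s in range(c + 1) if reachable[s])

print(subset_sum([3, 5, 7], 11))  # -> 10 (3 + 7)
```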