Random knapsack in expected polynomial time
 In Proc. 35th Annual ACM Symposium on Theory of Computing (STOC 2003)
, 2003
Abstract

Cited by 48 (10 self)
We present the first average-case analysis proving a polynomial upper bound on the expected running time of an exact algorithm for the 0/1 knapsack problem. In particular, we prove, for various input distributions, that the number of Pareto-optimal knapsack fillings is polynomially bounded in the number of available items. An algorithm by Nemhauser and Ullmann can enumerate these solutions very efficiently, so that a polynomial upper bound on the number of Pareto-optimal solutions implies an algorithm with expected polynomial running time. The random input model underlying our analysis is quite general and not restricted to a particular input distribution. We assume adversarial weights and randomly drawn profits (or vice versa). Our analysis covers general probability distributions with finite mean and, in its most general form, can even handle different probability distributions for the profits of different items. This feature enables us to study the effects of correlations between profits and weights. Our analysis confirms and explains practical studies showing that so-called strongly correlated instances are harder to solve than weakly correlated ones.
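The Nemhauser-Ullmann enumeration the abstract refers to can be sketched as a list merge: maintain the Pareto-optimal (weight, profit) pairs and fold in one item at a time. This is a minimal illustrative implementation (names are my own, not the authors' code):

```python
def pareto_knapsack(items):
    """Enumerate Pareto-optimal (weight, profit) knapsack fillings,
    folding in one item at a time (Nemhauser-Ullmann style list merge)."""
    pareto = [(0, 0)]                      # the empty filling
    for w, p in items:
        # every old filling, with the new item added
        shifted = [(pw + w, pp + p) for pw, pp in pareto]
        # sort by weight, breaking ties by higher profit first
        merged = sorted(pareto + shifted, key=lambda t: (t[0], -t[1]))
        pareto, best = [], -1
        for pw, pp in merged:
            if pp > best:                  # dominated points are skipped
                pareto.append((pw, pp))
                best = pp
    return pareto
```

The running time per item is linear in the current list length, so a polynomial bound on the number of Pareto-optimal fillings directly yields a polynomial-time exact algorithm, as the abstract argues.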
A Cutting Plane Algorithm for Multicommodity Survivable Network Design Problems
, 1995
Abstract

Cited by 48 (0 self)
We present a cutting plane algorithm for solving the following network design problem in telecommunications: given point-to-point traffic demands in a network, specified survivability requirements and a discrete cost/capacity function for each link, find minimum-cost capacity expansions satisfying the given demands. The algorithm is based on the polyhedral study in the accompanying paper [16]. We describe the underlying problem, the model, and the main ingredients of our algorithm: initial formulation, feasibility test, separation for strong cutting planes, and primal heuristics. Computational results for a set of real-world problems are reported.
A Polyhedral Approach to Multicommodity Survivable Network Design
 Numerische Mathematik
, 1993
Abstract

Cited by 48 (0 self)
The design of cost-efficient networks satisfying certain survivability
Complexity of scheduling multiprocessor tasks with prespecified processor allocations
 Discrete Applied Mathematics
, 1994
A Dynamic Programming Approach for Consistency and Propagation for Knapsack Constraints
 Annals of Operations Research
, 2001
Abstract

Cited by 46 (0 self)
Knapsack constraints are a key modeling structure in constraint programming. These constraints are normally handled with simple bounding arguments. We propose a dynamic programming structure to represent these constraints. With this structure, we show how to achieve hyper-arc consistency, to determine infeasibility before all variables are set, to generate all solutions quickly, and to update the structure after domain reduction. Preliminary testing on a difficult set of multiple knapsack instances shows significant reduction in branching, though an effective implementation is needed in order to reduce computation time. Keywords: global constraints, dynamic programming, knapsack constraints.
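The dynamic-programming idea behind such filtering can be illustrated with reachable-sum sets: a value survives exactly when some complete assignment containing it satisfies the constraint. This is a hedged sketch for a binary knapsack constraint with assumed names, not the paper's actual data structure:

```python
def filter_knapsack(weights, domains, lo, hi):
    """Filter binary domains for the constraint lo <= sum(w_i * x_i) <= hi.
    A value survives iff it appears in some satisfying assignment
    (hyper-arc consistency). Illustrative sketch; domains are subsets of {0, 1}."""
    n = len(weights)
    prefix = [set() for _ in range(n + 1)]   # sums reachable over x_0..x_{i-1}
    prefix[0] = {0}
    for i in range(n):
        prefix[i + 1] = {s + v * weights[i] for s in prefix[i] for v in domains[i]}
    suffix = [set() for _ in range(n + 1)]   # sums reachable over x_i..x_{n-1}
    suffix[n] = {0}
    for i in range(n - 1, -1, -1):
        suffix[i] = {v * weights[i] + t for v in domains[i] for t in suffix[i + 1]}
    # keep a value only if some prefix + value + suffix lands in [lo, hi]
    return [{v for v in domains[i]
             if any(lo <= s + v * weights[i] + t <= hi
                    for s in prefix[i] for t in suffix[i + 1])}
            for i in range(n)]
```

For example, with weights (3, 4, 5) and the requirement that the total equal 7 exactly, only the filling 3 + 4 works, so the filter fixes all three variables at once, long before search would.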
A multiobjective evolutionary algorithm based on decomposition
 IEEE Transactions on Evolutionary Computation, Accepted
, 2007
Abstract

Cited by 45 (15 self)
Decomposition is a basic strategy in traditional multiobjective optimization. However, this strategy has not yet been widely used in multiobjective evolutionary optimization. This paper proposes a multiobjective evolutionary algorithm based on decomposition (MOEA/D). It decomposes a multiobjective optimization problem (MOP) into a number of scalar optimization subproblems and optimizes them simultaneously. Each subproblem is optimized using information from its several neighboring subproblems, which gives MOEA/D lower computational complexity at each generation than MOGLS and NSGA-II. Experimental results show that it outperforms or performs similarly to MOGLS and NSGA-II on multiobjective 0-1 knapsack problems and continuous multiobjective optimization problems. Index Terms: multiobjective optimization, decomposition, evolutionary algorithms, memetic algorithms, Pareto optimality, computational complexity.
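The decomposition mechanism the abstract describes can be sketched on a toy bi-objective problem, f1(x) = x^2 and f2(x) = (x - 2)^2 on [0, 2], using weighted-sum scalarization. All names and parameters here are illustrative assumptions, not the paper's setup or benchmark suite:

```python
import random

def moead_sketch(n_sub=10, n_gen=50, seed=0):
    """Minimal MOEA/D-style sketch: decompose a bi-objective problem into
    n_sub weighted-sum scalar subproblems and improve each one using a
    neighboring subproblem's current solution."""
    rng = random.Random(seed)
    # evenly spread weight vectors (lambda, 1 - lambda); requires n_sub >= 2
    lambdas = [(j / (n_sub - 1), 1 - j / (n_sub - 1)) for j in range(n_sub)]

    def g(x, lam):                         # scalarized subproblem objective
        return lam[0] * x * x + lam[1] * (x - 2) ** 2

    pop = [rng.uniform(0, 2) for _ in range(n_sub)]   # one solution per subproblem
    for _ in range(n_gen):
        for j in range(n_sub):
            # mate with an adjacent subproblem (adjacent weight vectors)
            k = max(0, min(n_sub - 1, j + rng.choice((-1, 1))))
            child = min(max((pop[j] + pop[k]) / 2 + rng.gauss(0, 0.1), 0.0), 2.0)
            # the child replaces any neighbor whose scalar objective it improves
            for i in (j, k):
                if g(child, lambdas[i]) < g(pop[i], lambdas[i]):
                    pop[i] = child
    return pop
```

Each generation only compares a child against a few neighboring subproblems, which is the source of the per-generation complexity advantage the abstract claims over population-wide ranking schemes.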
Robust submodular observation selection
, 2008
Abstract

Cited by 44 (4 self)
In many applications, one has to actively select among a set of expensive observations before making an informed decision. For example, in environmental monitoring, we want to select locations to measure in order to most effectively predict spatial phenomena. Often, we want to select observations which are robust against a number of possible objective functions. Examples include minimizing the maximum posterior variance in Gaussian Process regression, robust experimental design, and sensor placement for outbreak detection. In this paper, we present the Submodular Saturation algorithm, a simple and efficient algorithm with strong theoretical approximation guarantees for cases where the possible objective functions exhibit submodularity, an intuitive diminishing returns property. Moreover, we prove that better approximation algorithms do not exist unless NP-complete problems admit efficient algorithms. We show how our algorithm can be extended to handle complex cost functions (incorporating non-unit observation cost or communication and path costs). We also show how the algorithm can be used to near-optimally trade off expected-case (e.g., the Mean Square Prediction Error in Gaussian Process regression) and worst-case (e.g., maximum predictive variance) performance. We show that many important machine learning problems fit our robust submodular observation selection formalism, and provide extensive empirical evaluation on several real-world problems. For Gaussian Process regression, our algorithm compares favorably with state-of-the-art heuristics described in the geostatistics literature, while being simpler, faster and providing theoretical guarantees. For robust experimental design, our algorithm performs favorably compared to SDP-based algorithms.
An investigation of a hyperheuristic genetic algorithm applied to a trainer scheduling problem
 Proceedings of the Congress on Evolutionary Computation 2002, CEC 2002
, 2002
Abstract

Cited by 43 (12 self)
This paper investigates a genetic algorithm based hyperheuristic (hyper-GA) for scheduling geographically distributed training staff and courses. The aim of the hyper-GA is to evolve a good-quality heuristic for each given instance of the problem and use this to find a solution by applying a suitable ordering from a set of low-level heuristics. Since the user only supplies a number of low-level problem-specific heuristics and an evaluation function, the hyperheuristic can easily be reimplemented for a different type of problem, and we would expect it to be robust across a wide range of problem instances. We show that the problem can be solved successfully by a hyper-GA, presenting results for four versions of the hyper-GA as well as a range of simpler heuristics, applying them to five test data sets.
An Evolutionary Approach to Combinatorial Optimization Problems
 Proceedings of the 22nd Annual ACM Computer Science Conference
, 1994
Abstract

Cited by 43 (5 self)
The paper reports on the application of genetic algorithms, probabilistic search algorithms based on the model of organic evolution, to NP-complete combinatorial optimization problems. In particular, the subset sum, maximum cut, and minimum tardy task problems are considered. Except for the fitness function, no problem-specific changes of the genetic algorithm are required in order to achieve results of high quality even for the problem instances of size 100 used in the paper. For constrained problems, such as the subset sum and the minimum tardy task, the constraints are taken into account by incorporating a graded penalty term into the fitness function. Even for large instances of these highly multimodal optimization problems, an iterated application of the genetic algorithm is observed to find the global optimum within a number of runs. As the genetic algorithm samples only a tiny fraction of the search space, these results are quite encouraging.
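The graded-penalty scheme the abstract describes can be sketched for subset sum: infeasible individuals are scored by how far they overshoot the target, feasible ones by their (scaled) distance below it, so any feasible individual outranks any infeasible one. This is a minimal illustrative GA under assumed parameters, not the paper's exact operators:

```python
import random

def subset_sum_ga(weights, target, pop_size=40, gens=200, seed=1):
    """Toy GA with a graded penalty term for subset sum (illustrative sketch).
    Returns the best bitstring found and its achieved sum."""
    rng = random.Random(seed)
    n = len(weights)

    def fitness(bits):
        s = sum(w for w, b in zip(weights, bits) if b)
        # graded penalty: overshoot is penalized by its size; feasible sums
        # are scored by scaled distance to the target (always > -1)
        return -(s - target) if s > target else -(target - s) / (target + 1)

    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]     # elitist truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n)      # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(n)] ^= 1   # point mutation
            children.append(child)
        pop = parents + children
    best = max(pop, key=fitness)
    return best, sum(w for w, b in zip(weights, best) if b)
```

Because the penalty is graded rather than a flat rejection, the GA can move through infeasible regions while selection still pulls the population back toward feasible, near-target sums.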
Optimized Tensor-Product Approximation Spaces
Abstract

Cited by 43 (15 self)
This paper is concerned with the construction of optimized grids and approximation spaces for elliptic differential and integral equations. The main result is the analysis of the approximation of the embedding of the intersection of classes of functions with bounded mixed derivatives in standard Sobolev spaces. Based on the framework of tensor-product biorthogonal wavelet bases and stable subspace splittings, the problem is reduced to diagonal mappings between Hilbert sequence spaces. We construct operator adapted finite-element subspaces with a lower dimension than the standard full-grid spaces. These new approximation spaces preserve the approximation order of the standard full-grid spaces, provided that certain additional regularity assumptions are fulfilled. The form of the approximation spaces is governed by the ratios of the smoothness exponents of the considered classes of functions. We show in which cases the so-called curse of dimensionality can be broken. The theory covers e...