Results 21-30 of 667

A Polyhedral Approach to Multicommodity Survivable Network Design
Numerische Mathematik, 1993
Cited by 44 (0 self)
Abstract: The design of cost-efficient networks satisfying certain survivability …
An Evolutionary Approach to Combinatorial Optimization Problems
Proceedings of the 22nd Annual ACM Computer Science Conference, 1994
Cited by 42 (5 self)
Abstract: The paper reports on the application of genetic algorithms, probabilistic search algorithms based on the model of organic evolution, to NP-complete combinatorial optimization problems. In particular, the subset sum, maximum cut, and minimum tardy task problems are considered. Except for the fitness function, no problem-specific changes to the genetic algorithm are required in order to achieve results of high quality even for the problem instances of size 100 used in the paper. For constrained problems, such as the subset sum and the minimum tardy task, the constraints are taken into account by incorporating a graded penalty term into the fitness function. Even for large instances of these highly multimodal optimization problems, an iterated application of the genetic algorithm is observed to find the global optimum within a number of runs. As the genetic algorithm samples only a tiny fraction of the search space, these results are quite encouraging.
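The graded-penalty idea can be sketched for subset sum as follows. This is a minimal illustration, not the paper's actual operators or parameters: the penalty weight, elitist truncation selection, one-point crossover, and the tiny instance are all assumptions.

```python
import random

def fitness(bits, weights, target, penalty=10.0):
    # Graded penalty: feasible fillings score their total weight; infeasible
    # ones are penalized in proportion to how far they exceed the target.
    total = sum(w for w, b in zip(weights, bits) if b)
    if total <= target:
        return total
    return target - penalty * (total - target)

def ga_subset_sum(weights, target, pop_size=50, gens=200, seed=0):
    rng = random.Random(seed)
    n = len(weights)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        # Elitist truncation selection: keep the fitter half unchanged.
        pop.sort(key=lambda b: fitness(b, weights, target), reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)          # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(n)] ^= 1       # single-bit mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=lambda b: fitness(b, weights, target))

weights = [8, 3, 5, 7, 2, 9, 4]
best = ga_subset_sum(weights, target=15)
```

Only the fitness function encodes the constraint; the search machinery itself stays problem-independent, which is the point the abstract makes.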
Server-storage virtualization: Integration and load balancing in data centers
In Proceedings of IEEE/ACM Supercomputing, 2008
Cited by 41 (7 self)
Abstract: We describe the design of an agile data center with integrated server and storage virtualization technologies. Such data centers form a key building block for new cloud computing architectures. We also show how to leverage this integrated agility for non-disruptive load balancing in data centers across multiple resource layers: servers, switches, and storage. We propose a novel load balancing algorithm called VectorDot for handling the hierarchical and multidimensional resource constraints in such systems. The algorithm, inspired by the successful Toyoda method for multidimensional knapsacks, is the first of its kind. We evaluate our system on a range of synthetic and real data center testbeds comprising VMware ESX servers, IBM SAN Volume Controller, and Cisco and Brocade switches. Experiments under varied conditions demonstrate the end-to-end validity of our system and the ability of VectorDot to efficiently remove overloads on server, switch, and storage nodes.
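The abstract does not define VectorDot's score, so the following is only a hypothetical sketch of the general dot-product idea behind such schemes: match an item's fractional resource demands against each node's fractional loads so that heavy demands avoid heavily loaded dimensions. All names, the score, and the weighting are assumptions, not the paper's definition.

```python
def dot_score(item_usage, node_load, node_capacity):
    # Dot product of the item's fractional demand with the node's fractional
    # load across dimensions (CPU, memory, I/O, ...): the score is large
    # exactly when the item's heaviest demands hit the node's heaviest loads.
    return sum((u / c) * (l / c)
               for u, l, c in zip(item_usage, node_load, node_capacity))

def place(item_usage, nodes):
    # Choose the node with the lowest score among those the item fits on.
    feasible = [n for n in nodes
                if all(l + u <= c for u, l, c in
                       zip(item_usage, n["load"], n["capacity"]))]
    return min(feasible, key=lambda n: dot_score(
        item_usage, n["load"], n["capacity"]))

nodes = [
    {"name": "A", "load": [50, 10], "capacity": [100, 100]},  # CPU-loaded
    {"name": "B", "load": [10, 50], "capacity": [100, 100]},  # memory-loaded
]
chosen = place([30, 5], nodes)  # a CPU-heavy item lands on node B
```

A plain greedy "least loaded" rule looks at one dimension at a time; the dot product lets a single scalar trade off all dimensions at once, which is what multidimensional-knapsack-style methods need.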
Hyper-Heuristics: Learning to Combine Simple Heuristics in Bin-Packing Problems
Learning a Procedure That Can Solve Hard Bin-Packing Problems: A New GA-Based Approach to Hyper-Heuristics
2003
Cited by 41 (5 self)
Abstract: The idea underlying hyper-heuristics is to discover some combination of familiar, straightforward heuristics that performs very well across a whole range of problems. To be worthwhile, such a combination should outperform all of the constituent heuristics. In this paper we describe a novel messy-GA-based approach that learns such a heuristic combination for solving one-dimensional bin-packing problems. When applied to a large set of benchmark problems, the learned procedure finds an optimal solution for nearly 80% of them, and for the rest produces an answer very close to optimal. When compared with its own constituent heuristics, it ranks first in 98% of the problems.
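As a toy illustration of what a heuristic combination means here, one can switch between simple packing heuristics based on the current problem state. The state feature (fraction of items remaining) and the hand-written switching rule below are invented for illustration; they are not the messy GA's learned representation.

```python
def first_fit(item, bins, cap):
    # Place the item in the first open bin with room, else open a new bin.
    for b in bins:
        if sum(b) + item <= cap:
            b.append(item)
            return
    bins.append([item])

def best_fit(item, bins, cap):
    # Place the item in the fullest bin that still has room for it.
    feasible = [b for b in bins if sum(b) + item <= cap]
    if feasible:
        max(feasible, key=sum).append(item)
    else:
        bins.append([item])

def pack(items, cap, rule):
    # `rule` maps a coarse state feature to a constituent heuristic:
    # the kind of mapping a GA could learn and tune.
    bins = []
    n = len(items)
    for i, item in enumerate(items):
        heuristic = rule(1 - i / n)   # fraction of items still unpacked
        heuristic(item, bins, cap)
    return bins

rule = lambda frac_left: first_fit if frac_left > 0.5 else best_fit
bins = pack([6, 4, 5, 3, 2, 7, 1], cap=10, rule=rule)
```

The combination can beat each constituent because different heuristics are strongest in different regions of the packing process, which is the premise the abstract states.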
Random knapsack in expected polynomial time
In Proc. 35th Annual ACM Symposium on Theory of Computing (STOC 2003), 2003
Cited by 41 (10 self)
Abstract: We present the first average-case analysis proving a polynomial upper bound on the expected running time of an exact algorithm for the 0/1 knapsack problem. In particular, we prove, for various input distributions, that the number of Pareto-optimal knapsack fillings is polynomially bounded in the number of available items. An algorithm by Nemhauser and Ullmann can enumerate these solutions very efficiently, so that a polynomial upper bound on the number of Pareto-optimal solutions implies an algorithm with expected polynomial running time. The random input model underlying our analysis is quite general and not restricted to a particular input distribution. We assume adversarial weights and randomly drawn profits (or vice versa). Our analysis covers general probability distributions with finite mean and, in its most general form, can even handle different probability distributions for the profits of different items. This feature enables us to study the effects of correlations between profits and weights. Our analysis confirms and explains practical studies showing that so-called strongly correlated instances are harder to solve than weakly correlated ones.
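A standard sketch of the Nemhauser-Ullmann enumeration mentioned above: process the items one at a time, maintaining the list of Pareto-optimal (weight, profit) pairs over all subsets seen so far. A pair is Pareto-optimal if no other filling weighs no more while earning at least as much profit; the optimal profit for any capacity C is then the most profitable Pareto pair of weight at most C.

```python
def nemhauser_ullmann(items):
    # Maintain the Pareto-optimal (weight, profit) pairs over all subsets of
    # the items processed so far. Running time is linear in the total size of
    # the Pareto lists, which is what the paper bounds in expectation.
    pareto = [(0, 0)]  # the empty filling
    for w, p in items:
        # Fillings that include the new item: old list shifted by (w, p).
        shifted = [(wt + w, pf + p) for wt, pf in pareto]
        # Merge, sort by weight ascending (profit descending on ties), and
        # sweep, keeping only strict profit improvements: the survivors are
        # exactly the undominated pairs.
        merged = sorted(pareto + shifted, key=lambda t: (t[0], -t[1]))
        pareto, best = [], -1
        for wt, pf in merged:
            if pf > best:
                pareto.append((wt, pf))
                best = pf
    return pareto

pareto = nemhauser_ullmann([(3, 4), (2, 3), (4, 5)])  # (weight, profit) pairs
```

On this tiny instance every subset happens to be Pareto-optimal (8 pairs); the paper's point is that for random profits the expected list length stays polynomial even though it can be exponential in the worst case.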
Robust submodular observation selection
2008
Cited by 40 (3 self)
Abstract: In many applications, one has to actively select among a set of expensive observations before making an informed decision. For example, in environmental monitoring, we want to select locations to measure in order to most effectively predict spatial phenomena. Often, we want to select observations which are robust against a number of possible objective functions. Examples include minimizing the maximum posterior variance in Gaussian Process regression, robust experimental design, and sensor placement for outbreak detection. In this paper, we present the Submodular Saturation algorithm, a simple and efficient algorithm with strong theoretical approximation guarantees for cases where the possible objective functions exhibit submodularity, an intuitive diminishing-returns property. Moreover, we prove that better approximation algorithms do not exist unless NP-complete problems admit efficient algorithms. We show how our algorithm can be extended to handle complex cost functions (incorporating non-unit observation cost or communication and path costs). We also show how the algorithm can be used to near-optimally trade off expected-case (e.g., the mean square prediction error in Gaussian Process regression) and worst-case (e.g., maximum predictive variance) performance. We show that many important machine learning problems fit our robust submodular observation selection formalism, and provide extensive empirical evaluation on several real-world problems. For Gaussian Process regression, our algorithm compares favorably with state-of-the-art heuristics described in the geostatistics literature, while being simpler, faster, and providing theoretical guarantees. For robust experimental design, our algorithm performs favorably compared to SDP-based algorithms.
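The saturation idea can be sketched as follows. This is a simplification: a single greedy pass for one fixed saturation level c, omitting the algorithm's outer search over c, and the function and variable names are assumptions.

```python
def saturate_greedy(ground, objectives, k, c):
    # Cap every objective at the saturation level c, then greedily maximize
    # the average of the capped objectives. Once an objective reaches c it
    # contributes no further gain, so the greedy choices are steered toward
    # whichever objectives are still lagging: that is what makes the
    # selection robust against the worst of the objectives.
    def capped_avg(S):
        return sum(min(F(S), c) for F in objectives) / len(objectives)

    S = set()
    for _ in range(k):
        gains = {e: capped_avg(S | {e}) - capped_avg(S) for e in ground - S}
        best = max(gains, key=gains.get)
        if gains[best] <= 0:
            break  # every objective is already saturated at level c
        S.add(best)
    return S

# Two toy "objectives": coverage of two disjoint sensor groups. A plain
# greedy on the (uncapped) average could spend both picks on one group;
# the cap forces one pick into each.
F1 = lambda S: len(S & {1, 2})
F2 = lambda S: len(S & {3, 4})
chosen = saturate_greedy({1, 2, 3, 4}, [F1, F2], k=2, c=1)
```

Truncation at c preserves submodularity, which is why the greedy subroutine keeps its guarantees inside the full algorithm.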
Evolving Bin Packing Heuristics with Genetic Programming
Parallel Problem Solving from Nature (PPSN IX), Springer Lecture Notes in Computer Science, Vol. 4193, Reykjavik, Iceland, Springer-Verlag, 2006, 860-869
Cited by 40 (13 self)
Abstract: The bin-packing problem is a well-known NP-hard optimisation problem, and, over the years, many heuristics have been developed to generate good-quality solutions. This paper outlines a genetic programming system which evolves a heuristic that decides whether to put a piece in a bin when presented with the sum of the pieces already in the bin and the size of the piece that is about to be packed. This heuristic operates in a fixed framework that iterates through the open bins, applying the heuristic to each one, before deciding which bin to use. The best evolved programs emulate the functionality of the human-designed 'first-fit' heuristic. Thus, the contribution of this paper is to demonstrate that genetic programming can be employed to automatically evolve bin-packing heuristics that are the same as high-quality heuristics designed by humans.
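The fixed framework the abstract describes, with the evolved heuristic as a pluggable yes/no decision over (bin fill, piece size), might look like this sketch (a reading of the abstract, not the paper's code). Plugging in the rule "accept iff the piece fits" reproduces the human-designed first-fit heuristic that the best evolved programs were found to emulate.

```python
def pack(items, capacity, decide):
    # The fixed framework: for each piece, iterate through the open bins
    # asking decide(filled, piece_size); place the piece in the first bin
    # the heuristic accepts, or open a new bin if every decision is "no".
    bins = []
    for piece in items:
        for b in bins:
            if decide(sum(b), piece):
                b.append(piece)
                break
        else:
            bins.append([piece])
    return bins

def first_fit_rule(capacity):
    # Hand-written decision rule equivalent to first-fit: accept iff it fits.
    return lambda filled, piece: filled + piece <= capacity

bins = pack([4, 8, 1, 4, 2, 1], capacity=10, decide=first_fit_rule(10))
```

Because the evolved program only ever sees two scalars, the framework constrains the search space while still admitting first-fit as one of its reachable behaviours.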
A multiobjective evolutionary algorithm based on decomposition
IEEE Transactions on Evolutionary Computation (accepted), 2007
Cited by 40 (15 self)
Abstract: Decomposition is a basic strategy in traditional multiobjective optimization. However, this strategy has not yet been widely used in multiobjective evolutionary optimization. This paper proposes a multiobjective evolutionary algorithm based on decomposition (MOEA/D). It decomposes a MOP into a number of scalar optimization subproblems and optimizes them simultaneously. Each subproblem is optimized using information from its several neighboring subproblems, which gives MOEA/D lower computational complexity at each generation than MOGLS and NSGA-II. Experimental results show that it outperforms or performs similarly to MOGLS and NSGA-II on multiobjective 0-1 knapsack problems and continuous multiobjective optimization problems.
Index Terms: multiobjective optimization, decomposition, evolutionary algorithms, memetic algorithms, Pareto optimality, computational complexity.
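One common way to decompose a MOP into scalar subproblems, used by MOEA/D, is the Tchebycheff approach: each weight vector defines one subproblem that minimizes the weighted distance to the ideal point z*. The even weight spread and the example values below are illustrative.

```python
def tchebycheff(fx, lam, z_star):
    # g(x | lam, z*) = max_i lam_i * |f_i(x) - z*_i|: one scalar subproblem
    # per weight vector lam; minimizing g pushes f(x) toward the ideal point
    # along the direction that lam singles out.
    return max(l * abs(f - z) for f, l, z in zip(fx, lam, z_star))

# Decompose a bi-objective problem into N scalar subproblems via an even
# spread of weight vectors.
N = 5
weights = [(i / (N - 1), 1 - i / (N - 1)) for i in range(N)]

# In MOEA/D each subproblem keeps its own best solution and exchanges
# information only with subproblems whose weight vectors are nearby,
# which is what keeps the per-generation cost low.
g = tchebycheff((3.0, 1.0), weights[2], (0.0, 0.0))  # lam = (0.5, 0.5)
```

Solving all N subproblems simultaneously yields N points spread along the Pareto front, one per weight vector.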
Peer-to-Peer Data Trading to Preserve Information
ACM Transactions on Information Systems
Cited by 38 (10 self)
Abstract: Data archiving systems rely on replication to preserve information. This paper discusses how a network of autonomous archiving sites can trade data to achieve the most reliable replication. A series of binary trades among sites produces a peer-to-peer archiving network. Two trading algorithms are examined, one based on trading collections (even if they are different sizes) and another based on trading equal-sized blocks of space (which can then store collections). The concept of deeds is introduced; deeds track the blocks of space owned by one site at another. Policies for tuning these algorithms to provide the highest reliability, for example by changing the order in which sites are contacted and offered trades, are discussed. Finally, simulation results are presented that reveal which policies are best. The experiments indicate that a digital archive can achieve the best reliability by trading blocks of space (deeds), and that following certain policies will allow that site to maximize its reliability.
Categories and Subject Descriptors: H.3.7 [Information Storage and Retrieval]: Digital Libraries, systems issues; E.5 [Files]: Backup/recovery. General Terms: Design, Reliability. Additional Key Words and Phrases: data replication, fault tolerance, digital archiving, digital library, resource negotiation.