Results 1–10 of 138
The NP-completeness column: an ongoing guide
Journal of Algorithms, 1985
Abstract

Cited by 196 (0 self)
This is the nineteenth edition of a (usually) quarterly column that covers new developments in the theory of NP-completeness. The presentation is modeled on that used by M. R. Garey and myself in our book "Computers and Intractability: A Guide to the Theory of NP-Completeness," W. H. Freeman & Co., New York, 1979 (hereinafter referred to as "[G&J]"; previous columns will be referred to by their dates). A background equivalent to that provided by [G&J] is assumed, and, when appropriate, cross-references will be given to that book and the list of problems (NP-complete and harder) presented there. Readers who have results they would like mentioned (NP-hardness, PSPACE-hardness, polynomial-time solvability, etc.) or open problems they would like publicized, should ...
Hippodrome: Running Circles around Storage Administration
In Proceedings of the Conference on File and Storage Technologies, 2002
Abstract

Cited by 133 (10 self)
Enterprise-scale computer storage systems are extremely difficult to manage due to their size and complexity. It is difficult to generate a good storage system design for a given workload and to correctly implement the selected design. Traditionally, initial system configuration is performed by administrators who are guided by rules of thumb. Unfortunately, this process involves trial and error, and as a result is tedious and error-prone. In this paper, we introduce Hippodrome, an approach to automating initial system configuration. Hippodrome is an iterative loop that analyzes an existing system to determine its requirements, creates a new storage system design to better meet these requirements, and migrates the existing system to the new design. In this paper, we show how Hippodrome automates initial system configuration.
THE PRIMAL-DUAL METHOD FOR APPROXIMATION ALGORITHMS AND ITS APPLICATION TO NETWORK DESIGN PROBLEMS
Abstract

Cited by 124 (7 self)
The primal-dual method is a standard tool in the design of algorithms for combinatorial optimization problems. This chapter shows how the primal-dual method can be modified to provide good approximation algorithms for a wide variety of NP-hard problems. We concentrate on results from recent research applying the primal-dual method to problems in network design.
On the Sum-of-Squares algorithms for bin packing
2006
Abstract

Cited by 107 (7 self)
In this article we present a theoretical analysis of the online Sum-of-Squares algorithm (SS) for bin packing along with several new variants. SS is applicable to any instance of bin packing in which the bin capacity B and item sizes s(a) are integral (or can be scaled to be so), and runs in time O(nB). It performs remarkably well from an average case point of view: for any discrete distribution in which the optimal expected waste is sublinear, SS also has sublinear expected waste. For any discrete distribution where the optimal expected waste is bounded, SS has expected waste at most O(log n). We also discuss several interesting variants on SS, including a randomized O(nB log B)-time online algorithm SS* whose expected behavior is essentially optimal for all discrete distributions. Algorithm SS* depends on a new linear-programming-based pseudo-polynomial-time algorithm for solving the ...
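The placement rule the abstract describes can be sketched directly: with N[h] counting the open bins at level h, SS puts each item wherever the resulting sum of N[h]^2 is smallest, possibly opening a new bin. A minimal Python sketch (the O(nB) bookkeeping and the SS* variant are omitted, and the function name is illustrative):

```python
from collections import Counter

def ss_pack(items, B):
    """Minimal sketch of the online Sum-of-Squares (SS) rule.
    N[h] = number of open bins whose current level is h (0 < h < B).
    Each item is placed so that the resulting sum_h N[h]^2 is minimised."""
    N = Counter()
    bins_closed = 0

    def delta(h_from, h_to):
        # change in sum_h N[h]^2 if one bin moves from level h_from to h_to
        d = 0
        if h_from > 0:
            d += (N[h_from] - 1) ** 2 - N[h_from] ** 2
        if h_to < B:                      # a bin filled to B closes, leaving N
            d += (N[h_to] + 1) ** 2 - N[h_to] ** 2
        return d

    for s in items:
        # candidates: every feasible open level, plus a fresh bin (level 0)
        candidates = [h for h in N if N[h] > 0 and h + s <= B] + [0]
        h = min(candidates, key=lambda h: delta(h, h + s))
        if h > 0:
            N[h] -= 1
        if h + s == B:
            bins_closed += 1              # bin is exactly full: close it
        else:
            N[h + s] += 1
    return bins_closed + sum(N.values())  # total bins used
```

For example, with B = 5 and items 3, 3, 2, 2, SS prefers closing a size-3 bin with a size-2 item over opening a new bin, using two bins in total.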
Parameterized Complexity: A Framework for Systematically Confronting Computational Intractability
DIMACS Series in Discrete Mathematics and Theoretical Computer Science, 1997
Abstract

Cited by 68 (16 self)
In this paper we give a programmatic overview of parameterized computational complexity in the broad context of the problem of coping with computational intractability. We give some examples of how fixed-parameter tractability techniques can deliver practical algorithms in two different ways: (1) by providing useful exact algorithms for small parameter ranges, and (2) by providing guidance in the design of heuristic algorithms. In particular, we describe an improved FPT kernelization algorithm for Vertex Cover, a practical FPT algorithm for the Maximum Agreement Subtree (MAST) problem parameterized by the number of species to be deleted, and new general heuristics for these problems based on FPT techniques. In the course of making this overview, we also investigate some structural and hardness issues. We prove that an important naturally parameterized problem in artificial intelligence, STRIPS Planning (where the parameter is the size of the plan), is complete for W[1]. As a corollary, this implies that k-Step Reachability for Petri Nets is complete for W[1]. We describe how the concept of treewidth can be applied to STRIPS Planning and other problems of logic to obtain FPT results. We describe a surprising structural result concerning the top end of the parameterized complexity hierarchy: the naturally parameterized Graph k-Coloring problem cannot be resolved with respect to XP, either by showing membership in XP or by showing hardness for XP, without settling the P = NP question one way or the other.
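The kernelization idea behind the Vertex Cover result can be illustrated with the classical Buss-style kernel (a simpler predecessor of the improved algorithm the abstract mentions, shown here only as a hedged sketch): any vertex whose degree exceeds the remaining budget must be in the cover, and once all degrees are small, more than k^2 surviving edges rules out a size-k cover.

```python
def buss_kernel(edges, k):
    """Classical Buss-style kernelization sketch for Vertex Cover.
    Returns (reduced_edges, forced_cover_vertices, remaining_budget),
    or None when no vertex cover of size <= k can exist."""
    edges = {frozenset(e) for e in edges}
    forced = set()
    changed = True
    while changed:
        changed = False
        deg = {}
        for e in edges:
            for v in e:
                deg[v] = deg.get(v, 0) + 1
        for v, d in deg.items():
            if d > k - len(forced):
                # v covers more edges than the rest of the budget could,
                # so v belongs to every cover of size <= k
                forced.add(v)
                edges = {e for e in edges if v not in e}
                changed = True
                break
    if len(forced) > k:
        return None
    k_left = k - len(forced)
    # every remaining vertex has degree <= k_left, so a k_left-cover
    # covers at most k_left^2 edges: more edges means no solution
    if len(edges) > k_left * k_left:
        return None
    return edges, forced, k_left
```

On a star K_{1,5} with k = 1 the center is forced into the cover and the kernel is empty; on a triangle with k = 1 the rules certify infeasibility.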
On the Online Bin Packing Problem
Journal of the ACM, 2001
Abstract

Cited by 57 (3 self)
A new framework for analyzing online bin packing algorithms is presented. This framework presents a unified way of explaining the performance of algorithms based on the Harmonic approach [3, 5, 8, 10, 11, 12]. Within this framework, it is shown that a new algorithm, Harmonic++, has asymptotic performance ratio at most 1.58889. It is also shown that the analysis of Harmonic+1 presented in [11] is incorrect; this is a fundamental logical flaw, not an error in calculation or an omitted case. The asymptotic performance ratio of Harmonic+1 is at least 1.59217. Thus Harmonic++ provides the best upper bound for the online bin packing problem to date.
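For readers unfamiliar with the Harmonic approach the framework unifies: items are classified by size into intervals (1/(j+1), 1/j] and each class is packed separately, j items per class-j bin, with the smallest class packed greedily. The following is a minimal sketch of the basic Harmonic(M) scheme only; Harmonic++ refines the interval structure and combines classes in more sophisticated ways.

```python
def harmonic_pack(items, M=10):
    """Minimal sketch of the basic Harmonic(M) bin-packing scheme.
    An item of size s in (1/(j+1), 1/j] belongs to class j and is
    packed j per bin; class M (sizes <= 1/M) is packed by Next-fit.
    Item sizes are assumed to lie in (0, 1]."""
    open_slots = {}   # class j -> items already in its open bin
    nf_fill = None    # fill level of the open Next-fit bin (class M)
    bins = 0
    for s in items:
        j = min(int(1 // s), M)
        if j < M:
            used = open_slots.get(j, 0)
            if used == 0:
                bins += 1                    # open a fresh class-j bin
            open_slots[j] = (used + 1) % j   # bin closes after j items
        elif nf_fill is None or nf_fill + s > 1:
            bins += 1
            nf_fill = s                      # open a new Next-fit bin
        else:
            nf_fill += s
    return bins
```

The deliberate weakness visible here, that a class-1 item (size > 1/2) never shares a bin with a class-2 item, is exactly what the Harmonic++ analysis framework is designed to quantify and improve upon.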
Optimal Search and One-Way Trading Online Algorithms
Algorithmica, 2001
Abstract

Cited by 36 (0 self)
This paper is concerned with the time series search and one-way trading problems. In the (time series) search problem a player is searching for the maximum (or minimum) price in a sequence that unfolds sequentially, one price at a time. Once during this game the player can decide to accept the current price p, in which case the game ends and the player's payoff is p. In the one-way trading problem a trader is given the task of trading dollars to yen. Each day, a new exchange rate is announced and the trader must decide how many dollars to convert to yen according to the current rate. The game ends when the trader trades his entire dollar wealth to yen, and his payoff is the number of yen acquired. Search and one-way trading are intimately related: any (deterministic or randomized) one-way trading algorithm can be viewed as a randomized search algorithm. Using the competitive ratio as a performance measure we determine the optimal competitive performance for several variants of these problems. In particular, we show that a simple threat-based strategy is optimal and we determine its competitive ratio, which yields, for realistic values of the problem parameters, surprisingly low competitive ratios. We also consider and analyze a one-way trading game played against an adversary called Nature, where the online player knows the probability distribution of the maximum exchange rate and that distribution has been chosen by Nature. Finally, we consider some applications for a special case of portfolio selection called two-way trading, in which the trader may trade back and forth between cash and one asset.
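For intuition about the search problem, the simplest strategy from this literature is the deterministic reservation-price policy: when prices are known to lie in [m, M], accept the first price at or above sqrt(mM), which achieves competitive ratio sqrt(M/m). This baseline is included only for illustration; the paper's optimal threat-based strategy is more involved and achieves better ratios.

```python
import math

def reservation_price_search(prices, m, M):
    """Deterministic reservation-price policy for time-series search.
    Prices are assumed to lie in [m, M]; accept the first price
    >= sqrt(m * M), or the final price if forced to the end.
    This simple rule is sqrt(M/m)-competitive."""
    p_star = math.sqrt(m * M)
    for p in prices[:-1]:
        if p >= p_star:
            return p
    return prices[-1]   # game over: must accept the last price
```

With m = 1 and M = 16 the reservation price is 4, so the sequence 2, 5, 3 yields payoff 5, while 2, 3, 3 forces the player to settle for the final price of 3.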
Selecting RAID levels for disk arrays
2002
Abstract

Cited by 34 (7 self)
Disk arrays have a myriad of configuration parameters that interact in counterintuitive ways, and those interactions can have significant impacts on cost, performance, and reliability. Even after values for these parameters have been chosen, there are exponentially many ways to map data onto the disk arrays' logical units. Meanwhile, the importance of correct choices is increasing: storage systems represent a growing fraction of total system cost, they need to respond more rapidly to changing needs, and there is less and less tolerance for mistakes. We believe that automatic design and configuration of storage systems is the only viable solution to these issues. To that end, we present a comparative study of a range of techniques for programmatically choosing the RAID levels to use in a disk array. Our simplest approaches are modeled on existing, manual rules of thumb: they "tag" data with a RAID level before determining the configuration of the array to which it is assigned. Our best approach simultaneously determines the RAID levels for the data, the array configuration, and the layout of data on that array. It operates as an optimization process with the twin goals of minimizing array cost while ensuring that storage workload performance requirements will be met. This approach produces robust solutions with an average cost/performance 14–17% better than the best results for the tagging schemes, and up to 150–200% better than their worst solutions. We believe that this is the first presentation and systematic analysis of a variety of novel, fully automatic RAID level selection techniques.
Evolving Bin Packing Heuristics with Genetic Programming
Parallel Problem Solving from Nature - PPSN IX, Lecture Notes in Computer Science, Volume 4193, Springer-Verlag, Reykjavik, Iceland (2006) 860–869
Abstract

Cited by 32 (12 self)
The bin-packing problem is a well-known NP-hard optimisation problem, and, over the years, many heuristics have been developed to generate good quality solutions. This paper outlines a genetic programming system which evolves a heuristic that decides whether to put a piece in a bin when presented with the sum of the pieces already in the bin and the size of the piece that is about to be packed. This heuristic operates in a fixed framework that iterates through the open bins, applying the heuristic to each one, before deciding which bin to use. The best evolved programs emulate the functionality of the human-designed 'first-fit' heuristic. Thus, the contribution of this paper is to demonstrate that genetic programming can be employed to automatically evolve bin-packing heuristics that match high-quality heuristics designed by humans.
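The fixed framework and the human-designed rule the evolved programs rediscover can be sketched as follows; the function names and the unit bin capacity are illustrative assumptions, not taken from the paper:

```python
def first_fit_decision(bin_fill, piece_size, capacity=1.0):
    """The human-designed 'first-fit' rule expressed as a decision
    heuristic: accept the first open bin the piece fits into."""
    return bin_fill + piece_size <= capacity

def pack(pieces, decide=first_fit_decision, capacity=1.0):
    """Fixed framework: iterate through the open bins, apply the
    decision heuristic to each, and use the first bin it accepts;
    open a new bin only if every open bin is rejected."""
    fills = []                    # current fill level of each open bin
    for p in pieces:
        for i, f in enumerate(fills):
            if decide(f, p, capacity):
                fills[i] = f + p
                break
        else:
            fills.append(p)       # no bin accepted: open a new one
    return len(fills)
```

In the genetic programming system, `decide` is the evolved program: the framework stays fixed and only the accept/reject rule changes, which is why an evolved rule equivalent to `first_fit_decision` reproduces first-fit packing exactly.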
Best-Fit Bin-Packing with Random Order
In 7th Annual ACM-SIAM Symposium on Discrete Algorithms, 1997
Abstract

Cited by 32 (0 self)
Best-fit is the best known algorithm for online bin-packing, in the sense that no algorithm is known to behave better both in the worst case (when Best-fit has performance ratio 1.7) and in the average uniform case, with items drawn uniformly in the interval [0, 1] (then Best-fit has expected wasted space O(n^{1/2} (log n)^{3/4})). In practical applications, Best-fit appears to perform within a few percent of optimal. In this paper, in the spirit of previous work in computational geometry, we study the expected performance ratio, taking the worst-case multiset of items L, and assuming that the elements of L are inserted in random order, with all permutations equally likely. We show a lower bound of 1.08... and an upper bound of 1.5 on the random order performance ratio of Best-fit. The upper bound contrasts with the result that in the worst case, any (deterministic or randomized) online bin-packing algorithm has performance ratio at least 1.54...
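As a reference point for the analysis above, the Best-fit policy itself is simple to state: place each item in the feasible open bin with the least remaining room, opening a new bin only when nothing fits. A minimal sketch with an illustrative function name:

```python
def best_fit(items, capacity=1.0):
    """Minimal sketch of the Best-fit online bin-packing rule.
    Each item goes into the open bin with the smallest remaining
    gap that can still hold it (the 'tightest' feasible bin)."""
    gaps = []                     # remaining capacity of each open bin
    for s in items:
        feasible = [i for i, g in enumerate(gaps) if g >= s]
        if feasible:
            i = min(feasible, key=lambda i: gaps[i])  # tightest fit
            gaps[i] -= s
        else:
            gaps.append(capacity - s)                 # open a new bin
    return len(gaps)
```

The random-order question studied in the paper is exactly how much the bin count returned by this rule can exceed the optimum when the same multiset of items arrives in a uniformly random permutation.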