Results 1–10 of 1,013
Invitation to Fixed-Parameter Algorithms
, 2002
"... Contents 1. Foundations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1 1.1 Keep the Parameter Fixed . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1 1.2 Preliminaries and Agreements . . . . . . . . . . . . . . . . . . . . . . . . . . . . ..."
Abstract

Cited by 302 (69 self)
 Add to MetaCart
Contents:
1. Foundations
   1.1 Keep the Parameter Fixed
   1.2 Preliminaries and Agreements
   1.3 Parameterized Complexity: A Brief Overview
       1.3.1 Basic Theory
       1.3.2 Interpreting Fixed-Parameter Tractability
   1.4 Vertex Cover: An Illustrative Example
       1.4.1 Parameterize
       1.4.2 Specialize
       1.4.3 Generalize
       1.4.4 Count or Enumerate ...
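The running example named in Section 1.4 is the textbook illustration of fixed-parameter tractability: Vertex Cover is solvable in O(2^k · |E|) time when asking for a cover of size at most k. Below is a minimal Python sketch of that standard bounded search tree (an illustration written for this listing, not code from the book):

```python
def vertex_cover(edges, k):
    """Return a vertex cover of size <= k as a set, or None if none exists.

    Classic bounded search tree: every edge (u, v) forces u or v into the
    cover, so branching on both choices gives a tree of depth k and size
    O(2^k); the exponential cost depends only on the parameter k.
    """
    uncovered = list(edges)
    if not uncovered:
        return set()              # every edge is already covered
    if k == 0:
        return None               # edges remain but the budget is spent
    u, v = uncovered[0]
    for w in (u, v):              # branch: put u, then v, into the cover
        rest = [(a, b) for (a, b) in uncovered if a != w and b != w]
        sub = vertex_cover(rest, k - 1)
        if sub is not None:
            return sub | {w}
    return None

# A path on four vertices needs two cover vertices, and one is not enough.
path = [(1, 2), (2, 3), (3, 4)]
assert vertex_cover(path, 1) is None
assert len(vertex_cover(path, 2)) <= 2
```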
Expander Flows, Geometric Embeddings and Graph Partitioning
In 36th Annual Symposium on the Theory of Computing
, 2004
"... We give a O( log n)approximation algorithm for sparsest cut, balanced separator, and graph conductance problems. This improves the O(log n)approximation of Leighton and Rao (1988). We use a wellknown semidefinite relaxation with triangle inequality constraints. Central to our analysis is a ..."
Abstract

Cited by 242 (18 self)
 Add to MetaCart
We give an O(√log n)-approximation algorithm for the sparsest cut, balanced separator, and graph conductance problems. This improves the O(log n)-approximation of Leighton and Rao (1988). We use a well-known semidefinite relaxation with triangle inequality constraints. Central to our analysis is a geometric theorem about projections of point sets in ℝ^d, whose proof makes essential use of a phenomenon called measure concentration.
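The "well-known semidefinite relaxation" is not written out in the excerpt. A standard unit-vector form of it, with the ℓ₂² triangle inequality constraints the abstract mentions, looks as follows (a sketch; the paper's exact normalization may differ):

```latex
\begin{align*}
\min\quad & \sum_{(i,j) \in E} \|v_i - v_j\|^2 \\
\text{s.t.}\quad & \|v_i\|^2 = 1 \quad \forall i \in V, \\
& \|v_i - v_j\|^2 + \|v_j - v_k\|^2 \ge \|v_i - v_k\|^2 \quad \forall i, j, k \in V, \\
& \textstyle\sum_{i < j} \|v_i - v_j\|^2 \ge c\, n^2 \quad \text{(spreading constraint, fixed } c > 0\text{)}.
\end{align*}
```

The triangle inequalities force the squared distances to form a metric of negative type, which is what the projection argument exploits.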
Complexity and Approximation
, 1999
"... Abstract. In this survey the following model is considered. We assume that an instance I of a computationally hard optimization problem has been solved and that we know the optimum solution of such instance. Then a new instance I ′ is proposed, obtained by means of a slight perturbation of instance ..."
Abstract

Cited by 180 (1 self)
 Add to MetaCart
(Show Context)
In this survey the following model is considered. We assume that an instance I of a computationally hard optimization problem has been solved and that we know an optimal solution of that instance. Then a new instance I′ is proposed, obtained by means of a slight perturbation of instance I. How can we exploit the knowledge we have of the solution of instance I to compute an (approximate) solution of instance I′ efficiently? This computation model is called reoptimization and is of practical interest in various circumstances. In this article we first discuss what kind of performance we can expect for specific classes of problems, and then we present some classical optimization problems (e.g., Max Knapsack, Min Steiner Tree, Scheduling) in which this approach has been fruitfully applied. Subsequently, we address vehicle routing problems and show how the reoptimization approach can be used to obtain good approximate solutions efficiently for some of these problems.
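To make the model concrete, here is one of the simplest reoptimization arguments, for Max Knapsack when the perturbation is the arrival of a single new item (our illustrative example, not taken from the survey):

```python
def reopt_knapsack_new_item(old_solution, old_value, new_item, capacity):
    """Reoptimize Max Knapsack after one new item arrives (capacity unchanged).

    old_solution: items (weight, value) forming an optimum of the old instance.
    new_item: the arriving (weight, value) pair.
    Returns a feasible solution of the new instance worth at least half
    the new optimum, in O(1) time.
    """
    w_new, v_new = new_item
    # Candidate 1: the old optimum, still feasible since capacity is unchanged.
    # Candidate 2: the new item on its own, if it fits.
    if w_new <= capacity and v_new > old_value:
        return [new_item], v_new
    return old_solution, old_value
```

The guarantee is elementary: the new optimum either omits the new item (so it is worth at most old_value) or contains it (so it is worth at most old_value + v_new, since its other items form a feasible old solution); the better of the two candidates therefore achieves at least half of it.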
Winner determination in combinatorial auction generalizations
, 2002
"... Combinatorial markets where bids can be submitted on bundles of items can be economically desirable coordination mechanisms in multiagent systems where the items exhibit complementarity and substitutability. There has been a surge of recent research on winner determination in combinatorial auctions. ..."
Abstract

Cited by 165 (24 self)
 Add to MetaCart
Combinatorial markets where bids can be submitted on bundles of items can be economically desirable coordination mechanisms in multiagent systems where the items exhibit complementarity and substitutability. There has been a surge of recent research on winner determination in combinatorial auctions. In this paper we study a wider range of combinatorial market designs: auctions, reverse auctions, and exchanges, with one or multiple units of each item, with and without free disposal. We first theoretically characterize the complexity. The most interesting results are that reverse auctions with free disposal can be approximated, and that in all of the cases without free disposal, even finding a feasible solution is NP-complete. We then ran experiments on known benchmarks as well as ones we introduce, to study the complexity of the market variants in practice. Cases with free disposal tended to be easier than ones without. On many distributions, reverse auctions with free disposal were easier than auctions with free disposal, as the approximability results would suggest, but interestingly, on one of the most realistic distributions they were harder. Single-unit exchanges were easy, but multi-unit exchanges were extremely hard.
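For the single-unit auction case with free disposal, winner determination is weighted set packing: accept pairwise item-disjoint bids of maximum total price. A small exhaustive-search sketch (illustrative only; the paper's algorithms and market variants go far beyond this):

```python
from itertools import combinations

def winner_determination(bids):
    """bids: list of (frozenset_of_items, price) pairs.
    Returns (accepted_bids, revenue) maximizing revenue over sets of
    pairwise item-disjoint bids. Exhaustive search: exponential time.
    """
    best, best_rev = [], 0
    for r in range(1, len(bids) + 1):
        for combo in combinations(bids, r):
            bundles = [items for items, _ in combo]
            # Feasible iff no item is sold twice; unsold items are fine
            # because of free disposal.
            if sum(len(b) for b in bundles) == len(frozenset().union(*bundles)):
                revenue = sum(price for _, price in combo)
                if revenue > best_rev:
                    best, best_rev = list(combo), revenue
    return best, best_rev

bids = [(frozenset({"a", "b"}), 5), (frozenset({"b", "c"}), 4), (frozenset({"c"}), 3)]
# Accepting the first and third bids is optimal: revenue 8.
print(winner_determination(bids)[1])
```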
Computational Complexity: A Modern Approach
, 2009
"... Not to be reproduced or distributed without the authors ’ permissioniiTo our wives — Silvia and RavitivAbout this book Computational complexity theory has developed rapidly in the past three decades. The list of surprising and fundamental results proved since 1990 alone could fill a book: these incl ..."
Abstract

Cited by 161 (2 self)
 Add to MetaCart
Computational complexity theory has developed rapidly in the past three decades. The list of surprising and fundamental results proved since 1990 alone could fill a book: these include new probabilistic definitions of classical complexity classes (IP = PSPACE and the PCP Theorems) and their implications for the field of approximation algorithms; Shor's algorithm to factor integers using a quantum computer; an understanding of why current approaches to the famous P versus NP question will not be successful; a theory of derandomization and pseudorandomness based upon computational hardness; and beautiful constructions of pseudorandom objects such as extractors and expanders. This book aims to describe such recent achievements of complexity theory in the context of more classical results. It is intended to serve both as a textbook and as a reference for self-study. This means it must simultaneously cater to many audiences, and it is carefully designed with that goal in mind. We assume essentially no computational background and very minimal mathematical background, which we review in Appendix A. We have also provided a web site for this book at ...
Efficient Algorithms for Online Decision Problems
In J. Comput. Syst. Sci.
, 2003
"... In an online decision problem, one makes a sequence of decisions without knowledge of the future. Tools from learning such as Weighted Majority and its many variants [13, 18, 4] demonstrate that online algorithms can perform nearly as well as the best single decision chosen in hindsight, even when t ..."
Abstract

Cited by 138 (3 self)
 Add to MetaCart
In an online decision problem, one makes a sequence of decisions without knowledge of the future. Tools from learning theory such as Weighted Majority and its many variants [13, 18, 4] demonstrate that online algorithms can perform nearly as well as the best single decision chosen in hindsight, even when there are exponentially many possible decisions. However, the naive application of these algorithms is inefficient for such large problems. For some problems with nice structure, specialized efficient solutions have been developed [10, 16, 17, 6, 3].
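A minimal sketch of the multiplicative-weights scheme behind Weighted Majority (our illustration; the paper's contribution is precisely how to avoid this naive form when the decision set is exponentially large):

```python
import random

def multiplicative_weights(losses, eta=0.1):
    """losses: one list per round giving each expert's loss in [0, 1].
    Plays an expert at random in proportion to its weight each round and
    returns the algorithm's cumulative loss. Work per round is O(N) for
    N experts, which is the inefficiency the abstract points out.
    """
    n = len(losses[0])
    weights = [1.0] * n
    total = 0.0
    for round_losses in losses:
        expert = random.choices(range(n), weights=weights)[0]
        total += round_losses[expert]
        # Multiplicatively penalize every expert by its loss this round.
        weights = [w * (1.0 - eta * loss) for w, loss in zip(weights, round_losses)]
    return total
```

Its standard guarantee is that expected loss is within roughly a (1 + eta) factor plus (ln N)/eta of the best single expert in hindsight.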
The Primal-Dual Method for Approximation Algorithms and Its Application to Network Design Problems
"... The primaldual method is a standard tool in the design of algorithms for combinatorial optimization problems. This chapter shows how the primaldual method can be modified to provide good approximation algorithms for a wide variety of NPhard problems. We concentrate on results from recent researc ..."
Abstract

Cited by 124 (7 self)
 Add to MetaCart
The primal-dual method is a standard tool in the design of algorithms for combinatorial optimization problems. This chapter shows how the primal-dual method can be modified to provide good approximation algorithms for a wide variety of NP-hard problems. We concentrate on results from recent research applying the primal-dual method to problems in network design.
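The method is easiest to see on weighted Vertex Cover rather than the chapter's network-design problems, so the following 2-approximation is a stand-in sketch: raise the dual variable of an uncovered edge until an endpoint's constraint goes tight, then buy that endpoint.

```python
def primal_dual_vertex_cover(edges, weight):
    """edges: list of (u, v) pairs; weight: dict vertex -> nonnegative cost.
    Returns a vertex cover of cost at most twice the optimum.
    """
    residual = dict(weight)   # remaining slack of each vertex's dual constraint
    cover = set()
    for u, v in edges:
        if u in cover or v in cover:
            continue          # edge already covered; leave its dual at 0
        # Raise this edge's dual variable until an endpoint becomes tight.
        y = min(residual[u], residual[v])
        residual[u] -= y
        residual[v] -= y
        for w in (u, v):      # buy every endpoint that went tight
            if residual[w] == 0:
                cover.add(w)
    return cover

graph = [(1, 2), (2, 3), (3, 4)]
costs = {1: 1.0, 2: 3.0, 3: 1.0, 4: 2.0}
print(primal_dual_vertex_cover(graph, costs))  # {1, 3}: cost 2, optimal here
```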
The communication requirements of efficient allocations and supporting prices
In Journal of Economic Theory
, 2006
"... We show that any communication finding a Pareto efficient allocation in a privateinformation economy must also discover supporting Lindahl prices. In particular, efficient allocation of L indivisible objects requires naming a price for each of the 2 L ¡1 bundles. Furthermore, exponential communicat ..."
Abstract

Cited by 124 (16 self)
 Add to MetaCart
We show that any communication finding a Pareto efficient allocation in a private-information economy must also discover supporting Lindahl prices. In particular, efficient allocation of L indivisible objects requires naming a price for each of the 2^L − 1 bundles. Furthermore, communication exponential in L is needed just to ensure a higher share of surplus than that realized by auctioning all items as a bundle, or even a higher expected surplus (for some probability distribution over valuations). When the valuations are submodular, efficiency still requires exponential communication (and fully polynomial approximation is impossible). When the objects are homogeneous, arbitrarily good approximation is obtained using exponentially less communication than that needed for exact efficiency.
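The exponential count is just the number of nonempty bundles; a worked instance (ours, for scale):

```latex
\#\{\, S \subseteq \{1, \dots, L\} : S \neq \emptyset \,\} = 2^L - 1,
\qquad \text{e.g. } L = 30 \;\Rightarrow\; 2^{30} - 1 = 1{,}073{,}741{,}823 \text{ prices}.
```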