Bidimensionality and Kernels, 2010
Abstract

Cited by 61 (24 self)
Bidimensionality theory appears to be a powerful framework in the development of meta-algorithmic techniques. It was introduced by Demaine et al. [J. ACM 2005] as a tool to obtain subexponential time parameterized algorithms for bidimensional problems on H-minor-free graphs. Demaine and Hajiaghayi [SODA 2005] extended the theory to obtain polynomial time approximation schemes (PTASs) for bidimensional problems. In this paper, we establish a third meta-algorithmic direction for bidimensionality theory by relating it to the existence of linear kernels for parameterized problems. In parameterized complexity, each problem instance comes with a parameter k, and the parameterized problem is said to admit a linear kernel if there is a polynomial time algorithm, called
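The kernelization contract mentioned in this abstract can be made concrete on a classical textbook example. The sketch below is a hypothetical illustration of the generic contract only, using the well-known Buss reduction for Vertex Cover (which yields a quadratic, not linear, kernel; the paper's linear kernels for bidimensional problems are far more involved):

```python
# Sketch: Buss kernelization for Vertex Cover.  A kernel is a
# polynomial-time self-reduction (G, k) -> (G', k') such that the
# size of G' is bounded by a function of k alone; here |E(G')| <= k^2.

def buss_kernel(edges, k):
    """Return an equivalent reduced instance (edges', k'),
    or None if the instance is recognized as a no-instance."""
    edges = {frozenset(e) for e in edges}
    changed = True
    while changed:
        changed = False
        # Rule: a vertex of degree > k must be in any size-k cover,
        # so take it into the cover and delete its incident edges.
        deg = {}
        for e in edges:
            for v in e:
                deg[v] = deg.get(v, 0) + 1
        for v, d in deg.items():
            if d > k:
                edges = {e for e in edges if v not in e}
                k -= 1
                changed = True
                break
        if k < 0:
            return None
    # After exhaustive application, every vertex has degree <= k,
    # so a yes-instance has at most k^2 remaining edges.
    if len(edges) > k * k:
        return None
    return edges, k
```

The point is only the shape of the definition: polynomial-time preprocessing whose output size depends on k, not on n.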
Efficient exact algorithms on planar graphs: Exploiting sphere cut branch decompositions
In Proceedings of the 13th Annual European Symposium on Algorithms (ESA 2005), 2005
Abstract

Cited by 47 (18 self)
A divide-and-conquer strategy based on variations of the Lipton-Tarjan planar separator theorem has been one of the most common approaches for solving planar graph problems for more than 20 years. We present a new framework for designing fast subexponential exact and parameterized algorithms on planar graphs. Our approach is based on geometric properties of planar branch decompositions obtained by Seymour and Thomas, combined with refined techniques of dynamic programming on planar graphs based on properties of non-crossing partitions. Compared to divide-and-conquer algorithms, the main advantages of our method are (a) it is a generic method which allows us to attack broad classes of problems; (b) the obtained algorithms have better worst-case bounds. To exemplify our approach, we show how to obtain an O(2^{6.903√n}) time algorithm solving weighted Hamiltonian Cycle. We observe how our technique can be used to solve Planar Graph TSP in time O(2^{9.8594√n}). Our approach can be used to design parameterized algorithms as well. For example, we introduce the first 2^{O(√k)} n^{O(1)} time algorithm for parameterized Planar k-Cycle by showing that for a given k we can decide if a planar graph on n vertices has a cycle of length at least k in time O(2^{13.6√k} n + n^3).
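For contrast with the subexponential bounds in this abstract, the problem the k-Cycle algorithm solves can be stated as a naive exponential-time baseline. This is an illustrative sketch only, usable on tiny graphs, not anything from the paper:

```python
from itertools import permutations

def has_cycle_of_length_at_least(adj, k):
    """Brute force: does the graph (adjacency list of sets) contain
    a simple cycle of length >= k?  Exponential time, in contrast
    to the paper's O(2^{13.6 sqrt(k)} n + n^3) bound."""
    n = len(adj)
    for length in range(max(k, 3), n + 1):
        for perm in permutations(range(n), length):
            # Check consecutive edges plus the closing edge.
            if all(perm[i + 1] in adj[perm[i]] for i in range(length - 1)) \
                    and perm[0] in adj[perm[-1]]:
                return True
    return False
```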
Algorithmic Meta-Theorems
In M. Grohe and R. Niedermeier, eds., International Workshop on Parameterized and Exact Computation (IWPEC), volume 5018 of LNCS, 2008
Abstract

Cited by 22 (6 self)
Algorithmic meta-theorems are algorithmic results that apply to a whole range of problems, instead of addressing just one specific problem. Such theorems are often stated relative to a certain class of graphs, so the general form of a meta-theorem reads "every problem in a certain class C of problems can be solved efficiently on every graph satisfying a certain property P". A particularly well-known example of a meta-theorem is Courcelle's theorem that every decision problem definable in monadic second-order logic (MSO) can be decided in linear time on any class of graphs of bounded treewidth [1]. The class C of problems can be defined in a number of different ways. One option is to state combinatorial or algorithmic criteria for membership in C. For instance, Demaine, Hajiaghayi and Kawarabayashi [5] showed that every minimisation problem that can be solved efficiently on graph classes of bounded treewidth, and for which approximate solutions can be computed efficiently from solutions of certain subinstances, has a PTAS on any class of graphs excluding a fixed minor. While this gives a strong unifying explanation for PTASs of many
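A minimal concrete instance of "solvable in linear time on bounded treewidth" is dynamic programming on trees, which have treewidth 1. The sketch below (assumed input convention: adjacency lists of a tree rooted at vertex 0) computes a maximum independent set, one of the MSO-definable problems covered by Courcelle's theorem:

```python
# Sketch: linear-time maximum independent set on a tree, the
# treewidth-1 special case of the bounded-treewidth DP underlying
# meta-theorems such as Courcelle's.

def tree_mis(adj):
    n = len(adj)
    # in_[v] = best MIS size in v's subtree with v included,
    # out[v] = best MIS size in v's subtree with v excluded.
    in_, out = [1] * n, [0] * n
    order, parent, seen = [], [-1] * n, [False] * n
    stack = [0]
    while stack:                      # iterative DFS to get an order
        v = stack.pop()
        seen[v] = True
        order.append(v)
        for u in adj[v]:
            if not seen[u]:
                parent[u] = v
                stack.append(u)
    for v in reversed(order):         # combine children bottom-up
        p = parent[v]
        if p >= 0:
            in_[p] += out[v]          # p included: v must be excluded
            out[p] += max(in_[v], out[v])
    return max(in_[0], out[0])
```

Each vertex contributes a constant amount of work, hence linear time; the general bounded-treewidth DP replaces the two states per vertex with a bounded table per bag.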
Fast FAST
Abstract

Cited by 16 (7 self)
We present a randomized subexponential time, polynomial space parameterized algorithm for the k-Weighted Feedback Arc Set in Tournaments (k-FAST) problem. We also show that our algorithm can be derandomized by slightly increasing the running time. To derandomize our algorithm we construct a new kind of universal hash function, which we coin universal coloring families. For integers m, k and r, a family F of functions from [m] to [r] is called a universal (m, k, r)-coloring family if for any graph G on the vertex set [m] with at most k edges, there exists an f ∈ F which is a proper vertex coloring of G. Our algorithm is the first non-trivial subexponential time parameterized algorithm outside the framework of bidimensionality.
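The definition of a universal (m, k, r)-coloring family can be checked directly on small instances. The following brute-force sketch (illustration of the definition only; the paper constructs such families explicitly rather than testing them) enumerates every graph on [m] with at most k edges:

```python
from itertools import combinations

def is_proper(f, edges):
    """f is a tuple assigning a color to each vertex 0..m-1."""
    return all(f[u] != f[v] for u, v in edges)

def is_universal_coloring_family(family, m, k, r):
    """Brute-force check of the definition: for EVERY graph on
    vertex set [m] with at most k edges, SOME f in the family is a
    proper coloring of it.  Exponential in m; illustration only."""
    all_edges = list(combinations(range(m), 2))
    for t in range(k + 1):
        for edges in combinations(all_edges, t):
            if not any(is_proper(f, edges) for f in family):
                return False
    return True
```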
Bidimensionality and EPTAS
Abstract

Cited by 12 (5 self)
Bidimensionality theory appears to be a powerful framework for the development of meta-algorithmic techniques. It was introduced by Demaine et al. [J. ACM 2005] as a tool to obtain subexponential time parameterized algorithms for problems on H-minor-free graphs. Demaine and Hajiaghayi [SODA 2005] extended the theory to obtain polynomial time approximation schemes (PTASs) for bidimensional problems, and subsequently improved these results to EPTASs. Fomin et al. [SODA 2010] established a third meta-algorithmic direction for bidimensionality theory by relating it to the existence of linear kernels for parameterized problems. In this paper we revisit bidimensionality theory from the perspective of approximation algorithms and redesign the framework for obtaining EPTASs to be more powerful, easier to apply and easier to understand. One of the important conditions required in the framework developed by Demaine and Hajiaghayi [SODA 2005] is that to obtain an EPTAS for a graph optimization problem Π, we have to know a constant-factor approximation algorithm for Π. Our approach eliminates this strong requirement, which makes it amenable to more problems. At the heart of our framework is a decomposition lemma which states that for "most" bidimensional problems, there is a polynomial time algorithm which, given an H-minor-free graph G as input and an ε > 0, outputs a vertex set X of size ε · OPT such that the treewidth of G \ X is O(1/ε). Here, OPT is the objective function value of the problem in question. This allows us to obtain EPTASs on (apex-)minor-free graphs for all problems covered by the previous framework, as well as for a wide range of packing problems, partial covering problems and problems that are neither closed under taking minors nor under contractions. To the best of our knowledge, for many of these problems, including Cycle Packing, Vertex-H
Subexponential Algorithms for Partial Cover Problems
Abstract

Cited by 9 (3 self)
Partial Cover problems are optimization versions of fundamental and well-studied problems like Vertex Cover and Dominating Set. Here one is interested in covering (or dominating) the maximum number of edges (or vertices) using a given number (k) of vertices, rather than covering all edges (or vertices). In general graphs, these problems are hard for parameterized complexity classes when parameterized by k. It was recently shown by Amini et al. [FSTTCS 08] that Partial Vertex Cover and Partial Dominating Set are fixed parameter tractable on large classes of sparse graphs, namely H-minor-free graphs, which include planar graphs and graphs of bounded genus. In particular, it was shown that on planar graphs both problems can be solved in time 2^{O(k)} n^{O(1)}. During the last decade there has been an extensive study of parameterized subexponential algorithms. In particular, it was shown that the classical Vertex Cover and Dominating Set problems can be solved in subexponential time on H-minor-free graphs. The techniques developed to obtain subexponential algorithms for classical problems do not apply to partial cover problems. It was left as an open problem by Amini et al. [FSTTCS 08] whether there is a subexponential algorithm for Partial Vertex Cover and Partial Dominating Set. In this paper, we answer the question affirmatively by solving both problems in time 2^{O(√k)} n^{O(1)}, not only on planar graphs but also on much larger classes of graphs, namely, apex-minor-free graphs. Compared to previously known algorithms for these problems, our algorithms are significantly faster and simpler.
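The Partial Vertex Cover objective described here (cover as many edges as possible with k vertices) is easy to state as a brute-force baseline. This is a hypothetical sketch to pin down the problem, not the paper's algorithm, and it runs in n-choose-k time rather than 2^{O(√k)} n^{O(1)}:

```python
from itertools import combinations

def partial_vertex_cover(edges, k):
    """Maximum number of edges coverable by selecting k vertices.
    Naive enumeration of all k-subsets; illustration only."""
    vertices = {v for e in edges for v in e}
    best = 0
    for subset in combinations(vertices, min(k, len(vertices))):
        chosen = set(subset)
        covered = sum(1 for u, v in edges if u in chosen or v in chosen)
        best = max(best, covered)
    return best
```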
Contraction Bidimensionality: The Accurate Picture
In Proceedings of the 17th Annual European Symposium on Algorithms, Lecture Notes in Computer Science, 2009
Abstract

Cited by 9 (5 self)
We provide new combinatorial theorems on the structure of graphs that are contained as contractions in graphs of large treewidth. As a consequence of our combinatorial results we unify and significantly simplify contraction bidimensionality theory: the meta-algorithmic framework to design efficient parameterized and approximation algorithms for contraction-closed parameters.
Implicit Branching and Parameterized Partial Cover Problems
In Proc. of IARCS Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS), Leibniz International Proceedings in Informatics, Schloss Dagstuhl - Leibniz-Zentrum fuer Informatik
Abstract

Cited by 8 (1 self)
Covering problems are fundamental classical problems in optimization, computer science and complexity theory. Typically an input to these problems is a family of sets over a finite universe, and the goal is to cover the elements of the universe with as few sets of the family as possible. The variations of covering problems include well-known problems like Set Cover, Vertex Cover, Dominating Set and Facility Location, to name a few. Recently there has been a lot of study on partial covering problems, a natural generalization of covering problems. Here, the goal is not to cover all the elements but to cover a specified number of elements with the minimum number of sets. In this paper we study partial covering problems in graphs in the realm of parameterized complexity. Classical (non-partial) versions of all these problems have been intensively studied in planar graphs and in graphs excluding a fixed graph H as a minor. However, the techniques developed for the parameterized versions of non-partial covering problems cannot be applied directly to their partial counterparts. The approach we use to show that various partial covering problems are fixed parameter tractable on planar graphs, graphs of bounded local treewidth and graphs excluding some graph as a minor is quite different from previously known techniques. The main idea behind our approach is the concept of implicit branching. We find the implicit branching technique to be interesting in its own right and believe that it can be used for some other problems.
Beyond Bidimensionality: Parameterized Subexponential Algorithms on Directed Graphs
Abstract

Cited by 7 (6 self)
In 2000, Alber et al. [SWAT 2000] obtained the first parameterized subexponential algorithm on undirected planar graphs by showing that k-Dominating Set is solvable in time 2^{O(√k)}
Faster Parameterized Algorithms for Minor Containment
Abstract

Cited by 7 (5 self)
The theory of Graph Minors by Robertson and Seymour is one of the deepest and most significant theories in modern Combinatorics. This theory also has a strong impact on the recent development of Algorithms, and several areas, like Parameterized Complexity, have roots in Graph Minors. Until very recently it was a common belief that Graph Minors theory is mainly of theoretical importance. However, it appears that many deep results from Robertson and Seymour's theory can also be used in the design of practical algorithms. Minor containment testing is one of the most algorithmically important and technical parts of the theory, and minor containment in graphs of bounded branchwidth is the basic ingredient of this algorithm. In order to implement minor containment testing on graphs of bounded branchwidth, Hicks [NETWORKS 04] described an algorithm that, in time O(3^{k²} · (h + k − 1)! · m), decides if a graph G with m edges and branchwidth k contains a fixed graph H on h vertices as a minor. That algorithm follows the ideas introduced by Robertson and Seymour in [JCTSB 95]. In this work we improve the dependence on k of Hicks' result by showing that checking if H is a minor of G can be done in time O(2^{(2k+1)·log k} · h^{2k} · 2^{2h²} · m). Our approach is based on a combinatorial object called