Results 1–10 of 30
On problems without polynomial kernels
Lect. Notes Comput. Sci., 2007
Cited by 65 (9 self)
Abstract. Kernelization is a strong and widely-applied technique in parameterized complexity. In a nutshell, a kernelization algorithm, or simply a kernel, is a polynomial-time transformation that transforms any given parameterized instance to an equivalent instance of the same problem, with size and parameter bounded by a function of the parameter in the input. A kernel is polynomial if the size and parameter of the output are polynomially bounded by the parameter of the input. In this paper we develop a framework which allows showing that a wide range of FPT problems do not have polynomial kernels. Our evidence relies on hypotheses made in the classical world (i.e., non-parametric complexity), and revolves around a new type of algorithm for classical decision problems, called a distillation algorithm, which might be of independent interest. Using the notion of distillation algorithms, we develop a generic lower-bound engine which allows us to show that a variety of FPT problems, fulfilling certain criteria, cannot have polynomial kernels unless the polynomial hierarchy collapses. These problems include k-Path, k-Cycle, k-Exact Cycle, k-Short Cheap Tour, k-Graph Minor Order Test, k-Cutwidth, k-Search Number, k-Pathwidth, k-Treewidth, k-Branchwidth, and several optimization problems parameterized by treewidth or clique-width.
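To make the kernel notion concrete, here is a minimal sketch (not from the paper above) of the classic Buss kernelization for Vertex Cover, a textbook example of the kind of polynomial kernel whose existence the paper's framework rules out for problems such as k-Path:

```python
def vertex_cover_kernel(edges, k):
    """Reduce a Vertex Cover instance (edges, k) to an equivalent one
    with at most k*k edges, or answer it outright (returns False for
    a definite no-instance)."""
    edges = {frozenset(e) for e in edges}
    while k >= 0:
        # Rule: a vertex of degree > k must be in every size-k cover,
        # so take it into the cover and delete its incident edges.
        degree = {}
        for e in edges:
            for v in e:
                degree[v] = degree.get(v, 0) + 1
        high = next((v for v, d in degree.items() if d > k), None)
        if high is None:
            break
        edges = {e for e in edges if high not in e}
        k -= 1
    if k < 0:
        return False
    # Now every vertex has degree <= k, so a yes-instance retains at
    # most k*k edges (k cover vertices, each covering <= k edges).
    if len(edges) > k * k:
        return False
    return sorted(tuple(sorted(e)) for e in edges), k
```

The output instance is equivalent to the input and its size is bounded by a polynomial in k alone, which is exactly the definition of a polynomial kernel given in the abstract.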
A more effective linear kernelization for Cluster Editing
Theor. Comput. Sci., 2009
Cited by 30 (7 self)
Abstract. In the NP-hard Cluster Editing problem, we have as input an undirected graph G and an integer k ≥ 0. The question is whether we can transform G, by inserting and deleting at most k edges, into a cluster graph, that is, a union of disjoint cliques. We first confirm a conjecture by Michael Fellows [IWPEC 2006] that there is a polynomial-time kernelization for Cluster Editing that leads to a problem kernel with at most 6k vertices. More precisely, we present a cubic-time algorithm that, given a graph G and an integer k ≥ 0, finds a graph G′ and an integer k′ ≤ k such that G can be transformed into a cluster graph by at most k edge modifications iff G′ can be transformed into a cluster graph by at most k′ edge modifications, and the problem kernel G′ has at most 6k vertices. So far, only a problem kernel of 24k vertices was known. Second, we show that this bound on the number of vertices of G′ can be further improved to 4k. Finally, we consider the variant of Cluster Editing where the number of cliques that the cluster graph may contain is stipulated to be a constant d > 0. We present a simple kernelization for this variant leaving a problem kernel of at most (d + 2)k + d vertices.
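As a small illustration of the target structure (a standard characterization, not part of the kernelization above): a graph is a cluster graph exactly when it contains no induced path on three vertices, so the condition can be tested by looking for a vertex with two non-adjacent neighbors:

```python
from itertools import combinations

def is_cluster_graph(adj):
    """adj maps each vertex to the set of its neighbors.
    Returns True iff the graph is a disjoint union of cliques, i.e.
    it has no induced P3 (path u - v - w with u, w non-adjacent)."""
    for v, nbrs in adj.items():
        for u, w in combinations(nbrs, 2):
            if w not in adj[u]:   # u - v - w is an induced P3
                return False
    return True
```

Cluster Editing then asks whether at most k edge insertions and deletions suffice to make this test succeed.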
Parameterized complexity and approximation algorithms
Comput. J., 2006
Cited by 26 (1 self)
Approximation algorithms and parameterized complexity are usually considered to be two separate ways of dealing with hard algorithmic problems. In this paper, our aim is to investigate how these two fields can be combined to achieve better algorithms than either theory could offer on its own. We discuss the different ways in which parameterized complexity can be extended to approximation algorithms, survey results of this type, and propose directions for future research.
Incompressibility through Colors and IDs
Cited by 24 (5 self)
In parameterized complexity, each problem instance comes with a parameter k, and the parameterized problem is said to admit a polynomial kernel if there are polynomial-time preprocessing rules that reduce the input instance down to an instance with size polynomial in k. Many problems have been shown to admit polynomial kernels, but it is only recently that a framework for showing the non-existence of polynomial kernels for specific problems has been developed by Bodlaender et al. [6] and Fortnow and Santhanam [15]. With few exceptions, all known kernelization lower-bound results have been obtained by directly applying this framework. In this paper we show how to combine these results with combinatorial reductions which use colors and IDs in order to prove kernelization lower bounds for a variety of basic problems. Below we give a summary of our main results. All our results are under the assumption that the polynomial hierarchy does not collapse to the third level. • We show that the Steiner Tree problem parameterized by the number of terminals and solution size, and the Connected Vertex Cover and Capacitated Vertex Cover problems, do not admit a polynomial kernel. The latter two results are surprising because the closely related Vertex Cover problem admits a kernel of size 2k.
Bidimensionality and Kernels
2010
Cited by 21 (12 self)
Bidimensionality theory appears to be a powerful framework in the development of meta-algorithmic techniques. It was introduced by Demaine et al. [J. ACM 2005] as a tool to obtain subexponential-time parameterized algorithms for bidimensional problems on H-minor-free graphs. Demaine and Hajiaghayi [SODA 2005] extended the theory to obtain polynomial-time approximation schemes (PTASs) for bidimensional problems. In this paper, we establish a third meta-algorithmic direction for bidimensionality theory by relating it to the existence of linear kernels for parameterized problems. In parameterized complexity, each problem instance comes with a parameter k, and the parameterized problem is said to admit a linear kernel if there is a polynomial-time algorithm, called …
Experiments on Data Reduction for Optimal Domination in Networks
In Proceedings International Network Optimization Conference (INOC 2003), Evry/Paris, 2003
Cited by 18 (14 self)
We present empirical results on computing optimal dominating sets in networks by means of data reduction through preprocessing rules. In doing so, we demonstrate the practical usefulness of reduction techniques that had so far been considered only theoretically for solving one of the most important network problems in combinatorial optimization.
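A minimal sketch of one classical preprocessing rule for Dominating Set, given here for illustration and not necessarily among the rules evaluated in the paper: if an undominated vertex u has exactly one neighbor v, then N[u] ⊆ N[v], so taking v into the solution is always safe. Applied exhaustively, this already shrinks many instances:

```python
def degree_one_rule(adj):
    """Exhaustively apply the degree-1 reduction rule for Dominating
    Set. adj maps each vertex to the set of its neighbors. Returns
    (partial solution, remaining graph, dominated leftover vertices).
    Dominated vertices are kept in the graph because they may still
    serve as dominators for others."""
    adj = {v: set(n) for v, n in adj.items()}
    dominated = set()
    solution = set()
    while True:
        u = next((x for x, n in adj.items()
                  if len(n) == 1 and x not in dominated), None)
        if u is None:
            return solution, adj, dominated
        (v,) = adj[u]                # u's unique neighbor: take it
        solution.add(v)
        dominated |= adj[v] | {v}    # v dominates its closed nbhd
        for w in (u, v):             # u can only dominate {u, v} and
            for x in adj.pop(w):     # v is already chosen, so both
                if x in adj:         # may be deleted from the graph
                    adj[x].discard(w)
            dominated.discard(w)
```

Note that simply deleting v's whole closed neighborhood would be unsound, since dominated vertices can still dominate others; tracking them as "dominated but present" (an annotated instance) keeps the rule correct.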
Subexponential parameterized algorithms
 Computer Science Review
Cited by 18 (8 self)
We give a review of a series of techniques and results on the design of subexponential parameterized algorithms for graph problems. The design of such algorithms usually consists of two main steps: first, find a branch (or tree) decomposition of the input graph whose width is bounded by a sublinear function of the parameter and, second, use this decomposition to solve the problem in time that is single-exponential in this bound. The main tool for the first step is Bidimensionality Theory. Here we present the potential, but also the boundaries, of this theory. For the second step, we describe recent techniques relating the analysis of subexponential algorithms to combinatorial bounds involving Catalan numbers. As a result, we have 2^{O(√k)} · n^{O(1)}-time algorithms for a wide variety of parameterized problems on graphs, where n is the size of the graph and k is the parameter.
Linear problem kernels for NP-hard problems on planar graphs
In Proc. 34th ICALP, volume 4596 of LNCS, 2007
Cited by 15 (4 self)
Abstract. We develop a generic framework for deriving linear-size problem kernels for NP-hard problems on planar graphs. We demonstrate the usefulness of our framework in several concrete case studies, giving new kernelization results for Connected Vertex Cover, Minimum Edge Dominating Set, Maximum Triangle Packing, and Efficient Dominating Set on planar graphs. On the route to these results, we present effective, problem-specific data reduction rules that are useful in any approach attacking the computational intractability of these problems.
A general data reduction scheme for domination in graphs
In Proc. 32nd SOFSEM, volume 3831 of LNCS, 2006
Cited by 9 (3 self)
Abstract. Data reduction by polynomial-time preprocessing is a core concept of (parameterized) complexity analysis in solving NP-hard problems. Its practical usefulness is confirmed by experimental work. Here, generalizing and extending previous work, we present a set of data reduction preprocessing rules on the way to computing optimal dominating sets in graphs. In this way, we arrive at the novel notion of "data reduction schemes." In addition, we obtain data reduction results for domination in directed graphs that allow us to prove a linear-size problem kernel for Directed Dominating Set in planar graphs.