Results 1–10 of 39
Kernelization: New Upper and Lower Bound Techniques
In Proc. of the 4th International Workshop on Parameterized and Exact Computation (IWPEC), volume 5917 of LNCS, 2009
Cited by 54 (0 self)
Abstract. In this survey, we look at kernelization: algorithms that transform in polynomial time an input to a problem to an equivalent input, whose size is bounded by a function of a parameter. Several results of recent research on kernelization are mentioned. This survey looks at some recent results where a general technique shows the existence of kernelization algorithms for large classes of problems, in particular for planar graphs and generalizations of planar graphs, and recent lower bound techniques that give evidence that certain types of kernelization algorithms do not exist.
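The kernelization notion defined above can be illustrated with a classic textbook example, Buss's kernel for k-Vertex Cover; this sketch is illustrative and not taken from the survey itself:

```python
def vc_kernelize(edges, k):
    """Buss's kernelization for k-Vertex Cover: a vertex of degree > k
    must be in any size-k cover, so take it and decrement k; once no
    such vertex remains, more than k^2 surviving edges means 'no'.
    Returns a reduced instance (kernel_edges, k') or None for 'no'."""
    edges = {frozenset(e) for e in edges}
    while True:
        if k < 0:
            return None                      # budget exhausted: no cover
        deg = {}
        for e in edges:
            for v in e:
                deg[v] = deg.get(v, 0) + 1
        heavy = next((v for v, d in deg.items() if d > k), None)
        if heavy is None:
            break
        edges = {e for e in edges if heavy not in e}
        k -= 1
    # A yes-instance with max degree <= k and budget k has <= k^2 edges.
    return (edges, k) if len(edges) <= k * k else None
```

The output size is bounded by a function of k alone (at most k² edges), which is exactly the guarantee the definition above asks for.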
Improving Exhaustive Search Implies Superpolynomial Lower Bounds
2009
Cited by 37 (7 self)
The P vs NP problem arose from the question of whether exhaustive search is necessary for problems with short verifiable solutions. We do not know if even a slight algorithmic improvement over exhaustive search is universally possible for all NP problems, and to date no major consequences have been derived from the assumption that an improvement exists. We show that there are natural NP and BPP problems for which minor algorithmic improvements over the trivial deterministic simulation already entail lower bounds such as NEXP ⊄ P/poly and LOGSPACE ≠ NP. These results are especially interesting given that similar improvements have been found for many other hard problems. Optimistically, one might hope our results suggest a new path to lower bounds; pessimistically, they show that carrying out the seemingly modest program of finding slightly better algorithms for all search problems may be extremely difficult (if not impossible). We also prove unconditional superpolynomial time-space lower bounds for improving on exhaustive search: there is a problem verifiable with k(n)-length witnesses in O(n^a) time (for some a and some function k(n) ≤ n) that cannot be solved in k(n)^c · n^{a+o(1)} time and k(n)^c · n^{o(1)} space, for every c ≥ 1. While such problems can always be solved by exhaustive search in O(2^{k(n)} · n^a) time and O(k(n) + n^a) space, we can prove a superpolynomial lower bound in the parameter k(n) when space usage is restricted.
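The "trivial deterministic simulation" baseline in this abstract is exhaustive search over all 2^{k(n)} candidate witnesses; a minimal sketch (the verifier and instance below are illustrative choices, not from the paper):

```python
from itertools import product

def exhaustive_search(verifier, k):
    """Brute force over all 2^k candidate witnesses: O(2^k) verifier
    calls and only O(k) extra space, matching the O(2^{k(n)} * n^a)
    time / O(k(n) + n^a) space baseline quoted in the abstract."""
    for bits in product([0, 1], repeat=k):
        if verifier(bits):
            return bits
    return None

# Toy verifiable problem: does some subset of nums sum to target?
nums, target = [3, 5, 7, 11], 15
witness = exhaustive_search(
    lambda w: sum(x for x, b in zip(nums, w) if b) == target, len(nums))
```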
On the possibility of faster SAT algorithms
Cited by 37 (3 self)
We describe reductions from the problem of determining the satisfiability of Boolean CNF formulas (CNF-SAT) to several natural algorithmic problems. We show that attaining any of the following bounds would improve the state of the art in algorithms for SAT:
• an O(n^{k−ε}) algorithm for k-Dominating Set, for any k ≥ 3,
• a (computationally efficient) protocol for 3-party set disjointness with o(m) bits of communication,
• an n^{o(d)} algorithm for d-SUM,
• an O(n^{2−ε}) algorithm for 2-SAT with m = n^{1+o(1)} clauses, where two clauses may have unrestricted length, and
• an O((n + m)^{k−ε}) algorithm for Horn-SAT with k unrestricted-length clauses.
One may interpret our reductions as new attacks on the complexity of SAT, or sharp lower bounds conditional on exponential hardness of SAT.
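For scale, the first bullet refers to beating the naive O(n^k · poly(n)) algorithm for k-Dominating Set, which simply tries every k-subset of vertices; a sketch of that baseline (the graph encoding is my own choice):

```python
from itertools import combinations

def has_dominating_set(adj, k):
    """Naive O(n^k) baseline: try every k-subset and check that every
    vertex is in the subset or adjacent to it. Per the paper, an
    O(n^{k-eps}) algorithm here would improve CNF-SAT algorithms."""
    vertices = list(adj)
    for cand in combinations(vertices, k):
        dominated = set(cand).union(*(adj[v] for v in cand))
        if len(dominated) == len(vertices):
            return True
    return False

# 4-cycle 0-1-2-3-0: no single vertex dominates, but many pairs do
cycle = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
```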
Known Algorithms on Graphs of Bounded Treewidth are Probably Optimal
2010
Cited by 19 (5 self)
We obtain a number of lower bounds on the running time of algorithms solving problems on graphs of bounded treewidth. We prove the results under the Strong Exponential Time Hypothesis of Impagliazzo and Paturi. In particular, assuming that SAT cannot be solved in (2−ε)^n · m^{O(1)} time, we show that for any ε > 0:
• INDEPENDENT SET cannot be solved in (2−ε)^{tw(G)} · |V(G)|^{O(1)} time,
• DOMINATING SET cannot be solved in (3−ε)^{tw(G)} · |V(G)|^{O(1)} time,
• MAX CUT cannot be solved in (2−ε)^{tw(G)} · |V(G)|^{O(1)} time,
• ODD CYCLE TRANSVERSAL cannot be solved in (3−ε)^{tw(G)} · |V(G)|^{O(1)} time,
• for any q ≥ 3, q-COLORING cannot be solved in (q−ε)^{tw(G)} · |V(G)|^{O(1)} time,
• PARTITION INTO TRIANGLES cannot be solved in (2−ε)^{tw(G)} · |V(G)|^{O(1)} time.
Our lower bounds match the running times of the best known algorithms for these problems, up to the ε in the base.
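The algorithms these bounds match are dynamic programs that keep one table entry per state of each bag of a tree decomposition. The two-state-per-vertex DP for INDEPENDENT SET on trees (treewidth 1) shows the pattern in miniature; this sketch is my own illustration, not the paper's algorithm:

```python
def max_independent_set_tree(children, root):
    """DP on a tree: for each vertex keep two values, the best solution
    size with v taken vs. skipped. The general tree-decomposition DP
    keeps 2^{tw} such states per bag, giving the (2)^{tw} * |V|^{O(1)}
    running time the paper proves optimal under SETH."""
    def solve(v):
        take, skip = 1, 0
        for c in children.get(v, []):
            c_take, c_skip = solve(c)
            take += c_skip                 # v taken: children excluded
            skip += max(c_take, c_skip)    # v skipped: children free
        return take, skip
    return max(solve(root))

# Path 0-1-2-3-4 rooted at 0: optimum {0, 2, 4} has size 3
path_children = {0: [1], 1: [2], 2: [3], 3: [4]}
```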
Average-case complexity of detecting cliques
2010
Cited by 12 (1 self)
The computational problem of testing whether a graph contains a complete subgraph of size k is among the most fundamental problems studied in theoretical computer science. This thesis is concerned with proving lower bounds for k-CLIQUE, as this problem is known. Our results show that, in certain models of computation, solving k-CLIQUE in the average case requires Ω(n^{k/4}) resources (moreover, k/4 is tight). Here the models of computation are bounded-depth Boolean circuits and unbounded-depth monotone circuits, the complexity measure is the number of gates, and the input distributions are random graphs with an appropriate density of edges. Such random graphs (the well-studied Erdős–Rényi random graphs) are widely believed to be a source of computationally hard instances for clique problems (as Karp suggested in 1976). Our results are the first unconditional lower bounds supporting this hypothesis. For bounded-depth Boolean circuits, our average-case hardness result significantly improves the previous worst-case lower bounds of Ω(n^{k/poly(d)}) for depth-d circuits. In particular, our lower bound of Ω(n^{k/4}) has no noticeable dependence on d for circuits of depth ...
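To make the setting concrete: the hard input distribution is the Erdős–Rényi graph G(n, p), and the algorithmic baseline is exhaustive n^{O(k)} search; a small sketch (parameter choices are illustrative):

```python
import random
from itertools import combinations

def has_k_clique(adj, k):
    """Exhaustive O(n^k) search for a k-clique; the thesis shows that
    even on average, circuits need roughly n^{k/4} size."""
    return any(all(u in adj[v] for u, v in combinations(cand, 2))
               for cand in combinations(list(adj), k))

def erdos_renyi(n, p, seed=0):
    """Erdos-Renyi G(n, p): each of the n*(n-1)/2 possible edges appears
    independently with probability p -- the distribution the thesis uses
    as a source of hard instances."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(n)}
    for u, v in combinations(range(n), 2):
        if rng.random() < p:
            adj[u].add(v)
            adj[v].add(u)
    return adj
```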
The union of minimal hitting sets: Parameterized combinatorial bounds and counting
In 24th Symposium on Theoretical Aspects of Computer Science (STACS 2007), LNCS 4393
Cited by 8 (2 self)
A k-hitting set in a hypergraph is a set of at most k vertices that intersects all hyperedges. We study the union of all inclusion-minimal k-hitting sets in hypergraphs of rank r (where the rank is the maximum size of hyperedges). We show that this union is relevant for certain combinatorial inference problems and give worst-case bounds on its size, depending on r and k. For r = 2 our result is tight, and for each r ≥ 3 we have an asymptotically optimal bound and make progress regarding the constant factor. The exact worst-case size for r ≥ 3 remains an open problem. We also propose an algorithm for counting all k-hitting sets in hypergraphs of rank r. Its asymptotic runtime matches the best one known for the much more special problem of finding one k-hitting set. The results are used for efficient counting of k-hitting sets that contain any particular vertex.
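The standard way to find one k-hitting set, which the counting algorithm's runtime is compared against, is a bounded search tree: pick an un-hit hyperedge and branch on each of its ≤ r vertices, for at most r^k leaves. A minimal sketch of that search (not the paper's counting algorithm):

```python
def hitting_set(edges, k):
    """Bounded search tree for k-hitting set: pick any un-hit hyperedge
    and branch on which of its <= r vertices joins the solution, giving
    at most r^k leaves on rank-r hypergraphs. Returns a hitting set of
    size <= k, or None if none exists."""
    if not edges:
        return set()          # everything already hit
    if k == 0:
        return None           # an edge remains but the budget is spent
    for v in edges[0]:
        sub = hitting_set([e for e in edges if v not in e], k - 1)
        if sub is not None:
            return sub | {v}
    return None
```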
Sub-exponential and FPT-time inapproximability of independent set and related problems
In IPEC, 2013
Sum-of-Squares Proofs and the Quest toward Optimal Algorithms
Cited by 6 (0 self)
Abstract. In order to obtain the best-known guarantees, algorithms are traditionally tailored to the particular problem we want to solve. Two recent developments, the Unique Games Conjecture (UGC) and the Sum-of-Squares (SOS) method, surprisingly suggest that this tailoring is not necessary and that a single efficient algorithm could achieve best possible guarantees for a wide range of different problems. The Unique Games Conjecture (UGC) is a tantalizing conjecture in computational complexity which, if true, will shed light on the complexity of a great many problems. In particular, this conjecture predicts that a single concrete algorithm provides optimal guarantees among all efficient algorithms for a large class of computational problems. The Sum-of-Squares (SOS) method is a general approach for solving systems of polynomial constraints. This approach is studied in several scientific disciplines, including real algebraic geometry, proof complexity, control theory, and mathematical programming, and has found applications in fields as diverse as quantum information theory, formal verification, game theory, and many others. We survey some connections that were recently uncovered between the Unique Games Conjecture and the Sum-of-Squares method. In particular, we discuss new tools to rigorously bound the running time of the SOS method for obtaining approximate solutions to hard optimization problems, and how these tools give the sum-of-squares method the potential to provide new guarantees for many problems of interest, and possibly even to refute the UGC.
Backdoors to acyclic SAT
In Proceedings of the 39th International Colloquium on Automata, Languages, and Programming (ICALP)
Cited by 6 (3 self)
Backdoor sets, a notion introduced by Williams et al. in 2003, are certain sets of key variables of a CNF formula F that make it easy to solve the formula; by assigning truth values to the variables in a backdoor set, the formula gets reduced to one or several polynomial-time solvable formulas. More specifically, a weak backdoor set of F is a set X of variables such that there exists a truth assignment τ to X that reduces F to a satisfiable formula F[τ] that belongs to a polynomial-time decidable base class C. A strong backdoor set is a set X of variables such that for all assignments τ to X, the reduced formula F[τ] belongs to C. We study the problem of finding backdoor sets of size at most k with respect to the base class of CNF formulas with acyclic incidence graphs, taking k as the parameter. We show that 1. the detection of weak backdoor sets is W[2]-hard in general but fixed-parameter tractable for r-CNF formulas, for any fixed r ≥ 3, and 2. the detection of strong backdoor sets is fixed-parameter approximable. Result 1 is the first positive one for a base class that does not have a characterization with obstructions of bounded size. Result 2 is the first positive one for a base class for which strong ...
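The definitions are easy to state as a brute-force check over the 2^|X| assignments (exponential only in |X|, not in the formula size); a sketch using an illustrative base class, 2-CNF, rather than the paper's acyclic-incidence-graph class:

```python
from itertools import product

def reduce_cnf(clauses, assignment):
    """Apply a partial assignment: literals are nonzero ints (-v means
    'not v'); satisfied clauses are dropped, falsified literals removed."""
    out = []
    for clause in clauses:
        if any(assignment.get(abs(l)) == (l > 0) for l in clause):
            continue  # clause satisfied by the assignment
        out.append([l for l in clause if abs(l) not in assignment])
    return out

def is_strong_backdoor(clauses, X, in_base_class):
    """X is a strong backdoor iff EVERY assignment to X lands the
    reduced formula in the base class; a weak backdoor only needs SOME
    assignment whose reduction is in the class and satisfiable."""
    return all(in_base_class(reduce_cnf(clauses, dict(zip(X, bits))))
               for bits in product([True, False], repeat=len(X)))

# Illustrative base class: clauses of length <= 2 (2-CNF is poly-time)
def two_cnf(formula):
    return all(len(c) <= 2 for c in formula)
```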