Results 1–10 of 20
New methods for 3-SAT decision and worst-case analysis
 THEORETICAL COMPUTER SCIENCE
, 1999
Abstract

Cited by 66 (12 self)
We prove the worst-case upper bound 1.5045^n for the time complexity of 3-SAT decision, where n is the number of variables in the input formula, introducing new methods for the analysis as well as new algorithmic techniques. We add new 2- and 3-clauses, called "blocked clauses", generalizing the extension rule of "Extended Resolution". Our methods for estimating the size of trees lead to a refined measure of formula complexity for 3-clause-sets and can also be applied to arbitrary trees. Keywords: 3-SAT, worst-case upper bounds, analysis of algorithms, Extended Resolution, blocked clauses, generalized autarkness. 1 Introduction In this paper we study the exponential part of the time complexity of 3-SAT decision and prove the worst-case upper bound 1.5044...^n, for n the number of variables in the input formula, using new algorithmic methods as well as new methods for the analysis. These methods also deepen the already existing approaches in a systematic manner. The following results...
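A blocked clause, in the standard sense used in this line of work, is a clause C containing a literal l such that every resolvent of C on l is tautological; adding such a clause preserves satisfiability. A minimal sketch of the membership test (illustrative only, not the paper's algorithm):

```python
# Sketch of the standard blocked-clause test: clause C is blocked for a
# literal l in C if every resolvent of C on l is a tautology.
# Literals are nonzero ints; -x denotes the negation of x (DIMACS style).

def resolvent_is_tautology(c1, c2, lit):
    """Resolve c1 (containing lit) with c2 (containing -lit); True if tautological."""
    merged = (set(c1) - {lit}) | (set(c2) - {-lit})
    return any(-m in merged for m in merged)

def is_blocked(clause, lit, formula):
    """True if `clause` is blocked for `lit` with respect to `formula`."""
    assert lit in clause
    return all(resolvent_is_tautology(clause, other, lit)
               for other in formula if -lit in other)
```

For example, [1, 2] is blocked for literal 1 in the formula [[1, 2], [-1, -2]], since the only resolvent {2, -2} is tautological.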
UnitWalk: A new SAT solver that uses local search guided by unit clause elimination
, 2002
Abstract

Cited by 63 (1 self)
In this paper we present a new randomized algorithm for SAT, i.e., the satisfiability problem for Boolean formulas in conjunctive normal form. Despite its simplicity, this algorithm performs well on many common benchmarks ranging from graph coloring problems to microprocessor verification.
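Unit clause elimination, the rule that guides the local search here, repeatedly assigns the literal of any one-literal clause and simplifies the formula. A sketch of the rule in isolation (the UnitWalk solver itself interleaves this with randomized local search):

```python
# Sketch of unit clause elimination (unit propagation) on a CNF given as a
# list of clauses of nonzero ints (-x = negation of x).

def unit_propagate(clauses):
    """Repeatedly assign forced literals; return (assignment, simplified clauses).
    An empty clause in the result signals a conflict under the forced assignments."""
    assignment = {}
    clauses = [list(c) for c in clauses]
    while True:
        units = [c[0] for c in clauses if len(c) == 1]
        if not units:
            return assignment, clauses
        lit = units[0]
        assignment[abs(lit)] = lit > 0
        new = []
        for c in clauses:
            if lit in c:
                continue                                  # clause satisfied, drop it
            new.append([x for x in c if x != -lit])       # falsified literal removed
        clauses = new
```

On [[1], [-1, 2], [-2, 3], [3, 4]] the chain of forced assignments 1, 2, 3 satisfies everything, leaving no clauses.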
Improved Algorithms for 3-Coloring, 3-Edge-Coloring, and Constraint Satisfaction
, 2001
Abstract

Cited by 47 (2 self)
We consider worst-case time bounds for NP-complete problems including 3-SAT, 3-coloring, 3-edge-coloring, and 3-list-coloring. Our algorithms are based on a constraint satisfaction (CSP) formulation of these problems; 3-SAT is equivalent to (2,3)-CSP while the other problems above are special cases of (3,2)-CSP. We give a fast algorithm for (3,2)-CSP and use it to improve the time bounds for solving the other problems listed above. Our techniques involve a mixture of Davis-Putnam-style backtracking with more sophisticated matching and network flow based ideas. 1 Introduction There has recently been growing interest in the analysis of superpolynomial-time algorithms, including algorithms for NP-hard problems such as satisfiability or graph coloring. This interest has multiple causes: many important applications can be modeled with these problems and, with the increased speed of modern computers, solved effectively; for instance it is now routine to solve hard 500-variable satisfia...
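As a toy illustration of the CSP formulation, 3-coloring becomes a (3,2)-CSP whose binary constraints forbid equal colors on the endpoints of each edge. A plain backtracking solver (without the paper's matching and network-flow improvements) can be sketched as:

```python
# Minimal backtracking solver for (3,2)-CSP: each variable takes a value in
# {0,1,2}; each constraint forbids one combination of two (variable, value)
# pairs. Illustrative only -- the paper's algorithm adds far stronger reductions.

def solve_csp(num_vars, forbidden, domain=(0, 1, 2)):
    """forbidden: set of ((v1, a1), (v2, a2)) pairs that may not hold together.
    Returns a satisfying assignment as a list, or None."""
    assignment = [None] * num_vars

    def consistent(var, val):
        for other in range(num_vars):
            if assignment[other] is None:
                continue
            pair = ((var, val), (other, assignment[other]))
            if pair in forbidden or (pair[1], pair[0]) in forbidden:
                return False
        return True

    def backtrack(var):
        if var == num_vars:
            return True
        for val in domain:
            if consistent(var, val):
                assignment[var] = val
                if backtrack(var + 1):
                    return True
                assignment[var] = None
        return False

    return assignment[:] if backtrack(0) else None
```

Encoding a triangle as forbidden equal-color pairs yields a proper 3-coloring, while K4 (which is not 3-colorable) correctly yields None.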
Deciding propositional tautologies: Algorithms and their complexity
, 1997
Abstract

Cited by 38 (8 self)
We investigate polynomial reductions and efficient branching rules for algorithms deciding propositional tautologies for DNF and co-NP-complete subclasses. Upper bounds on the time complexity are given with exponential part 2^{α·Δ(F)}, where Δ(F) is one of the measures n(F) = #{variables}, ℓ(F) = #{literal occurrences} and k(F) = #{clauses}. We start with a discussion of variants of the algorithms from [Monien/Speckenmeyer 85] and [Luckhardt 84] with the known upper bound 2^{0.695·n} for 3-DNF and (roughly) (2·(1 − 2^{−p}))^n for p-DNF, p ≥ 3, where p is the maximal clause length, giving now a uniform treatment for all p-DNF including the easily decidable case p ≤ 2. Recently for 3-DNF the bound has been lowered to 2^{0.5892·n} ([K2]; see also [Sch2], [K3]). In this article further improvements are achieved by studying two additional characteristic groups of parameters. The first group differentiates according to the maximal numbers (a, b) of occ...
New Worst-Case Upper Bounds for SAT
 Journal of Automated Reasoning
, 2000
Abstract

Cited by 35 (8 self)
In 1980 Monien and Speckenmeyer proved that satisfiability of a propositional formula consisting of K clauses (of arbitrary length) can be checked in time of the order 2^{K/3}. Recently Kullmann and Luckhardt proved the worst-case upper bound 2^{L/9}, where L is the length of the input formula. The algorithms leading to these bounds are based on the splitting method, which goes back to the Davis–Putnam procedure. Transformation rules (pure literal elimination, unit propagation, etc.) constitute a substantial part of this method. In this paper we present a new transformation rule and two algorithms using this rule. We prove that these algorithms have the worst-case upper bounds 2^{0.30897K} and 2^{0.10299L}, respectively.
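Pure literal elimination, one of the classical transformation rules named here, deletes every clause containing a literal whose complement never occurs; such literals can be set true without affecting satisfiability. A sketch of this classical rule (the paper's contribution is a new rule, not shown here):

```python
# Sketch of pure literal elimination: a literal whose complement never occurs
# in the formula can be set true, and every clause containing it removed,
# without changing satisfiability. Literals are nonzero ints (-x = negation).

def eliminate_pure_literals(clauses):
    """Return (forced_literals, remaining_clauses) after exhaustive elimination."""
    clauses = [set(c) for c in clauses]
    forced = []
    while True:
        lits = set().union(*clauses) if clauses else set()
        pure = {l for l in lits if -l not in lits}
        if not pure:
            return forced, [sorted(c) for c in clauses]
        forced.extend(sorted(pure))
        clauses = [c for c in clauses if not (c & pure)]
```

On [[1, 2], [-2, 3], [-3, -2]] the literal 1 is pure; removing its clause then makes -2 pure, after which no clauses remain.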
Worst-case Analysis, 3-SAT Decision and Lower Bounds: Approaches for Improved SAT Algorithms
Abstract

Cited by 22 (6 self)
New methods for worst-case analysis and (3-)SAT decision are presented. The focus lies on the central ideas leading to the improved bound 1.5045^n for 3-SAT decision ([Ku96]; n is the number of variables). The implications for SAT decision in general are discussed and elucidated by a number of hypotheses. In addition an exponential lower bound for a general class of SAT algorithms is given, and the only possibilities to remain under this bound are pointed out. In this article the central ideas leading to the improved worst-case upper bound 1.5045^n for 3-SAT decision ([Ku96]) are presented. In nine sections the following subjects are treated: 1. "Gauging of branchings": The "τ function" and the concept of a "distance function" are introduced, our main tools for the analysis of SAT algorithms, and, as we propose, also a basis for (complete) practical algorithms. 2. "Estimating the size of arbitrary trees": The "τ-Lemma" is presented, yielding an upper bound for the number of l...
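The τ function used to gauge branchings assigns to positive distances t1, ..., tk the unique x ≥ 1 with Σ x^{−ti} = 1; the worst-case bound then takes the form τ^n. Assuming this standard definition, τ can be computed numerically by bisection (a sketch):

```python
# Sketch of the tau function used to gauge branchings: tau(t1, ..., tk) is the
# unique x >= 1 with sum(x ** -ti) == 1. Bisection works because the
# left-hand side is strictly decreasing in x for positive distances.

def tau(*distances):
    assert len(distances) >= 2 and all(t > 0 for t in distances)
    f = lambda x: sum(x ** -t for t in distances) - 1.0
    lo, hi = 1.0, 2.0
    while f(hi) > 0:              # grow the bracket until f(hi) <= 0
        hi *= 2
    for _ in range(100):          # bisect to high precision
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if f(mid) > 0 else (lo, mid)
    return (lo + hi) / 2
```

Sanity checks: tau(1, 1) = 2 (the trivial branching), and tau(1, 2) is the golden ratio, the branching number of the Fibonacci-style recurrence T(n) = T(n−1) + T(n−2).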
Two new upper bounds for SAT
, 1998
Abstract

Cited by 21 (8 self)
In 1980 B. Monien and E. Speckenmeyer proved that satisfiability of a propositional formula consisting of K clauses can be checked in time of the order 2^{K/3}. Recently O. Kullmann and H. Luckhardt proved the bound 2^{L/9}, where L is the length of the input formula. The algorithms leading to these bounds (like many other SAT algorithms) are based on splitting, i.e., they reduce SAT for a formula F to SAT for several simpler formulas F1, F2, ..., Fm. These algorithms simplify each of F1, F2, ..., Fm according to some transformation rules such as the elimination of pure literals, the unit propagation rule, etc. In this paper we present a new transformation rule and two algorithms using this rule. These algorithms have the bounds 2^{0.30897K} and 2^{0.10537L}, respectively.
On a generalization of Extended Resolution
 Discrete Applied Mathematics
, 1997
Abstract

Cited by 17 (7 self)
Motivated by improved SAT algorithms ([13, 14, 15]; yielding new worst-case upper bounds) a natural parameterized generalization GER of Extended Resolution (ER) is introduced. ER can polynomially simulate GER, but GER allows special cases for which exponential lower bounds can be proven. 1 Introduction Extended Resolution G. Tseitin introduced in [21] the Extension Rule for the Resolution Calculus: F → F ∪ { {v, a, b}, {¬v, ¬a}, {¬v, ¬b} } for arbitrary variables a, b and a new variable v (new relative to the set F of premises and to a, b). Thereby the clause-set { {v, a, b}, {¬v, ¬a}, {¬v, ¬b} } is the conjunctive normal form of the formula v ↔ (¬a ∧ ¬b). An Extended Resolution proof (for short: ER proof) of the empty clause ⊥ from the clause-set F is an ordinary resolution proof of ⊥ from F′, where F′ ⊇ F is obtained by repeated applications of the Extension Rule. The length of an ER proof is the (total) number of (different) clauses in it. We denote by Comp_ER(F...
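One application of the Extension Rule can be sketched directly, using one common statement of the rule in which the fresh variable v abbreviates ¬a ∧ ¬b (an assumption-level illustration; the exact clause form varies across presentations):

```python
# Sketch of one application of Tseitin's Extension Rule in the form
# F -> F + {{v, a, b}, {-v, -a}, {-v, -b}}, the CNF of v <-> (-a AND -b),
# where v is a fresh variable. Literals are nonzero ints (-x = negation of x).

def extend(clauses, a, b):
    """Apply the extension rule for variables a, b; return (new_clauses, v)."""
    used = {abs(l) for c in clauses for l in c} | {abs(a), abs(b)}
    v = max(used) + 1          # fresh: new relative to F and to a, b
    return clauses + [[v, a, b], [-v, -a], [-v, -b]], v
```

Since v occurs only in the three added clauses, the extension preserves satisfiability while potentially shortening resolution proofs exponentially.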
Investigating a general hierarchy of polynomially decidable classes of CNF's based on short tree-like resolution proofs
, 1999
Abstract

Cited by 17 (10 self)
We investigate a hierarchy G_k(U, S) of classes of conjunctive normal forms, recognizable and SAT-decidable in polynomial time, with special emphasis on the corresponding hardness parameter h_{U,S}(F) for clause-sets F (the first level of inclusion). At level 0 an (incomplete, poly-time) oracle U for unsatisfiability detection and an oracle S for satisfiability detection are used. The hierarchy from [Pretolani 96] is improved in this way with respect to strengthened satisfiability handling, simplified recognition and consistent relativization. Also a hierarchy of canonical poly-time reductions with unit-clause propagation at the first level is obtained. General methods for upper and lower bounds on h_{U,S}(F) are developed and applied to a number of well-known examples. h_{U,S}(F) admits several different characterizations, including the space complexity of tree-like resolution and the use of pebble games as in [Esteban, Torán 99]. Using for S the class of linearly sat...
Lean clause-sets: Generalizations of minimally unsatisfiable clause-sets
 Discrete Applied Mathematics
, 2000
Abstract

Cited by 15 (8 self)
We study the problem of (efficiently) deleting from conjunctive normal forms (clause-sets) those clauses which cannot contribute to any proof of unsatisfiability. For that purpose we introduce the notion of an autarky system, which associates with every clause-set a canonical normal form obtained by deleting superfluous clauses. Clause-sets from which no clauses can be deleted are called lean, a natural generalization of minimally unsatisfiable clause-sets, opening the possibility for combinatorial approaches (and including also satisfiable instances). Three special examples of autarky systems are considered: general autarkies, linear autarkies (based on linear programming) and matching autarkies (based on matching theory). We give new characterizations of lean and linearly lean clause-sets by "universal linear programming problems," while matching lean clause-sets are characterized in terms of the "deficiency," the difference between the number of clauses and the number of variables, and ...
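The matching view can be made concrete: in the bipartite clause-variable incidence graph, match each clause to a variable occurring in it. If a matching covers every clause, each clause receives a private variable that can be set to satisfy it, so the clause-set is satisfiable. A sketch using simple augmenting paths (an illustration of the standard formulation, not the paper's algorithms):

```python
# Sketch of the matching view of clause-sets: the deficiency is
# #clauses - #variables, and a clause-set is matching satisfiable when a
# matching in the clause-variable incidence graph covers every clause.
# Literals are nonzero ints (-x = negation of x).

def deficiency(clauses):
    variables = {abs(l) for c in clauses for l in c}
    return len(clauses) - len(variables)

def matching_satisfiable(clauses):
    """Kuhn-style augmenting paths: try to give each clause its own variable."""
    match = {}                              # variable -> index of matched clause

    def try_assign(i, seen):
        for lit in clauses[i]:
            v = abs(lit)
            if v in seen:
                continue
            seen.add(v)
            if v not in match or try_assign(match[v], seen):
                match[v] = i                # (re)match v to clause i
                return True
        return False

    return all(try_assign(i, set()) for i in range(len(clauses)))
```

A positive deficiency immediately rules out matching satisfiability, since at most one clause can be matched per variable.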