Results 1–10 of 16
Improvements To Propositional Satisfiability Search Algorithms
, 1995
"... ... quickly across a wide range of hard SAT problems than any other SAT tester in the literature on comparable platforms. On a Sun SPARCStation 10 running SunOS 4.1.3 U1, POSIT can solve hard random 400variable 3SAT problems in about 2 hours on the average. In general, it can solve hard nvariable ..."
Abstract

Cited by 160 (0 self)
... quickly across a wide range of hard SAT problems than any other SAT tester in the literature on comparable platforms. On a Sun SPARCStation 10 running SunOS 4.1.3 U1, POSIT can solve hard random 400-variable 3-SAT problems in about 2 hours on the average. In general, it can solve hard n-variable random 3-SAT problems with search trees of size O(2^{n/18.7}). In addition to justifying these claims, this dissertation describes the most significant achievements of other researchers in this area, and discusses all of the widely known general techniques for speeding up SAT search algorithms. It should be useful to anyone interested in NP-complete problems or combinatorial optimization in general, and it should be particularly useful to researchers in either Artificial Intelligence or Operations Research.
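The backtracking SAT search this dissertation improves on follows the familiar DPLL scheme of unit propagation plus branching. A minimal Python sketch of that scheme (not POSIT's actual algorithm or heuristics; the encoding of literals as signed integers and the function names are assumptions for illustration):

```python
def unit_propagate(clauses, assignment):
    """Repeatedly assign literals forced by unit clauses; None on conflict."""
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            if any(l in assignment for l in clause):
                continue  # clause already satisfied
            unassigned = [l for l in clause
                          if l not in assignment and -l not in assignment]
            if not unassigned:
                return None  # conflict: every literal falsified
            if len(unassigned) == 1:
                assignment.add(unassigned[0])  # forced (unit) literal
                changed = True
    return assignment

def dpll(clauses, assignment=None):
    """Return a satisfying set of literals, or None if unsatisfiable."""
    assignment = set() if assignment is None else set(assignment)
    if unit_propagate(clauses, assignment) is None:
        return None
    if all(any(l in assignment for l in c) for c in clauses):
        return assignment  # every clause satisfied
    # branch on the first unassigned variable (real solvers use heuristics)
    var = next(abs(l) for c in clauses for l in c
               if l not in assignment and -l not in assignment)
    for lit in (var, -var):
        result = dpll(clauses, assignment | {lit})
        if result is not None:
            return result
    return None
```

For example, `dpll([[1, 2], [-1, 2], [-2]])` returns `None` (unsatisfiable), while satisfiable inputs yield a set of literals covering every clause.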
Algorithms for the Satisfiability (SAT) Problem: A Survey
 DIMACS Series in Discrete Mathematics and Theoretical Computer Science
, 1996
"... . The satisfiability (SAT) problem is a core problem in mathematical logic and computing theory. In practice, SAT is fundamental in solving many problems in automated reasoning, computeraided design, computeraided manufacturing, machine vision, database, robotics, integrated circuit design, compute ..."
Abstract

Cited by 125 (3 self)
The satisfiability (SAT) problem is a core problem in mathematical logic and computing theory. In practice, SAT is fundamental in solving many problems in automated reasoning, computer-aided design, computer-aided manufacturing, machine vision, database, robotics, integrated circuit design, computer architecture design, and computer network design. Traditional methods treat SAT as a discrete, constrained decision problem. In recent years, many optimization methods, parallel algorithms, and practical techniques have been developed for solving SAT. In this survey, we present a general framework (an algorithm space) that integrates existing SAT algorithms into a unified perspective. We describe sequential and parallel SAT algorithms including variable splitting, resolution, local search, global optimization, mathematical programming, and practical SAT algorithms. We give performance evaluation of some existing SAT algorithms. Finally, we provide a set of practical applications of the sat...
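Among the algorithm families the survey covers, local search is the easiest to illustrate. A GSAT-style greedy sketch in Python (this particular formulation, its parameters, and the name `gsat` are illustrative, not taken from the survey):

```python
import random

def gsat(clauses, n_vars, max_flips=10000, seed=0):
    """GSAT-style local search: start from a random assignment and greedily
    flip the variable that satisfies the most clauses. Incomplete: returning
    None proves nothing about unsatisfiability."""
    rng = random.Random(seed)
    assign = {v: rng.choice([True, False]) for v in range(1, n_vars + 1)}

    def satisfied(clause):
        return any(assign[abs(l)] == (l > 0) for l in clause)

    for _ in range(max_flips):
        unsat = [c for c in clauses if not satisfied(c)]
        if not unsat:
            return assign  # model found

        def score(v):
            # number of satisfied clauses after tentatively flipping v
            assign[v] = not assign[v]
            s = sum(satisfied(c) for c in clauses)
            assign[v] = not assign[v]
            return s

        candidates = {abs(l) for c in unsat for l in c}
        best = max(candidates, key=score)
        assign[best] = not assign[best]
    return None
```

Practical variants (e.g. WalkSAT) add random restarts and noise to escape local minima; the survey discusses these refinements.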
New methods for 3-SAT decision and worst-case analysis
 THEORETICAL COMPUTER SCIENCE
, 1999
"... We prove the worstcase upper bound 1:5045 n for the time complexity of 3SAT decision, where n is the number of variables in the input formula, introducing new methods for the analysis as well as new algorithmic techniques. We add new 2 and 3clauses, called "blocked clauses", generalizing the e ..."
Abstract

Cited by 64 (12 self)
We prove the worst-case upper bound 1.5045^n for the time complexity of 3-SAT decision, where n is the number of variables in the input formula, introducing new methods for the analysis as well as new algorithmic techniques. We add new 2- and 3-clauses, called "blocked clauses", generalizing the extension rule of "Extended Resolution." Our methods for estimating the size of trees lead to a refined measure of formula complexity of 3-clause-sets and can also be applied to arbitrary trees. Keywords: 3-SAT, worst-case upper bounds, analysis of algorithms, Extended Resolution, blocked clauses, generalized autarkness. 1 Introduction In this paper we study the exponential part of time complexity for 3-SAT decision and prove the worst-case upper bound 1.5044...^n for n the number of variables in the input formula, using new algorithmic methods as well as new methods for the analysis. These methods also deepen the already existing approaches in a systematic manner. The following results...
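A clause C is blocked with respect to one of its literals l if every resolvent of C on l with another clause of the formula is a tautology. A small Python check of this condition (a sketch based on that standard definition; the representation of clauses as frozensets of signed integers is an assumption):

```python
def is_tautology(clause):
    """A clause is a tautology if it contains a complementary literal pair."""
    return any(-l in clause for l in clause)

def is_blocked(clause, formula):
    """Return a blocking literal of `clause` w.r.t. `formula`, or None.
    `clause` and the members of `formula` are frozensets of nonzero ints."""
    for l in clause:
        partners = [d for d in formula if -l in d and d != clause]
        # blocked on l: every resolvent (C \ {l}) | (D \ {-l}) is a tautology
        if all(is_tautology((clause - {l}) | (d - {-l})) for d in partners):
            return l
    return None
```

For example, with F = {{x1, x2}, {¬x1, ¬x2}}, the clause {x1, x2} is blocked: its only resolvent on x1 is {x2, ¬x2}, a tautology.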
Deciding propositional tautologies: Algorithms and their complexity
, 1997
"... We investigate polynomial reductions and efficient branching rules for algorithms deciding propositional tautologies for DNF and coNPcomplete subclasses. Upper bounds on the time complexity are given with exponential part 2 ff\Delta(F ) where (F ) is one of the measures n(F ) = #f variables g, ` ..."
Abstract

Cited by 37 (8 self)
We investigate polynomial reductions and efficient branching rules for algorithms deciding propositional tautologies for DNF and co-NP-complete subclasses. Upper bounds on the time complexity are given with exponential part 2^{α·μ(F)}, where μ(F) is one of the measures n(F) = #{variables}, ℓ(F) = #{literal occurrences} and k(F) = #{clauses}. We start with a discussion of variants of the algorithms from [Monien/Speckenmeyer85] and [Luckhardt84] with the known upper bound 2^{0.695·n} for 3-DNF and (roughly) (2·(1 − 2^{−p}))^n for p-DNF, p ≥ 3, where p is the maximal clause length, now giving a uniform treatment for all p-DNF including the easily decidable case p ≤ 2. Recently for 3-DNF the bound has been lowered to 2^{0.5892·n} ([K2]; see also [Sch2], [K3]). In this article further improvements are achieved by studying two additional characteristic groups of parameters. The first group differentiates according to the maximal numbers (a, b) of occ...
Worst-case Analysis, 3-SAT Decision and Lower Bounds: Approaches for Improved SAT Algorithms
"... . New methods for worstcase analysis and (3)SAT decision are presented. The focus lies on the central ideas leading to the improved bound 1:5045 n for 3SAT decision ([Ku96]; n is the number of variables). The implications for SAT decision in general are discussed and elucidated by a number of h ..."
Abstract

Cited by 21 (6 self)
New methods for worst-case analysis and (3-)SAT decision are presented. The focus lies on the central ideas leading to the improved bound 1.5045^n for 3-SAT decision ([Ku96]; n is the number of variables). The implications for SAT decision in general are discussed and elucidated by a number of hypotheses. In addition an exponential lower bound for a general class of SAT algorithms is given, and the only possibilities to remain under this bound are pointed out. In this article the central ideas leading to the improved worst-case upper bound 1.5045^n for 3-SAT decision ([Ku96]) are presented. 1) In nine sections the following subjects are treated: 1. "Gauging of branchings": The "τ function" and the concept of a "distance function" are introduced, our main tools for the analysis of SAT algorithms, and, as we propose, also a basis for (complete) practical algorithms. 2. "Estimating the size of arbitrary trees": The "τ-Lemma" is presented, yielding an upper bound for the number of l...
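The τ function used for gauging branchings assigns to a tuple (t_1, ..., t_k) of positive branching distances the unique x ≥ 1 satisfying Σ_i x^{−t_i} = 1; the maximum τ value over all branchings of an algorithm then bounds the base of its search-tree size. A sketch computing it numerically (function name and tolerance are illustrative, assuming only this defining equation):

```python
def tau(distances, tol=1e-12):
    """The tau function of a branching tuple: the unique x >= 1 with
    sum(x ** -t for t in distances) == 1, found by bisection."""
    assert len(distances) >= 2 and all(t > 0 for t in distances)

    def f(x):
        return sum(x ** -t for t in distances) - 1  # strictly decreasing in x

    lo, hi = 1.0, 2.0
    while f(hi) > 0:
        hi *= 2  # grow the bracket until f changes sign
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return hi
```

For instance, τ(1, 1) = 2 (a plain binary branching) and τ(1, 2) is the golden ratio ≈ 1.618, quantifying how unbalanced branchings shrink the tree-size bound.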
On a generalization of extended resolution
 DISCRETE APPLIED MATHEMATICS 96–97 (1999) 149–176
, 1998
"... ... Inform. Comput., submitted); yielding new worstcase upper bounds) a natural parameterized generalization GER of Extended Resolution (ER) is introduced. ER can simulate polynomially GER, but GER allows special cases for which exponential lower bounds can be proven. ..."
Abstract

Cited by 17 (7 self)
... Inform. Comput., submitted); yielding new worst-case upper bounds) a natural parameterized generalization GER of Extended Resolution (ER) is introduced. ER can polynomially simulate GER, but GER allows special cases for which exponential lower bounds can be proven.
Blocked Clause Elimination
"... Abstract. Boolean satisfiability (SAT) and its extensions are becoming a core technology for the analysis of systems. The SATbased approach divides into three steps: encoding, preprocessing, and search. It is often argued that by encoding arbitrary Boolean formulas in conjunctive normal form (CNF), ..."
Abstract

Cited by 16 (10 self)
Abstract. Boolean satisfiability (SAT) and its extensions are becoming a core technology for the analysis of systems. The SAT-based approach divides into three steps: encoding, preprocessing, and search. It is often argued that by encoding arbitrary Boolean formulas in conjunctive normal form (CNF), structural properties of the original problem are not reflected in the CNF. This should result in the fact that CNF-level preprocessing and SAT solver techniques have an inherent disadvantage compared to related techniques applicable on the level of more structural SAT instance representations such as Boolean circuits. In this work we study the effect of a CNF-level simplification technique called blocked clause elimination (BCE). We show that BCE is surprisingly effective both in theory and in practice on CNFs resulting from a standard CNF encoding for circuits: without explicit knowledge of the underlying circuit structure, it achieves the same level of simplification as a combination of circuit-level simplifications and previously suggested polarity-based CNF encodings. Experimentally, we show that by applying BCE in preprocessing, further formula reduction and faster solving can be achieved, giving promise for applying BCE to speed up solvers.
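Blocked clause elimination itself is a simple fixpoint loop: remove any clause that is blocked in the current formula (every resolvent on some literal is a tautology), and repeat until nothing changes. A Python sketch under that standard definition (the representation of clauses as frozensets of signed integers is an assumption; note that BCE preserves satisfiability, not logical equivalence):

```python
def eliminate_blocked_clauses(formula):
    """Remove blocked clauses until a fixpoint; returns the reduced formula.
    `formula` is an iterable of frozensets of nonzero ints (literals)."""
    def is_taut(c):
        return any(-l in c for l in c)

    def blocked(c, f):
        # c is blocked on l if every resolvent of c on l with f is a tautology
        return any(all(is_taut((c - {l}) | (d - {-l}))
                       for d in f if -l in d and d != c)
                   for l in c)

    f = set(formula)
    changed = True
    while changed:
        changed = False
        for c in list(f):
            if blocked(c, f):
                f.remove(c)  # removing a blocked clause preserves SAT status
                changed = True
    return f
```

On the toy formula {{x1, x2}, {¬x1, ¬x2}} the loop empties the clause set entirely, while {{x1}, {¬x1}} is left untouched, since the empty resolvent is not a tautology.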
Algorithms for SAT/TAUT decision based on various measures
 Information and Computation
, 1999
"... We investigate algorithms deciding propositional tautologies for DNF and coNPcomplete subclasses given by restrictions on the number of occurrences of literals. Especially polynomial use of resolution for reductions in combination with a new combinatorial principle called "Generalized Sign Princip ..."
Abstract

Cited by 11 (8 self)
We investigate algorithms deciding propositional tautologies for DNF and co-NP-complete subclasses given by restrictions on the number of occurrences of literals. Especially polynomial use of resolution for reductions in combination with a new combinatorial principle called the "Generalized Sign Principle" is studied. Upper bounds on time complexity are given with exponential part 2^{α·μ(F)}, where the measure μ(F) for a clause set F is either the number n(F) of variables, the number ℓ(F) of literal occurrences or the number k(F) of clauses. α is called a "power coefficient" for the class of formulas under consideration w.r.t. the measure μ. Power coefficients are derived with the help of a method estimating the size of trees, which is also used to find "good" branching rules. Under natural conditions the power coefficients α, β, γ for n, k, ℓ respectively fulfill α ≥ β ≥ γ. We obtain the following power coefficients: 0.1112 for DNF w.r.t. ℓ and 0.3334 for DNF w.r.t. k. These result...
Positive Unit Hyperresolution Tableaux and Their Application to Minimal Model Generation
 Journal of Automated Reasoning
, 2000
"... . Minimal Herbrand models of sets of firstorder clauses are useful in several areas of computer science, e.g. automated theorem proving, program verification, logic programming, databases, and artificial intelligence. In most cases, the conventional model generation algorithms are inappropriate bec ..."
Abstract

Cited by 8 (0 self)
Minimal Herbrand models of sets of first-order clauses are useful in several areas of computer science, e.g. automated theorem proving, program verification, logic programming, databases, and artificial intelligence. In most cases, the conventional model generation algorithms are inappropriate because they generate non-minimal Herbrand models and can be inefficient. This article describes an approach for generating the minimal Herbrand models of sets of first-order clauses. The approach builds upon positive unit hyperresolution (PUHR) tableaux, which are in general smaller than conventional tableaux. PUHR tableaux formalize the approach initially introduced with the theorem prover SATCHMO. Two minimal model generation procedures are described. The first one expands PUHR tableaux depth-first, relying on a complement splitting expansion rule and on a form of backtracking involving constraints. A Prolog implementation, named MM-SATCHMO, of this procedure is given and its performance on ben...
Enhancing Maximum Satisfiability Algorithms with Pure Literal Strategies
 In 11th Canadian Conference on Artificial Intelligence, AI'96
, 1996
"... . Maximum satisfiability (MAXSAT) is an extension of satisfiability (SAT), in which a partial solution is sought that satisfies the maximum number of clauses in a logical formula. In recent years there has been an growing interest in this and other types of overconstrained problems. Branch and bou ..."
Abstract

Cited by 3 (0 self)
Maximum satisfiability (MAX-SAT) is an extension of satisfiability (SAT), in which a partial solution is sought that satisfies the maximum number of clauses in a logical formula. In recent years there has been a growing interest in this and other types of over-constrained problems. Branch and bound extensions of the Davis-Putnam algorithm can return guaranteed optimal solutions to these problems. Earlier work did not make use of a pure literal rule because it appeared to be inefficient here, as for traditional SAT. However, arguments can be adduced to show that pure literals are likely to appear during search for MAX-2-SAT, so that fixation of their variables may be effective here. The present work confirms this and also shows that a value ordering heuristic involving literals that are monotone in unit open clauses can be very effective, operating somewhat independently of the ordinary fixation of fully monotone literals. Alone or together, these pure literal strategies can produce i...
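The pure literal rule the abstract builds on is simple to state: a literal occurring in only one polarity can be fixed true, which for plain SAT deletes every clause containing it; the paper's point is that such fixations remain safe (they cannot decrease the number of satisfiable clauses) and turn out to be surprisingly frequent during MAX-2-SAT branch and bound. A sketch of the detection and fixation step (function names are illustrative; this shows the classical rule, not the paper's full strategy):

```python
def pure_literals(clauses):
    """Literals that occur in only one polarity across the clause list."""
    lits = {l for c in clauses for l in c}
    return {l for l in lits if -l not in lits}

def fix_pure(clauses):
    """Fix all pure literals true and drop the clauses they satisfy.
    Safe for SAT, and (the paper's observation) also safe within MAX-SAT
    branch and bound, since it never reduces the satisfiable clause count."""
    pures = pure_literals(clauses)
    remaining = [c for c in clauses if not any(l in pures for l in c)]
    return remaining, pures
```

For example, in [[x1, x2], [¬x2, x3], [x1, x3]] the literals x1 and x3 are pure, and fixing them satisfies all three clauses.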