Results 1–10 of 75
Bucket Elimination: A Unifying Framework for Probabilistic Inference
, 1996
"... . Probabilistic inference algorithms for belief updating, finding the most probable explanation, the maximum a posteriori hypothesis, and the maximum expected utility are reformulated within the bucket elimination framework. This emphasizes the principles common to many of the algorithms appearing ..."
Abstract

Cited by 289 (31 self)
 Add to MetaCart
Probabilistic inference algorithms for belief updating, finding the most probable explanation, the maximum a posteriori hypothesis, and the maximum expected utility are reformulated within the bucket elimination framework. This emphasizes the principles common to many of the algorithms appearing in the probabilistic inference literature and clarifies the relationship of such algorithms to nonserial dynamic programming algorithms. A general method for combining conditioning and bucket elimination is also presented. For all the algorithms, bounds on complexity are given as a function of the problem's structure.

1. Overview. Bucket elimination is a unifying algorithmic framework that generalizes dynamic programming to accommodate algorithms for many complex problem-solving and reasoning activities, including directional resolution for propositional satisfiability (Davis and Putnam, 1960), adaptive consistency for constraint satisfaction (Dechter and Pearl, 1987), Fourier and Gaussian el...
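As a concrete illustration of the elimination step the abstract describes, here is a minimal sketch in Python, assuming binary variables and a dict-based factor representation (all names here are mine, not the paper's). For brevity it computes the normalization constant; belief updating uses the same bucket mechanics with observed variables clamped.

```python
# Minimal bucket-elimination sketch: variables are binary, a factor is a
# (scope, table) pair where table maps assignment tuples (in scope order)
# to probabilities. Illustrative representation, not the paper's.
from itertools import product

def multiply(f1, f2):
    """Pointwise product of two factors over 0/1 variables."""
    (scope1, t1), (scope2, t2) = f1, f2
    scope = sorted(set(scope1) | set(scope2))
    table = {}
    for vals in product([0, 1], repeat=len(scope)):
        a = dict(zip(scope, vals))
        table[vals] = (t1[tuple(a[x] for x in scope1)]
                       * t2[tuple(a[x] for x in scope2)])
    return scope, table

def sum_out(f, var):
    """Eliminate `var` from factor f by summation."""
    scope, t = f
    i = scope.index(var)
    new_scope = scope[:i] + scope[i + 1:]
    table = {}
    for vals, p in t.items():
        key = vals[:i] + vals[i + 1:]
        table[key] = table.get(key, 0.0) + p
    return new_scope, table

def bucket_elimination(factors, order):
    """Place factors in buckets by elimination order; sum out each variable."""
    pool = list(factors)
    for var in order:
        bucket = [f for f in pool if var in f[0]]   # this variable's bucket
        pool = [f for f in pool if var not in f[0]]
        if not bucket:
            continue
        combined = bucket[0]
        for f in bucket[1:]:
            combined = multiply(combined, f)
        pool.append(sum_out(combined, var))
    result = ([], {(): 1.0})                        # multiply leftover constants
    for f in pool:
        result = multiply(result, f)
    return result[1][()]

# Example: P(A)P(B|A); summing out A then B yields the constant 1.0.
pA = (['A'], {(0,): 0.6, (1,): 0.4})
pBgA = (['A', 'B'], {(0, 0): 0.7, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.8})
print(bucket_elimination([pA, pBgA], ['A', 'B']))  # -> 1.0
```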
Bucket Elimination: A Unifying Framework for Reasoning
"... Bucket elimination is an algorithmic framework that generalizes dynamic programming to accommodate many problemsolving and reasoning tasks. Algorithms such as directionalresolution for propositional satisfiability, adaptiveconsistency for constraint satisfaction, Fourier and Gaussian elimination ..."
Abstract

Cited by 271 (60 self)
 Add to MetaCart
Bucket elimination is an algorithmic framework that generalizes dynamic programming to accommodate many problem-solving and reasoning tasks. Algorithms such as directional resolution for propositional satisfiability, adaptive consistency for constraint satisfaction, Fourier and Gaussian elimination for solving linear equalities and inequalities, and dynamic programming for combinatorial optimization can all be accommodated within the bucket elimination framework. Many probabilistic inference tasks can likewise be expressed as bucket elimination algorithms. These include belief updating, finding the most probable explanation, and expected utility maximization. These algorithms share the same performance guarantees; all are time and space exponential in the induced width of the problem's interaction graph. While elimination strategies make extensive demands on memory, a contrasting class of algorithms called "conditioning search" requires only linear space. Algorithms in this class split a problem into subproblems by instantiating a subset of variables, called a conditioning set, or a cutset. Typical examples of conditioning search algorithms are backtracking (in constraint satisfaction) and branch and bound (for combinatorial optimization). The paper presents the bucket elimination framework as a unifying theme across probabilistic and deterministic reasoning tasks and shows how conditioning search can be augmented to systematically trade space for time.
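To make the contrast concrete, here is a minimal conditioning-search sketch: plain backtracking over a binary CSP, which uses space linear in the number of variables. The representation (domain dicts, pairwise predicates) is an illustrative assumption, not the paper's formulation.

```python
# Minimal conditioning-search (backtracking) sketch for binary CSPs.
def backtrack(domains, constraints, assignment=None):
    """domains: {var: list of values};
    constraints: {(x, y): predicate(vx, vy)}, checked once both are assigned."""
    if assignment is None:
        assignment = {}
    if len(assignment) == len(domains):
        return dict(assignment)                 # all variables instantiated
    var = next(v for v in domains if v not in assignment)
    for val in domains[var]:
        assignment[var] = val
        consistent = all(
            pred(assignment[x], assignment[y])
            for (x, y), pred in constraints.items()
            if x in assignment and y in assignment
        )
        if consistent:
            result = backtrack(domains, constraints, assignment)
            if result is not None:
                return result
        del assignment[var]                     # undo and try the next value
    return None

# Example: 3-coloring a triangle.
doms = {v: [0, 1, 2] for v in "ABC"}
cons = {e: (lambda a, b: a != b) for e in [("A", "B"), ("B", "C"), ("A", "C")]}
print(backtrack(doms, cons))  # e.g. {'A': 0, 'B': 1, 'C': 2}
```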
Improvements To Propositional Satisfiability Search Algorithms
, 1995
"... ... quickly across a wide range of hard SAT problems than any other SAT tester in the literature on comparable platforms. On a Sun SPARCStation 10 running SunOS 4.1.3 U1, POSIT can solve hard random 400variable 3SAT problems in about 2 hours on the average. In general, it can solve hard nvariable ..."
Abstract

Cited by 160 (0 self)
 Add to MetaCart
... quickly across a wide range of hard SAT problems than any other SAT tester in the literature on comparable platforms. On a Sun SPARCStation 10 running SunOS 4.1.3 U1, POSIT can solve hard random 400-variable 3-SAT problems in about 2 hours on average. In general, it can solve hard n-variable random 3-SAT problems with search trees of size O(2^(n/18.7)). In addition to justifying these claims, this dissertation describes the most significant achievements of other researchers in this area, and discusses all of the widely known general techniques for speeding up SAT search algorithms. It should be useful to anyone interested in NP-complete problems or combinatorial optimization in general, and it should be particularly useful to researchers in either Artificial Intelligence or Operations Research.
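For orientation, the splitting-plus-unit-propagation skeleton that solvers like POSIT refine might be sketched as follows; the clause encoding and the naive branching rule are my assumptions, not POSIT's actual heuristics or data structures.

```python
# DPLL-style sketch: CNF is a list of sets of integer literals
# (positive/negative variable indices). Illustrative encoding only.
def unit_propagate(clauses):
    clauses = [set(c) for c in clauses]
    assignment = set()
    while True:
        units = [next(iter(c)) for c in clauses if len(c) == 1]
        if not units:
            return clauses, assignment
        lit = units[0]
        assignment.add(lit)
        new = []
        for c in clauses:
            if lit in c:
                continue                # clause satisfied, drop it
            c = c - {-lit}              # remove the falsified literal
            if not c:
                return None, None       # empty clause: conflict
            new.append(c)
        clauses = new

def dpll(clauses):
    clauses, assignment = unit_propagate(clauses)
    if clauses is None:
        return None                     # conflict under this branch
    if not clauses:
        return assignment               # unassigned variables are free
    lit = next(iter(clauses[0]))        # naive branching heuristic
    for choice in (lit, -lit):
        sub = dpll(clauses + [{choice}])
        if sub is not None:
            return assignment | sub
    return None

# (x1 or x2) and (not x1 or x2) and (not x2 or x3)
print(dpll([{1, 2}, {-1, 2}, {-2, 3}]))  # e.g. {1, 2, 3}
```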
A Knowledge Compilation Map
 Journal of Artificial Intelligence Research
, 2002
"... We propose a perspective on knowledge compilation which calls for analyzing different compilation approaches according to two key dimensions: the succinctness of the target compilation language, and the class of queries and transformations that the language supports in polytime. ..."
Abstract

Cited by 159 (22 self)
 Add to MetaCart
We propose a perspective on knowledge compilation which calls for analyzing different compilation approaches according to two key dimensions: the succinctness of the target compilation language, and the class of queries and transformations that the language supports in polytime.
Algorithms for the Satisfiability (SAT) Problem: A Survey
 DIMACS Series in Discrete Mathematics and Theoretical Computer Science
, 1996
"... . The satisfiability (SAT) problem is a core problem in mathematical logic and computing theory. In practice, SAT is fundamental in solving many problems in automated reasoning, computeraided design, computeraided manufacturing, machine vision, database, robotics, integrated circuit design, compute ..."
Abstract

Cited by 124 (3 self)
 Add to MetaCart
The satisfiability (SAT) problem is a core problem in mathematical logic and computing theory. In practice, SAT is fundamental in solving many problems in automated reasoning, computer-aided design, computer-aided manufacturing, machine vision, databases, robotics, integrated circuit design, computer architecture design, and computer network design. Traditional methods treat SAT as a discrete, constrained decision problem. In recent years, many optimization methods, parallel algorithms, and practical techniques have been developed for solving SAT. In this survey, we present a general framework (an algorithm space) that integrates existing SAT algorithms into a unified perspective. We describe sequential and parallel SAT algorithms, including variable splitting, resolution, local search, global optimization, mathematical programming, and practical SAT algorithms. We give a performance evaluation of some existing SAT algorithms. Finally, we provide a set of practical applications of the sat...
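As a taste of the local-search family the survey covers, here is a small WalkSAT-style sketch; the parameters and clause representation are illustrative assumptions, not any specific algorithm from the survey.

```python
# WalkSAT-style local search: pick an unsatisfied clause, then either flip
# a random variable in it or the variable whose flip leaves the fewest
# clauses unsatisfied. Toy parameters.
import random

def walksat(clauses, n_vars, max_flips=10_000, p=0.5, seed=0):
    """clauses: list of lists of integer literals; returns a model or None."""
    rng = random.Random(seed)
    model = {v: rng.choice([True, False]) for v in range(1, n_vars + 1)}
    sat = lambda lit: model[abs(lit)] == (lit > 0)
    for _ in range(max_flips):
        unsat = [c for c in clauses if not any(sat(l) for l in c)]
        if not unsat:
            return model                        # all clauses satisfied
        clause = rng.choice(unsat)
        if rng.random() < p:
            var = abs(rng.choice(clause))       # random-walk move
        else:                                   # greedy move: fewest unsat after flip
            def cost(v):
                model[v] = not model[v]
                c = sum(not any(sat(l) for l in cl) for cl in clauses)
                model[v] = not model[v]
                return c
            var = min((abs(l) for l in clause), key=cost)
        model[var] = not model[var]
    return None                                 # gave up within the flip budget

print(walksat([[1, 2], [-1, 2], [-2, 3]], n_vars=3))
```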
Decomposable Negation Normal Form
 Journal of the ACM
, 2001
"... Abstract. Knowledge compilation has been emerging recently as a new direction of research for dealing with the computational intractability of general propositional reasoning. According to this approach, the reasoning process is split into two phases: an offline compilation phase and an online quer ..."
Abstract

Cited by 109 (18 self)
 Add to MetaCart
Knowledge compilation has been emerging recently as a new direction of research for dealing with the computational intractability of general propositional reasoning. According to this approach, the reasoning process is split into two phases: an offline compilation phase and an online query-answering phase. In the offline phase, the propositional theory is compiled into some target language, which is typically a tractable one. In the online phase, the compiled target is used to efficiently answer a (potentially) exponential number of queries. The main motivation behind knowledge compilation is to push as much of the computational overhead as possible into the offline phase, in order to amortize that overhead over all online queries. Another motivation behind compilation is to produce very simple online reasoning systems, which can be embedded cost-effectively into primitive computational platforms, such as those found in consumer electronics. One of the key aspects of any compilation approach is the target language into which the propositional theory is compiled. Previous target languages included Horn theories, prime implicates/implicants, and ordered binary decision diagrams (OBDDs). We propose in this paper a new target compilation language, known as decomposable negation normal form (DNNF), and present a number of its properties that make it of interest to the broad community. Specifically, we ...
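The tractability DNNF buys can be seen in a few lines: because the children of every AND node share no variables (decomposability), satisfiability composes bottom-up. The node encoding below is an illustrative assumption.

```python
# Polytime satisfiability test on a DNNF circuit. A node is
# ('lit', l) | ('and', [children]) | ('or', [children]).
def dnnf_sat(node):
    kind = node[0]
    if kind == 'lit':
        return True                         # a lone literal is satisfiable
    if kind == 'and':                       # sound only under decomposability:
        return all(dnnf_sat(c) for c in node[1])  # children share no variables
    if kind == 'or':
        return any(dnnf_sat(c) for c in node[1])
    raise ValueError(f"unknown node kind: {kind}")

# (x and y) or (not x and z): each AND has disjoint child scopes.
circuit = ('or', [
    ('and', [('lit', 'x'), ('lit', 'y')]),
    ('and', [('lit', '-x'), ('lit', 'z')]),
])
print(dnnf_sat(circuit))  # True
```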
A Survey on Knowledge Compilation
, 1998
"... this paper we survey recent results in knowledge compilation of propositional knowledge bases. We first define and limit the scope of such a technique, then we survey exact and approximate knowledge compilation methods. We include a discussion of compilation for nonmonotonic knowledge bases. Keywor ..."
Abstract

Cited by 96 (3 self)
 Add to MetaCart
In this paper we survey recent results in knowledge compilation of propositional knowledge bases. We first define and limit the scope of such a technique, then we survey exact and approximate knowledge compilation methods. We include a discussion of compilation for nonmonotonic knowledge bases.

Keywords: Knowledge Representation, Efficiency of Reasoning
SATzilla: Portfolio-based Algorithm Selection for SAT
"... It has been widely observed that there is no single “dominant ” SAT solver; instead, different solvers perform best on different instances. Rather than following the traditional approach of choosing the best solver for a given class of instances, we advocate making this decision online on a perinst ..."
Abstract

Cited by 91 (16 self)
 Add to MetaCart
It has been widely observed that there is no single “dominant” SAT solver; instead, different solvers perform best on different instances. Rather than following the traditional approach of choosing the best solver for a given class of instances, we advocate making this decision online on a per-instance basis. Building on previous work, we describe SATzilla, an automated approach for constructing per-instance algorithm portfolios for SAT that use so-called empirical hardness models to choose among their constituent solvers. This approach takes as input a distribution of problem instances and a set of component solvers, and constructs a portfolio optimizing a given objective function (such as mean runtime, percent of instances solved, or score in a competition). The excellent performance of our SATzilla portfolios has been independently verified in the 2007 SAT Competition, where our SATzilla07 solvers won three gold, one silver, and one bronze medal. In this article, we go well beyond SATzilla07 by making the portfolio construction scalable and completely automated, and improving it by integrating local search solvers as candidate solvers, by predicting performance score instead of runtime, and by using hierarchical hardness models that take into account different types of SAT instances. We demonstrate the effectiveness of these new techniques in extensive experimental results on data sets including instances from the most recent SAT competition.
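The core selection idea can be sketched in a few lines: fit one empirical hardness model per component solver, then route each instance to the solver with the best prediction. The features, data, and least-squares model below are toy assumptions, not SATzilla's actual feature set or models.

```python
# Toy per-instance algorithm selection: one linear runtime model per solver.
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((50, 3))                          # 3 cheap instance features
runtimes = {                                     # synthetic measured runtimes
    "solverA": 1.0 + 2.0 * X[:, 0] + 0.1 * rng.random(50),
    "solverB": 2.0 - 1.5 * X[:, 1] + 0.1 * rng.random(50),
}

# Fit one least-squares "hardness model" per component solver.
Xb = np.hstack([X, np.ones((50, 1))])            # add intercept column
models = {s: np.linalg.lstsq(Xb, y, rcond=None)[0] for s, y in runtimes.items()}

def select(features):
    """Return the solver with the lowest predicted runtime."""
    fb = np.append(features, 1.0)
    return min(models, key=lambda s: fb @ models[s])

print(select(np.array([0.9, 0.1, 0.5])))  # high feature 0 hurts solverA here
```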
A Scheme for Approximating Probabilistic Inference
 In Proceedings of Uncertainty in Artificial Intelligence (UAI-97)
, 1997
"... This paper describes a class of probabilistic approximation algorithms based on bucket elimination which offer adjustable levels of accuracy and efficiency. We analyze the approximation for several tasks: finding the most probable explanation, belief updating and finding the maximum a posteriori hyp ..."
Abstract

Cited by 62 (24 self)
 Add to MetaCart
This paper describes a class of probabilistic approximation algorithms based on bucket elimination which offer adjustable levels of accuracy and efficiency. We analyze the approximation for several tasks: finding the most probable explanation, belief updating, and finding the maximum a posteriori hypothesis. We identify regions of completeness and provide preliminary empirical evaluation on randomly generated networks.

1. Overview. Bucket elimination is a unifying algorithmic framework that generalizes dynamic programming to enable many complex problem-solving and reasoning activities. Among the algorithms that can be accommodated within this framework are directional resolution for propositional satisfiability, adaptive consistency for constraint satisfaction, Fourier and Gaussian elimination for linear equalities and inequalities, and dynamic programming for combinatorial optimization [7]. Many algorithms for probabilistic inference, such as belief updating, finding the most proba...
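The adjustable accuracy comes from processing a bucket's functions in separately bounded pieces rather than jointly; the following toy calculation (my numbers, not the paper's) shows why the split version of a max-elimination step yields an upper bound.

```python
# Bounding idea behind the bucket-splitting approximation:
# max_x f(x)*g(x) <= (max_x f(x)) * (max_x g(x)), since the separate maxima
# need not be attained at the same x. Illustrative values.
f = {0: 0.2, 1: 0.9}   # f(x)
g = {0: 0.8, 1: 0.3}   # g(x), maximized at a different x than f

exact = max(f[x] * g[x] for x in (0, 1))    # joint (exact) elimination
bound = max(f.values()) * max(g.values())   # split (approximate) elimination
print(exact, bound, exact <= bound)         # 0.27 0.72 True
```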
Local and Global Relational Consistency
 Theoretical Computer Science
, 1997
"... Local consistency has proven to be an important concept in the theory and practice of constraint networks. In this paper, we present a new definition of local consistency, called relational consistency. The new definition is relationbased, in contrast with the previous definition of local consiste ..."
Abstract

Cited by 61 (13 self)
 Add to MetaCart
Local consistency has proven to be an important concept in the theory and practice of constraint networks. In this paper, we present a new definition of local consistency, called relational consistency. The new definition is relation-based, in contrast with the previous definition of local consistency, which we characterize as variable-based. We show the conceptual power of the new definition by showing how it unifies known elimination operators such as resolution in theorem proving, joins in relational databases, and variable elimination for solving linear inequalities. Algorithms for enforcing various levels of relational consistency are introduced and analyzed. We also show the usefulness of the new definition in characterizing relationships between properties of constraint networks and the level of local consistency needed to ensure global consistency.
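The unified elimination operator the abstract mentions can be sketched as a relational join followed by projecting out a variable (cf. resolution and database joins); the (scope, set-of-tuples) representation is an illustrative assumption.

```python
# Join two constraint relations on shared variables, then eliminate a
# variable by projection: the induced constraint on the remaining scope.
def join(r1, r2):
    (s1, t1), (s2, t2) = r1, r2
    scope = s1 + [v for v in s2 if v not in s1]
    out = set()
    for a in t1:
        row = dict(zip(s1, a))
        for b in t2:
            # keep pairs that agree on every shared variable
            if all(row.get(v, w) == w for v, w in zip(s2, b)):
                merged = {**row, **dict(zip(s2, b))}
                out.add(tuple(merged[v] for v in scope))
    return scope, out

def project_out(rel, var):
    scope, tuples = rel
    keep = [i for i, v in enumerate(scope) if v != var]
    return [scope[i] for i in keep], {tuple(t[i] for i in keep) for t in tuples}

# R1(x, y) join R2(y, z), then eliminate y.
R1 = (['x', 'y'], {(0, 1), (1, 0)})
R2 = (['y', 'z'], {(1, 1), (0, 0)})
print(project_out(join(R1, R2), 'y'))  # (['x', 'z'], {(0, 1), (1, 0)})
```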