Results 1–10 of 33
Planning as satisfiability
In ECAI-92, 1992
Abstract

Cited by 431 (26 self)
We develop a formal model of planning based on satisfiability rather than deduction. The satisfiability approach not only provides a more flexible framework for stating different kinds of constraints on plans, but also more accurately reflects the theory behind modern constraint-based planning systems. Finally, we consider the computational characteristics of the resulting formulas, by solving them with two very different satisfiability testing procedures.
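As a rough illustration of the encoding idea, here is a toy one-step planning problem compiled to CNF and checked by enumeration. The propositions, clauses, and brute-force solver are illustrative sketches, not taken from the paper:

```python
from itertools import product

# Toy "planning as satisfiability" encoding (illustrative).
# Fluents at_A_0 and at_B_1 describe times 0 and 1; move_AB_0 is an action.
# A clause is a list of (variable, required_truth_value) literals.
vars_ = ["at_A_0", "at_B_1", "move_AB_0"]
clauses = [
    [("at_A_0", True)],                        # initial state: at A at time 0
    [("at_B_1", True)],                        # goal: at B at time 1
    [("at_B_1", False), ("move_AB_0", True)],  # at_B_1 implies the move happened
    [("move_AB_0", False), ("at_A_0", True)],  # the move requires being at A
]

def satisfying_assignments(vars_, clauses):
    """Enumerate all truth assignments and keep those satisfying every clause."""
    for bits in product([False, True], repeat=len(vars_)):
        model = dict(zip(vars_, bits))
        if all(any(model[v] == sign for v, sign in clause) for clause in clauses):
            yield model

# Each model of the formula corresponds to a valid one-step plan.
plans = list(satisfying_assignments(vars_, clauses))
```

Here the only model sets move_AB_0 true, i.e. the plan "move from A to B"; realistic encodings add frame axioms and one copy of each fluent per time step.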
Noise strategies for improving local search
In Proceedings of the Twelfth National Conference on Artificial Intelligence (AAAI-94), 1994
Abstract

Cited by 360 (8 self)
It has recently been shown that local search is surprisingly good at finding satisfying assignments for certain computationally hard classes of CNF formulas. The performance of basic local search methods can be further enhanced by introducing mechanisms for escaping from local minima in the search space. We will compare three such mechanisms: simulated annealing, random noise, and a strategy called "mixed random walk". We show that mixed random walk is the superior strategy. We also present results demonstrating the effectiveness of local search with walk for solving circuit synthesis and circuit diagnosis problems. Finally, we demonstrate that mixed random walk improves upon the best known methods for solving MAX-SAT problems.
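A minimal sketch of a mixed-random-walk local search for CNF, assuming a WalkSAT-style loop; the noise parameter p, tie-breaking, and flip budget are illustrative, not the paper's exact procedure:

```python
import random

# Sketch of local search with "mixed random walk" (illustrative).
# Clauses are lists of nonzero integers; literal i means variable i is true,
# literal -i means variable i is false.
def mixed_random_walk(clauses, n_vars, p=0.5, max_flips=10000, seed=0):
    rng = random.Random(seed)
    assign = [rng.random() < 0.5 for _ in range(n_vars + 1)]  # index 0 unused
    sat = lambda lit: assign[abs(lit)] == (lit > 0)
    for _ in range(max_flips):
        unsat = [c for c in clauses if not any(sat(l) for l in c)]
        if not unsat:
            return assign  # satisfying assignment found
        clause = rng.choice(unsat)
        if rng.random() < p:
            # random-walk move: flip a random variable of the broken clause
            var = abs(rng.choice(clause))
        else:
            # greedy move: flip the variable leaving fewest unsatisfied clauses
            def broken_after(v):
                assign[v] = not assign[v]
                n = sum(not any(sat(l) for l in c) for c in clauses)
                assign[v] = not assign[v]
                return n
            var = min({abs(l) for l in clause}, key=broken_after)
        assign[var] = not assign[var]
    return None  # no solution within the flip budget

# Example: (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
model = mixed_random_walk([[1, 2], [-1, 3], [-2, -3]], n_vars=3)
```

With p = 0 this degenerates to pure greedy hill-climbing, with p = 1 to pure random walk; the mixed strategy interpolates between the two.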
Bucket Elimination: A Unifying Framework for Probabilistic Inference
1996
Abstract

Cited by 293 (34 self)
Probabilistic inference algorithms for belief updating, finding the most probable explanation, the maximum a posteriori hypothesis, and the maximum expected utility are reformulated within the bucket elimination framework. This emphasizes the principles common to many of the algorithms appearing in the probabilistic inference literature and clarifies the relationship of such algorithms to nonserial dynamic programming algorithms. A general method for combining conditioning and bucket elimination is also presented. For all the algorithms, bounds on complexity are given as a function of the problem's structure.

1. Overview. Bucket elimination is a unifying algorithmic framework that generalizes dynamic programming to accommodate algorithms for many complex problem-solving and reasoning activities, including directional resolution for propositional satisfiability (Davis and Putnam, 1960), adaptive consistency for constraint satisfaction (Dechter and Pearl, 1987), Fourier and Gaussian el...
Bucket Elimination: A Unifying Framework for Reasoning
Abstract

Cited by 278 (62 self)
Bucket elimination is an algorithmic framework that generalizes dynamic programming to accommodate many problem-solving and reasoning tasks. Algorithms such as directional resolution for propositional satisfiability, adaptive consistency for constraint satisfaction, Fourier and Gaussian elimination for solving linear equalities and inequalities, and dynamic programming for combinatorial optimization can all be accommodated within the bucket elimination framework. Many probabilistic inference tasks can likewise be expressed as bucket-elimination algorithms, including belief updating, finding the most probable explanation, and expected utility maximization. These algorithms share the same performance guarantees; all are time and space exponential in the induced width of the problem's interaction graph. While elimination strategies make extensive demands on memory, a contrasting class of algorithms called "conditioning search" requires only linear space. Algorithms in this class split a problem into subproblems by instantiating a subset of variables, called a conditioning set or cutset. Typical examples of conditioning search algorithms are backtracking (in constraint satisfaction) and branch and bound (for combinatorial optimization). The paper presents the bucket-elimination framework as a unifying theme across probabilistic and deterministic reasoning tasks and shows how conditioning search can be augmented to systematically trade space for time.
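A minimal sketch of one elimination step for belief updating on binary variables, in the spirit of the framework described above; the two-node network and the factor representation are illustrative assumptions:

```python
from itertools import product

# Sketch of bucket (variable) elimination for belief updating (illustrative).
# A factor is a pair (scope, table): scope is a tuple of variable names,
# table maps 0/1 assignments over the scope to nonnegative weights.
def eliminate(factors, var):
    """Multiply all factors mentioning `var`, then sum `var` out."""
    bucket = [f for f in factors if var in f[0]]
    rest = [f for f in factors if var not in f[0]]
    scope = sorted({v for s, _ in bucket for v in s if v != var})
    table = {}
    for vals in product([0, 1], repeat=len(scope)):
        env = dict(zip(scope, vals))
        total = 0.0
        for x in (0, 1):          # sum out the eliminated variable
            env[var] = x
            w = 1.0
            for s, t in bucket:   # product of all factors in the bucket
                w *= t[tuple(env[v] for v in s)]
            total += w
        table[vals] = total
    return rest + [(tuple(scope), table)]

# Two-node chain A -> B with P(A) and P(B|A); compute P(B) by eliminating A.
pA = (("A",), {(0,): 0.6, (1,): 0.4})
pBgA = (("A", "B"), {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8})
(result_scope, pB), = eliminate([pA, pBgA], "A")
# pB[(0,)] = 0.6*0.9 + 0.4*0.2 = 0.62, pB[(1,)] = 0.38
```

Repeating this step along an elimination ordering yields the full algorithm; the cost of each step is exponential in the scope of the generated factor, which is what the induced-width bound captures.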
Towards an understanding of hill-climbing procedures for SAT
In Proceedings of AAAI-93, 1993
Abstract

Cited by 137 (6 self)
Recently several local hill-climbing procedures for propositional satisfiability have been proposed, which are able to solve large and difficult problems beyond the reach of conventional algorithms like Davis-Putnam. By the introduction of some new variants of these procedures, we provide strong experimental evidence to support the conjecture that neither greediness nor randomness is important in these procedures. One of the variants introduced seems to offer significant improvements over earlier procedures. In addition, we investigate experimentally how their performance depends on their parameters. Our results suggest that runtime scales less than simply exponentially in the problem size.
The Rhetorical Parsing, Summarization, and Generation of Natural Language Texts
1997
Abstract

Cited by 108 (9 self)
This thesis is an inquiry into the nature of the high-level, rhetorical structure of unrestricted natural language texts, computational means to enable its derivation, and two applications (in automatic summarization and natural language generation) that follow from the ability to build such structures automatically. The thesis proposes a first-order formalization of the high-level, rhetorical structure of text. The formalization assumes that text can be sequenced into elementary units; that discourse relations hold between textual units of various sizes; that some textual units are more important to the writer's purpose than others; and that trees are a good approximation of the abstract structure of text. The formalization also introduces a linguistically motivated compositionality criterion, which is shown to hold for the text structures that are valid. The thesis proposes, analyzes theoretically, and compares empirically four algorithms for determining the valid text structures of ...
Directional Resolution: The Davis-Putnam Procedure, Revisited
In Proceedings of KR-94, 1994
Abstract

Cited by 101 (22 self)
The paper presents an algorithm called directional resolution, a variation on the original Davis-Putnam algorithm, and analyzes its worst-case behavior as a function of the topological structure of propositional theories. The concepts of induced width and diversity are shown to play a key role in bounding the complexity of the procedure. The importance of our analysis lies in highlighting structure-based tractable classes of satisfiability and in providing theoretical guarantees on the time and space complexity of the algorithm. Contrary to previous assessments, we show that for many theories directional resolution can be an effective procedure. Our empirical tests confirm the theoretical predictions, showing that on problems with a special structure, namely k-tree embeddings (e.g. chains, (k,m)-trees), directional resolution greatly outperforms one of the most effective satisfiability algorithms known to date, the popular Davis-Putnam procedure. Furthermore, combining a bounded...
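A minimal sketch of directional resolution along a fixed variable ordering; the clause representation and the small unsatisfiable theory are illustrative assumptions:

```python
# Sketch of directional resolution (bucket elimination for CNF; illustrative).
# Clauses are sets of nonzero integers: literal i / -i for variable i.
def directional_resolution(clauses, order):
    pos = {v: i for i, v in enumerate(order)}
    buckets = {v: set() for v in order}
    # Place each clause in the bucket of its highest variable in the ordering.
    for c in clauses:
        top = max((abs(l) for l in c), key=lambda v: pos[v])
        buckets[top].add(frozenset(c))
    # Process buckets from the last variable in the ordering to the first,
    # resolving on the bucket's variable; resolvents drop into lower buckets.
    for v in reversed(order):
        with_pos = [c for c in buckets[v] if v in c]
        with_neg = [c for c in buckets[v] if -v in c]
        for c1 in with_pos:
            for c2 in with_neg:
                resolvent = (c1 - {v}) | (c2 - {-v})
                if any(-l in resolvent for l in resolvent):
                    continue  # tautology, discard
                if not resolvent:
                    return False  # empty clause derived: unsatisfiable
                top = max((abs(l) for l in resolvent), key=lambda u: pos[u])
                buckets[top].add(resolvent)
    return True  # directional extension built without the empty clause

# Unsatisfiable toy theory: all four clauses over x1, x2.
theory = [{1, 2}, {1, -2}, {-1, 2}, {-1, -2}]
result = directional_resolution(theory, order=[1, 2])
```

The total work is governed by how large the buckets grow, which is where the induced-width bound on time and space enters.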
A rearrangement search strategy for determining propositional satisfiability
In Proceedings of the National Conference on Artificial Intelligence, 1988
Abstract

Cited by 74 (1 self)
We present a simple algorithm for determining the satisfiability of propositional formulas in Conjunctive Normal Form. As the procedure searches for a satisfying truth assignment it dynamically rearranges the order in which variables are considered. The choice of which variable to assign a truth value next is guided by an upper bound on the size of the search remaining; the procedure makes the choice which yields the smallest upper bound on the size of the remaining search. We describe several upper bound functions and discuss the tradeoff between accurate upper bound functions and the overhead required to compute the upper bounds. Experimental data shows that for one easily computed upper bound the reduction in the size of the search space more than compensates for the overhead involved in selecting the next variable.
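A minimal sketch of backtracking search with dynamic variable rearrangement; the branching heuristic below (pick the variable occurring in the most clauses) stands in for the paper's upper-bound functions and is purely illustrative:

```python
# Sketch of CNF backtracking with dynamic variable ordering (illustrative).
# Clauses are sets of nonzero integers: literal i / -i for variable i.
def simplify(clauses, lit):
    """Assert lit: drop satisfied clauses, remove the negated literal."""
    out = []
    for c in clauses:
        if lit in c:
            continue
        reduced = c - {-lit}
        if not reduced:
            return None  # empty clause: conflict
        out.append(reduced)
    return out

def solve(clauses, assignment=()):
    if clauses is None:
        return None  # propagated conflict
    if not clauses:
        return set(assignment)  # all clauses satisfied
    # Dynamic rearrangement: re-pick the branching variable at every node.
    counts = {}
    for c in clauses:
        for l in c:
            counts[abs(l)] = counts.get(abs(l), 0) + 1
    var = max(counts, key=counts.get)
    for lit in (var, -var):
        result = solve(simplify(clauses, lit), assignment + (lit,))
        if result is not None:
            return result
    return None  # both branches failed: backtrack

model = solve([{1, 2}, {-1, 3}, {-2, -3}])
```

Replacing the occurrence count with a cheaply computed upper bound on the remaining search size gives the flavor of the procedure the abstract describes.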
Local and global relational consistency
Theoretical Computer Science, 1997
Abstract

Cited by 61 (15 self)
Local consistency has proven to be an important concept in the theory and practice of constraint networks. In this paper, we present a new definition of local consistency, called relational consistency. The new definition is relation-based, in contrast with the previous definition of local consistency, which we characterize as variable-based. We show the conceptual power of the new definition by showing how it unifies known elimination operators such as resolution in theorem proving, joins in relational databases, and variable elimination for solving linear inequalities. Algorithms for enforcing various levels of relational consistency are introduced and analyzed. We also show the usefulness of the new definition in characterizing relationships between properties of constraint networks and the level of local consistency needed to ensure global consistency.
SAT-based Procedures for Temporal Reasoning
1999
Abstract

Cited by 53 (6 self)
In this paper we study the consistency problem for a set of disjunctive temporal constraints [Stergiou and Koubarakis, 1998]. We propose two SAT-based procedures, and show that, on sets of binary randomly generated disjunctive constraints, they perform up to 2 orders of magnitude fewer consistency checks than the best procedure presented in [Stergiou and Koubarakis, 1998]. On these tests, our experimental analysis confirms Stergiou and Koubarakis's result about the existence of an easy-hard-easy pattern whose peak corresponds to a value between 6 and 7 of the ratio of clauses to variables.