Results 1–10 of 64
A Knowledge Compilation Map
Journal of Artificial Intelligence Research, 2002
Cited by 162 (24 self)

Abstract: We propose a perspective on knowledge compilation which calls for analyzing different compilation approaches according to two key dimensions: the succinctness of the target compilation language, and the class of queries and transformations that the language supports in polytime.
Decomposable negation normal form
Journal of the ACM, 2001
Cited by 111 (19 self)

Abstract: Knowledge compilation has been emerging recently as a new direction of research for dealing with the computational intractability of general propositional reasoning. According to this approach, the reasoning process is split into two phases: an offline compilation phase and an online query-answering phase. In the offline phase, the propositional theory is compiled into some target language, which is typically a tractable one. In the online phase, the compiled target is used to efficiently answer a (potentially) exponential number of queries. The main motivation behind knowledge compilation is to push as much of the computational overhead as possible into the offline phase, in order to amortize that overhead over all online queries. Another motivation behind compilation is to produce very simple online reasoning systems, which can be embedded cost-effectively into primitive computational platforms, such as those found in consumer electronics. One of the key aspects of any compilation approach is the target language into which the propositional theory is compiled. Previous target languages included Horn theories, prime implicates/implicants and ordered binary decision diagrams (OBDDs). We propose in this paper a new target compilation language, known as decomposable negation normal form (DNNF), and present a number of its properties that make it of interest to the broad community. ...
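Decomposability — the property that the conjuncts of every AND node share no variables — is what makes DNNF tractable: satisfiability, for instance, reduces to a single bottom-up pass. A minimal sketch, assuming a hypothetical tuple encoding of NNF nodes of our own (not the paper's data structures):

```python
# Hypothetical node encoding:
#   ("lit", var, sign)  - a literal (sign=False means negated)
#   ("and", children) / ("or", children)
#   ("true",) / ("false",) - Boolean constants
def satisfiable(node):
    """Linear-time satisfiability test; correct only when every AND node
    is decomposable (its conjuncts mention disjoint sets of variables)."""
    kind = node[0]
    if kind == "lit":
        return True            # a lone literal can always be satisfied
    if kind == "true":
        return True
    if kind == "false":
        return False
    children = node[1]
    if kind == "and":
        # Decomposability: conjuncts can be satisfied independently,
        # so the conjunction is satisfiable iff every conjunct is.
        return all(satisfiable(c) for c in children)
    if kind == "or":
        return any(satisfiable(c) for c in children)
    raise ValueError(kind)
```

Without decomposability this test would be unsound: two satisfiable conjuncts could still clash on a shared variable.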
On the compilability and expressive power of propositional planning formalisms
1998
Cited by 66 (10 self)

Abstract: The recent approaches of extending the GRAPHPLAN algorithm to handle more expressive planning formalisms raise the question of what the formal meaning of “expressive power” is. We formalize the intuition that expressive power is a measure of how concisely planning domains and plans can be expressed in a particular formalism by introducing the notion of “compilation schemes” between planning formalisms. Using this notion, we analyze the expressiveness of a large family of propositional planning formalisms, ranging from basic STRIPS to a formalism with conditional effects, partial state specifications, and propositional formulae in the preconditions. One of the results is that conditional effects cannot be compiled away if plan size should grow only linearly, but can be compiled away if we allow for polynomial growth of the resulting plans. This result confirms that the recently proposed extensions to the GRAPHPLAN algorithm concerning conditional effects are optimal with respect to the “compilability” framework. Another result is that general propositional formulae cannot be compiled into conditional effects if the plan size should be preserved linearly. This implies that allowing general propositional formulae in preconditions and effect conditions adds another level of difficulty in generating a plan.
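The tension the abstract describes can be made concrete with the naive compilation scheme: fully expanding an action's conditional effects preserves plan length exactly, but produces one unconditional action per subset of firing conditions, i.e., exponentially many actions. A toy sketch in a STRIPS-like encoding of our own (illustrative only, not the paper's formalism):

```python
from itertools import chain, combinations

def powerset(xs):
    """All subsets of xs, from the empty set to xs itself."""
    return chain.from_iterable(combinations(xs, r) for r in range(len(xs) + 1))

def expand(pre, uncond, cond_effects):
    """Compile conditional effects away by full expansion: one unconditional
    action per subset of conditional effects that fire.  Plan length is
    preserved (each original action maps to exactly one expanded action),
    but the number of actions is 2^n in the number of conditional effects --
    illustrating why a compilation with only linear size growth cannot
    exist in general.
    pre: set of precondition atoms; uncond: set of unconditional effects;
    cond_effects: list of (condition, effect) pairs."""
    actions = []
    for subset in map(set, powerset(cond_effects)):
        new_pre = set(pre)
        new_eff = set(uncond)
        for (c, e) in cond_effects:
            if (c, e) in subset:
                new_pre.add(c)           # the condition must hold...
                new_eff.add(e)           # ...so this effect fires
            else:
                new_pre.add("not " + c)  # the condition must fail
        actions.append((frozenset(new_pre), frozenset(new_eff)))
    return actions
```

The paper's positive result takes the other trade: polynomially many actions at the price of polynomially longer plans.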
Constraint propagation
Handbook of Constraint Programming, 2006
Cited by 51 (3 self)

Abstract: Constraint propagation is a form of inference, not search, and as such is more “satisfying”, both technically and aesthetically. —E.C. Freuder, 2005. Constraint reasoning involves various types of techniques to tackle the inherent ...
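A classic instance of the propagation techniques this chapter surveys is arc consistency. A minimal AC-3 sketch over finite domains, in an encoding of our own (not taken from the handbook):

```python
from collections import deque

def ac3(domains, constraints):
    """AC-3 arc consistency, a basic form of constraint propagation.
    domains: dict variable -> set of values (pruned in place).
    constraints: dict ordered pair (x, y) -> predicate(vx, vy) that is
    True when the pair of values is allowed.  Returns False iff some
    domain was wiped out (the problem is inconsistent)."""
    queue = deque(constraints)
    while queue:
        x, y = queue.popleft()
        allowed = constraints[(x, y)]
        # Values of x with no support in y's current domain.
        revised = {vx for vx in domains[x]
                   if not any(allowed(vx, vy) for vy in domains[y])}
        if revised:
            domains[x] -= revised
            # x's domain shrank: re-examine every arc pointing at x.
            queue.extend((z, w) for (z, w) in constraints if w == x)
    return all(domains.values())
```

For example, propagating x < y over domains {1, 2, 3} prunes 3 from x and 1 from y without any search.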
On the Tractable Counting of Theory Models and its Application to Truth Maintenance and Belief Revision
Journal of Applied Non-Classical Logics, 2000
Cited by 50 (17 self)

Abstract: We address the problem of counting the models of a propositional theory, under incremental changes to the theory. Specifically, we show that if a propositional theory Δ is in a special form that we call smooth, deterministic, decomposable negation normal form (sd-DNNF), then for any consistent set of literals S, we can simultaneously count, in time linear in the size of Δ, the models of: Δ ∪ S; Δ ∪ S ∪ {l}, for every literal l ∉ S; Δ ∪ S \ {l}, for every literal l ∈ S; and Δ ∪ S \ {l} ∪ {¬l}, for every literal l ∈ S.
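The linear-time claim rests on the three structural properties: smoothness and determinism make every OR node a disjoint sum over the same variables, and decomposability makes every AND node a product of independent counts. A hedged sketch with a hypothetical node encoding of our own, including conditioning on a partial assignment playing the role of the literal set S (this is not the paper's algorithm for all four counts at once, just the basic counting pass):

```python
def model_count(node, assignment=None):
    """Model count of a smooth, deterministic, decomposable NNF (sd-DNNF),
    optionally conditioned on a partial assignment (dict var -> bool).
    Smoothness: disjuncts of each OR mention the same variables.
    Determinism: disjuncts of each OR are pairwise inconsistent.
    Decomposability: conjuncts of each AND share no variables.
    Nodes: ("lit", var, sign), ("and", children), ("or", children)."""
    kind = node[0]
    if kind == "lit":
        var, sign = node[1], node[2]
        if assignment and var in assignment and assignment[var] != sign:
            return 0                 # literal contradicts the conditioning set
        return 1
    counts = [model_count(c, assignment) for c in node[1]]
    if kind == "and":
        result = 1
        for c in counts:
            result *= c              # decomposability: independent sub-counts
        return result
    if kind == "or":
        return sum(counts)           # determinism + smoothness: disjoint sums
    raise ValueError(kind)
```

For instance, the XOR of two variables, written as an sd-DNNF, has two models, and one model once a literal is asserted.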
Integrity and Change in Modular Ontologies
2003
Cited by 41 (13 self)

Abstract: The benefits of modular representations are well known from many areas of computer science. In this paper, we concentrate on the benefits of modular ontologies with respect to local containment of terminological reasoning. We define an architecture for modular ontologies that supports local reasoning by compiling implied subsumption relations.
Principles and Applications of Continual Computation
Artificial Intelligence, 2001
Cited by 40 (7 self)

Abstract: Automated problem solving is viewed typically as the allocation of computational resources to solve one or more problems passed to a reasoning system. In response to each problem received, effort is applied in real time to generate a solution, and problem solving ends when a solution is rendered. We examine continual computation, reasoning policies that capture a broader conception of problem solving by considering the proactive allocation of computational resources to potential future challenges. We explore policies for allocating idle time for several settings and present applications that highlight opportunities for harnessing continual computation in real-world tasks. Keywords: Bounded rationality; Decision-theoretic control; Metareasoning; Deliberation; Compilation; Speculative execution; Value of computation.
Preprocessing of Intractable Problems
Information and Computation, 1997
Cited by 32 (10 self)

Abstract: Some computationally hard problems (e.g., deduction in logical knowledge bases) are such that part of an instance is known well before the rest of it, and remains the same for several subsequent instances of the problem. In these cases, it is meaningful to preprocess offline this known part so as to simplify the remaining online problem. In this paper we investigate such a technique in the context of intractable, i.e., NP-hard, problems. Recent results in the literature show that not all NP-hard problems behave in the same way: for some of them preprocessing yields polynomial-time online simplified problems (we call them compilable), while for other ones there is strong evidence that this should not happen. Our primary goal is to provide a sound methodology that can be used either to prove or disprove that a problem is compilable. To this end, we define new models of computation, complexity classes, and reductions. We find complete problems for such classes, completeness meaning ...
A Perspective on Knowledge Compilation
In Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI), 2001
Cited by 28 (9 self)

Abstract: We provide a perspective on knowledge compilation which calls for analyzing different compilation approaches according to two key dimensions: the succinctness of the target compilation language, and the class of queries and transformations that the language supports in polytime. We argue that such analysis is necessary for placing new compilation approaches within the context of existing ones.
A New Method for Consequence Finding and Compilation in Restricted Languages
1999
Cited by 22 (4 self)

Abstract: SFK (skip-filtered, kernel) resolution is a new method for finding "interesting" consequences of a first-order clausal theory Σ, namely those in some restricted target language L_T. In its more restrictive form, SFK resolution corresponds to a relatively efficient SAT method, directional resolution; in its more general form, to a full prime implicate algorithm, namely Tison's. It generalizes both of them by offering much more flexible search, first-order completeness, and a much wider range of inferential capabilities. SFK resolution has many applications: computing "characteristic" clauses for task-specific languages in abduction, explanation and nonmonotonic reasoning (Inoue 1992); obtaining LUB approximations of the input theory (Selman and Kautz 1996) which are of polynomial size; incremental and lazy exact knowledge compilation (del Val 1994); and compilation into a tractable form for restricted target languages, independently of the tractability of inference in the given ...