Results 1 – 8 of 8
Tractable Reasoning via Approximation
 Artificial Intelligence
, 1995
Abstract

Cited by 104 (0 self)
Problems in logic are well-known to be hard to solve in the worst case. Two different strategies for dealing with this aspect are known from the literature: language restriction and theory approximation. In this paper we are concerned with the second strategy. Our main goal is to define a semantically well-founded logic for approximate reasoning, which is justifiable from the intuitive point of view, and to provide fast algorithms for dealing with it even when using expressive languages. We also want our logic to be useful for performing approximate reasoning in different contexts. We define a method for the approximation of decision reasoning problems based on multivalued logics. Our work expands and generalizes in several directions ideas presented by other researchers. The major features of our technique are: 1) approximate answers give semantically clear information about the problem at hand; 2) approximate answers are easier to compute than answers to the original problem; 3) approxim...
A Survey on Complexity Results for Nonmonotonic Logics
 Journal of Logic Programming
, 1993
Abstract

Cited by 85 (6 self)
This paper surveys the main results that have appeared in the literature on the computational complexity of nonmonotonic inference tasks. We not only give results about the tractability/intractability of the individual problems but also analyze sources of complexity and explain intuitively the nature of easy/hard cases. We focus mainly on nonmonotonic formalisms, like default logic, autoepistemic logic, circumscription, closed-world reasoning, and abduction, whose relations with logic programming are clear and well studied. Complexity as well as recursion-theoretic results are surveyed. Work partially supported by the ESPRIT Basic Research Action COMPULOG and the Progetto Finalizzato Informatica of the CNR (Italian Research Council). The first author is supported by a CNR scholarship.

1 Introduction

Nonmonotonic logics and negation as failure in logic programming have been defined with the goal of providing formal tools for the representation of default information. One of the ideas und...
Do Computers Need Common Sense?
 Proceedings of the Fifth International Conference on Knowledge Representation
, 1996
Abstract

Cited by 16 (0 self)
My aim in this paper is to make and defend three claims. First, it is incumbent on the knowledge representation and nonmonotonic communities to demonstrate that their ideas will eventually lead to improvements in the performance of implemented systems. Second, a reasonable working definition of "commonsense" reasoning is that it is the process of using polynomial techniques to convert a large instance of an NP-hard problem to a smaller instance on which search techniques can be applied effectively. And finally, it is a consequence of these first two claims that the most pressing problem facing the commonsense community is the identification of realistic problems and problem structures for which commonsense reductions are both necessary and effective.

1 INTRODUCTION

One might study formal aspects of knowledge representation for at least two reasons. In the first case, the elegance of the associated theories might itself be compelling; formal theories of reasoning might attract interes...
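The "commonsense reduction" this abstract defines can be illustrated with a toy sketch: a polynomial-time simplification shrinks a SAT instance before exponential search is applied. The choice of reduction here (pure-literal elimination) and all names are illustrative assumptions, not drawn from the paper.

```python
# Toy sketch of a "commonsense reduction": polynomial preprocessing
# (pure-literal elimination, chosen only as an example) shrinks an
# NP-hard SAT instance before exponential search runs on the remainder.
from itertools import product

def pure_literal_eliminate(clauses):
    """Remove clauses satisfied by pure literals (literals whose negation
    never occurs). Polynomial time; only ever shrinks the instance."""
    changed = True
    while changed:
        changed = False
        lits = {l for c in clauses for l in c}
        pures = {l for l in lits if -l not in lits}
        if pures:
            new = [c for c in clauses if not (c & pures)]
            if len(new) < len(clauses):
                clauses, changed = new, True
    return clauses

def brute_force_sat(clauses):
    """Exponential search, applied only to the reduced instance.
    Literals are signed ints: +v is the variable, -v its negation."""
    vars_ = sorted({abs(l) for c in clauses for l in c})
    for bits in product([False, True], repeat=len(vars_)):
        model = dict(zip(vars_, bits))
        if all(any(model[abs(l)] == (l > 0) for l in c) for c in clauses):
            return True
    return False

clauses = [{1, 2}, {-2, 3}, {4}, {-4, 5}]
reduced = pure_literal_eliminate(clauses)   # here the reduction empties the instance
sat = brute_force_sat(reduced)              # search then has trivial work left
```

On this small instance the polynomial pass happens to solve everything; in the paper's terms, the interesting cases are those where it leaves a much smaller residue for search.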
A Non-Deterministic Semantics for Tractable Inference
, 1998
Abstract

Cited by 7 (1 self)
Unit resolution is arguably the most useful known algorithm for tractable reasoning in propositional logic. Intuitively, if one knows a, b, and a ∧ b ⊃ c, then c should be an obvious implication. However, devising a tractable semantics that allows unit resolution has proven to be an elusive goal. We propose a 3-valued semantics for a tractable fragment of propositional logic that is inherently non-deterministic: the denotation of a formula is not uniquely determined by the denotation of the variables it contains. We show that this semantics yields a tractable, sound, and complete decision procedure. We generalize this semantics to a family of semantics, tied to Dalal's notion of intricacy, of increasing deductive power and computational complexity.
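The unit-resolution rule the abstract appeals to can be sketched as plain unit propagation over clause sets. This is a generic illustration of the rule, not the paper's 3-valued semantics or its decision procedure; the signed-integer literal encoding is an assumption.

```python
# Unit propagation: a one-literal clause forces its literal true; satisfied
# clauses are dropped and falsified literals removed, until fixpoint.
# Literals are signed ints: +v is variable v, -v its negation.

def unit_propagate(clauses):
    """Return (derived_literals, remaining_clauses); derived is None on
    conflict (an empty clause or complementary units)."""
    clauses = [set(c) for c in clauses]
    derived = set()
    while True:
        unit = next((next(iter(c)) for c in clauses if len(c) == 1), None)
        if unit is None:
            return derived, clauses
        if -unit in derived:
            return None, clauses              # complementary units: conflict
        derived.add(unit)
        new = []
        for c in clauses:
            if unit in c:
                continue                      # clause satisfied, drop it
            c = c - {-unit}
            if not c:
                return None, clauses          # empty clause: conflict
            new.append(c)
        clauses = new

# The abstract's example: a, b, and a ∧ b ⊃ c, i.e. clause {-a, -b, c}.
lits, rest = unit_propagate([{1}, {2}, {-1, -2, 3}])
# lits == {1, 2, 3}: c falls out as an "obvious" consequence.
```

Each pass does linear work and strictly shrinks the clause set, which is the source of the tractability the abstract refers to.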
A Comparison of Two Approaches to Splitting Default Theories
 In AAAI/IAAI
, 1997
Abstract

Cited by 2 (0 self)
Default logic is computationally expensive. One of the most promising ways of easing this problem and developing powerful implementations is to split a default theory into smaller parts and compute extensions in a modular, "local" way. This paper compares two recent approaches, Turner's splitting and Cholewinski's stratification. It shows that the approaches are closely related; in fact, the former can be viewed as a special case of the latter.

1 Introduction

Default logic (Reiter 1980) is one of the most prominent approaches to nonmonotonic reasoning, since it provides a formal theory of reasoning based on default rules. One of the main problems with its applicability is that it is computationally harder than classical logic (Marek and Truszczynski 1993, Gottlob 1992), which makes the implementation of powerful systems difficult. A possible solution to this problem might be to split the available knowledge into smaller parts, and to apply default reasoning in a local way. This idea...
1A Other Organizations Involved as Partners NONE 1B Other Collaborators or Contacts
Abstract
Our objective was to study a novel, approximation-based approach to tractable nonmonotonic reasoning. Despite the critical role of nonmonotonicity in most human reasoning tasks, current nonmonotonic reasoning formalisms are inherently undecidable in the general case, and are intractable in all but the most restrictive cases. To address this intractability, we combined two known, weak techniques, context-limited reasoning and fast, incomplete consistency testing, to develop a powerful, tractable approximation mechanism. Neither of these techniques, alone, suffices. Since consistency is undecidable in the first-order case, context-limited reasoning does not, by itself, guarantee tractability. Furthermore, known fast, incomplete consistency tests generally fail in realistically complex knowledge bases. The goal of this research was to show that the combination of the two techniques produces a synergism that yields a tractable approximate nonmonotonic reasoning mechanism that overcomes the limitations of either technique alone. We studied the conditions under which this approach gives reasonable results and...