Results 1–10 of 28
Nonmonotonic Reasoning, Conditional Objects and Possibility Theory
 Artificial Intelligence, 1997
Abstract

Cited by 68 (17 self)
This short paper relates the conditional-object-based and possibility-theory-based approaches for reasoning with conditional statements pervaded with exceptions to other methods in nonmonotonic reasoning which have been independently proposed: namely, Lehmann's preferential and rational closure entailments, which obey normative postulates, the infinitesimal probability approach, and the conditional (modal) logics-based approach. All these methods are shown to be equivalent with respect to their capabilities for reasoning with conditional knowledge, although they are based on different modeling frameworks. The paper thus provides a unified understanding of nonmonotonic consequence relations. More particularly, conditional objects, a purely qualitative counterpart to conditional probabilities, offer a very simple semantics, based on a 3-valued calculus, for the preferential entailment, while in the purely ordinal setting of possibility theory both the preferential and the rational closure entai...
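The 3-valued calculus mentioned in the abstract can be made concrete with a small sketch. The encoding below is ours, not the paper's: a conditional object q|p is read as true in a world where both antecedent and consequent hold, false where the antecedent holds but the consequent fails, and inapplicable (here `None`) where the antecedent is false.

```python
# Minimal sketch (illustrative encoding, not the paper's formalism) of the
# 3-valued reading of a conditional object "q|p": True when p and q both
# hold, False when p holds but q fails, None ("inapplicable") when the
# antecedent p is false.

def conditional(p, q):
    """Return an evaluator for the conditional object q|p.

    p, q: predicates mapping a world (dict of atoms) to bool.
    Result: world -> True | False | None.
    """
    def evaluate(world):
        if not p(world):
            return None          # rule does not apply in this world
        return q(world)          # True (example) or False (counterexample)
    return evaluate

# "Birds fly" as a conditional object fly|bird, over toy worlds.
bird_flies = conditional(lambda w: w["bird"], lambda w: w["flies"])

print(bird_flies({"bird": True,  "flies": True}))   # True
print(bird_flies({"bird": True,  "flies": False}))  # False
print(bird_flies({"bird": False, "flies": False}))  # None
```

The third truth value is what distinguishes a conditional object from the material implication, which would count every non-bird world as confirming the rule.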
Probabilistic Default Reasoning with Conditional Constraints
 Ann. Math. Artif. Intell., 2000
Abstract

Cited by 35 (20 self)
We present an approach to reasoning from statistical and subjective knowledge, which is based on a combination of probabilistic reasoning from conditional constraints with approaches to default reasoning from conditional knowledge bases. More precisely, we introduce the notions of z-, lexicographic, and conditional entailment for conditional constraints, which are probabilistic generalizations of Pearl's entailment in system Z, Lehmann's lexicographic entailment, and Geffner's conditional entailment, respectively. We show that the new formalisms have nice properties. In particular, they show a similar behavior as reference-class reasoning in a number of uncontroversial examples. The new formalisms, however, also avoid many drawbacks of reference-class reasoning. More precisely, they can handle complex scenarios and even purely probabilistic subjective knowledge as input. Moreover, conclusions are drawn in a global way from all the available knowledge as a whole. We then show that the new formalisms also have nice general nonmonotonic properties. In detail, the new notions of z-, lexicographic, and conditional entailment have similar properties as their classical counterparts. In particular, they all satisfy the rationality postulates proposed by Kraus, Lehmann, and Magidor, and they have some general irrelevance and direct inference properties. Moreover, the new notions of z- and lexicographic entailment satisfy the property of rational monotonicity. Furthermore, the new notions of z-, lexicographic, and conditional entailment are proper generalizations of both their classical counterparts and the classical notion of logical entailment for conditional constraints. Finally, we provide algorithms for reasoning under the new formalisms, and we analyze their computational com...
Qualitative decision theory: from Savage’s axioms to nonmonotonic reasoning
 Journal of the ACM, 2002
Abstract

Cited by 32 (0 self)
This paper investigates to what extent a purely symbolic approach to decision making under uncertainty is possible, within the scope of Artificial Intelligence. Contrary to classical approaches to decision theory, we try to rank acts without resorting to any numerical representation of utility or uncertainty, and without using any scale on which both uncertainty and preference could be mapped. Our approach is a variant of Savage's where the setting is finite and the strict preference on acts is a partial order. It is shown that although many axioms of Savage's theory are preserved, and despite the intuitive appeal of the ordinal method for constructing a preference over acts, the approach is inconsistent with a probabilistic representation of uncertainty. The latter leads to the kind of paradoxes encountered in the theory of voting. It is shown that the assumption of ordinal invariance enforces a qualitative decision procedure that presupposes a comparative possibility representation of uncertainty, originally due to Lewis and usual in nonmonotonic reasoning. Our axiomatic investigation thus provides decision-theoretic foundations to the preferential inference of Lehmann and colleagues. However, the obtained decision rules are sometimes either not very decisive or may lead to overconfident decisions, although their basic principles look sound. This paper points out some limitations of purely ordinal approaches to Savage-like decision making under uncertainty, in perfect analogy with similar difficulties in voting theory.
What Are Fuzzy Rules and How to Use Them
 Fuzzy Sets and Systems, 1996
Abstract

Cited by 31 (12 self)
Fuzzy rules have been advocated as a key tool for expressing pieces of knowledge in "fuzzy logic". However, there does not exist a unique kind of fuzzy rules, nor is there only one type of "fuzzy logic". This diversity has caused many a misunderstanding in the literature of fuzzy control. The paper is a survey of different possible semantics for a fuzzy rule and shows how they can be captured in the framework of fuzzy set and possibility theory. It is pointed out that the interpretation of fuzzy rules dictates the way the fuzzy rules should be combined. The various kinds of fuzzy rules considered in the paper (gradual rules, certainty rules, possibility rules, and others) have different inference behaviors and correspond to various intended uses and applications. The representation of fuzzy unless-rules is briefly investigated on the basis of their intended meaning. The problem of defining and checking the coherence of a block of parallel fuzzy rules is also briefly addressed. This iss...
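The difference in inference behavior between rule types can be illustrated with a small sketch. The certainty-rule/possibility-rule contrast follows the standard possibilistic reading (certainty rules via the Kleene-Dienes implication, combined conjunctively; possibility rules via the minimum, combined disjunctively), but the membership functions and the toy fan-control example are our own illustrative assumptions:

```python
# Hedged sketch (toy example, not from the paper) contrasting two semantics
# of a fuzzy rule "if x is A then y is B":
#  - certainty rule:   output constrained from above by the Kleene-Dienes
#                      implication max(1 - mu_A(x), mu_B(y));
#                      parallel rules combine by min (conjunctively).
#  - possibility rule: output guaranteed at least min(mu_A(x), mu_B(y));
#                      parallel rules combine by max (Mamdani-style).

def certainty_rule(mu_a, mu_b):
    return lambda x, y: max(1.0 - mu_a(x), mu_b(y))

def possibility_rule(mu_a, mu_b):
    return lambda x, y: min(mu_a(x), mu_b(y))

def combine_certainty(rules, x, y):    # conjunctive combination
    return min(r(x, y) for r in rules)

def combine_possibility(rules, x, y):  # disjunctive combination
    return max(r(x, y) for r in rules)

def tri(a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    def mu(v):
        if v <= a or v >= c:
            return 0.0
        return (v - a) / (b - a) if v <= b else (c - v) / (c - b)
    return mu

# Toy rule base: "warm => high fan speed", "cool => low fan speed".
warm, cool = tri(15, 25, 35), tri(0, 10, 20)
low_fan, high_fan = tri(0, 2, 4), tri(3, 6, 9)

cert = [certainty_rule(warm, high_fan), certainty_rule(cool, low_fan)]
poss = [possibility_rule(warm, high_fan), possibility_rule(cool, low_fan)]

x = 22.0  # observed temperature
print(combine_certainty(cert, x, 6.0))   # upper bound on possibility of y = 6
print(combine_possibility(poss, x, 6.0)) # lower bound on possibility of y = 6
```

The two readings are not interchangeable: certainty rules yield an upper constraint (what is ruled out), possibility rules a lower guarantee (what is positively supported), which is why they must be combined with min and max respectively.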
A systematic approach to the assessment of fuzzy association rules
 Data Mining and Knowledge Discovery
, 2006
Abstract

Cited by 30 (6 self)
In order to allow for the analysis of data sets including numerical attributes, several generalizations of association rule mining based on fuzzy sets have been proposed in the literature. While the formal specification of fuzzy associations is more or less straightforward, the assessment of such rules by means of appropriate quality measures is less obvious. Particularly, it assumes an understanding of the semantic meaning of a fuzzy rule. This aspect has been ignored by most existing proposals, which must therefore be considered as ad hoc to some extent. In this paper, we develop a systematic approach to the assessment of fuzzy association rules. To this end, we proceed from the idea of partitioning the data stored in a database into examples of a given rule, counterexamples, and irrelevant data. Evaluation measures are then derived from the cardinalities of the corresponding subsets. The problem of finding a proper partition has a rather obvious solution for standard association rules but becomes less trivial in the fuzzy case. Our results not only provide a sound justification for commonly used measures but also suggest a means for constructing meaningful alternatives.
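The partitioning idea can be sketched as follows. The choice of t-norm (minimum), negation (1 − μ), membership functions, and toy data are our illustrative assumptions, not the paper's exact measures:

```python
# Illustrative sketch (assumed operators and data, not the paper's exact
# proposal): partition a data set w.r.t. a fuzzy rule A => B into fuzzy
# sets of examples, counterexamples, and irrelevant records, then derive
# support and confidence from the resulting sigma-counts.

def assess(records, mu_a, mu_b):
    examples = counterexamples = irrelevant = 0.0
    for r in records:
        a, b = mu_a(r), mu_b(r)
        examples        += min(a, b)        # rule applies and is confirmed
        counterexamples += min(a, 1.0 - b)  # rule applies and is violated
        irrelevant      += 1.0 - a          # antecedent does not apply
    support = examples / len(records)
    applicable = examples + counterexamples
    confidence = examples / applicable if applicable > 0 else 1.0
    return support, confidence

# Toy data: (income, spending); rule "high income => high spending".
high = lambda v: min(max((v - 40) / 40, 0.0), 1.0)  # ramp from 40 to 80
data = [(90, 85), (70, 60), (50, 20), (30, 70)]

sup, conf = assess(data, lambda r: high(r[0]), lambda r: high(r[1]))
print(round(sup, 3), round(conf, 3))
```

In the crisp special case (memberships in {0, 1}) the three sigma-counts reduce to ordinary cardinalities and the measures collapse to standard support and confidence, which is the sanity check the partition-based view is meant to pass.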
Representing partial ignorance
 IEEE Trans. on Systems, Man and Cybernetics, 1996
Abstract

Cited by 28 (9 self)
Ignorance is precious, for once lost it can never be regained. This paper advocates the use of non-purely-probabilistic approaches to higher-order uncertainty. One of the major arguments of Bayesian probability proponents is that representing uncertainty is always decision-driven and, as a consequence, uncertainty should be represented by probability. Here we argue that representing partial ignorance is not always decision-driven. Other reasoning tasks, such as belief revision for instance, are more naturally carried out at the purely cognitive level. Conceiving knowledge representation and decision-making as separate concerns opens the way to non-purely-probabilistic representations of incomplete knowledge. It is pointed out that within a numerical framework, two numbers are needed to account for partial ignorance about events, because on top of truth and falsity, the state of total ignorance must be encoded independently of the number of underlying alternatives. The paper also points out that it is consistent to accept a Bayesian view of decision-making and a non-Bayesian view of knowledge representation, because it is possible to map non-probabilistic degrees of belief to betting probabilities when needed. Conditioning rules in non-Bayesian settings are reviewed, ...
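The claim that two numbers are needed can be illustrated in the possibility-theoretic setting, one of the frameworks surveyed in the literature on partial ignorance; the encoding below is a minimal sketch of our own:

```python
# Minimal sketch (our framework choice, not the paper's): with a
# possibility distribution pi over alternatives, an event E gets two
# numbers: a possibility Pi(E) = max of pi over E, and a necessity
# N(E) = 1 - Pi(complement of E). Under total ignorance (pi = 1
# everywhere), every non-trivial event gets the pair (N, Pi) = (0, 1),
# regardless of how many alternatives there are -- something a single
# probability distribution cannot express.

def possibility(pi, event):
    return max((p for w, p in pi.items() if w in event), default=0.0)

def necessity(pi, event):
    complement = set(pi) - set(event)
    return 1.0 - possibility(pi, complement)

# Total ignorance over three alternatives:
pi = {"a": 1.0, "b": 1.0, "c": 1.0}
print(necessity(pi, {"a"}), possibility(pi, {"a"}))  # 0.0 1.0

# Partial knowledge: "a" fully possible, "c" nearly ruled out.
pi = {"a": 1.0, "b": 0.6, "c": 0.1}
print(necessity(pi, {"a", "b"}), possibility(pi, {"a", "b"}))
```

Note that a uniform probability distribution cannot play the same role: its event probabilities change with the number of alternatives, whereas the ignorance pair (0, 1) above does not.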
Probabilistic Logic under Coherence, ModelTheoretic Probabilistic Logic, and Default Reasoning
 Journal of Applied Non-Classical Logics
Abstract

Cited by 22 (9 self)
We study probabilistic logic under the viewpoint of the coherence principle of de Finetti. In detail, we explore the relationship between coherence-based and model-theoretic probabilistic logic. Interestingly, we show that the notions of g-coherence and of g-coherent entailment can be expressed by combining notions in model-theoretic probabilistic logic with concepts from default reasoning. Crucially, we even show that probabilistic reasoning under coherence is a probabilistic generalization of default reasoning in system P. That is, we provide a new probabilistic semantics for system P, which is neither based on infinitesimal probabilities nor on atomic-bound (also called big-stepped) probabilities. These results also give new insight into default reasoning with conditional objects.
Default Reasoning from Conditional Knowledge Bases: Complexity and Tractable Cases
 Artificial Intelligence, 2000
Abstract

Cited by 21 (13 self)
Conditional knowledge bases have been proposed as belief bases that include defeasible rules (also called defaults) of the form "φ → ψ", which informally read as "generally, if φ then ψ." Such rules may have exceptions, which can be handled in different ways. A number of entailment semantics for conditional knowledge bases have been proposed in the literature. However, while the semantic properties and interrelationships of these formalisms are quite well understood, only partial results are known so far about their computational properties. In this paper, we fill these gaps and first draw a precise picture of the complexity of default reasoning from conditional knowledge bases: given a conditional knowledge base KB and a default φ → ψ, does KB entail φ → ψ? We classify the complexity of this problem for a number of well-known approaches (including Goldszmidt et al.'s maximum entropy approach and Geffner's conditional entailment), where we consider the general propositional case as well as natural syntactic restrictions (in particular, to Horn and literal-Horn conditional knowledge bases). As we show, the more sophisticated semantics for conditional knowledge bases are plagued with intractability in all these fragments. We thus explore cases in which these semantics are tractable, and find that most of them enjoy this property on feedback-free Horn conditional knowledge bases, which constitute a new, meaningful class of conditional knowledge bases. Furthermore, we generalize previous tractability results from Horn to q-Horn conditional knowledge bases, which allow for a limited use of disjunction. Our results complement and extend previous results, and contribute in refining the tractability/intractability frontier of default reasoning from conditional know...
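For intuition about one of the entailment semantics involved, here is a brute-force sketch of Pearl's System Z ranking over a toy propositional language. The encoding of defaults as predicate pairs and the penguin example are our own illustrative choices, not an algorithm from the paper:

```python
# Brute-force sketch (toy encoding, not the paper's algorithms) of Pearl's
# System Z ranking: a default phi -> psi is *tolerated* by a set D if some
# world verifies phi and psi while satisfying the material counterpart of
# every default in D. Rank 0 holds the defaults tolerated by the whole
# base; they are removed and the process repeats on the rest.

from itertools import product

ATOMS = ["bird", "penguin", "flies"]

def worlds():
    for values in product([False, True], repeat=len(ATOMS)):
        yield dict(zip(ATOMS, values))

def tolerated(default, others):
    ante, cons = default
    return any(
        ante(w) and cons(w)
        and all((not a(w)) or c(w) for a, c in others)
        for w in worlds()
    )

def z_ranking(defaults):
    ranks, remaining, rank = {}, list(defaults), 0
    while remaining:
        layer = [d for d in remaining
                 if tolerated(d, [e for e in remaining if e is not d])]
        if not layer:
            raise ValueError("knowledge base is inconsistent")
        for d in layer:
            ranks[defaults.index(d)] = rank
        remaining = [d for d in remaining if d not in layer]
        rank += 1
    return ranks

kb = [
    (lambda w: w["bird"],    lambda w: w["flies"]),      # birds fly
    (lambda w: w["penguin"], lambda w: w["bird"]),       # penguins are birds
    (lambda w: w["penguin"], lambda w: not w["flies"]),  # penguins don't fly
]
print(z_ranking(kb))  # the penguin defaults outrank "birds fly"
```

The more exceptional defaults end up with higher rank, so a penguin's not flying overrides the generic bird rule; this enumeration over all worlds is exponential, which is exactly the kind of cost the complexity analysis above addresses.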
Weak nonmonotonic probabilistic logics
Abstract

Cited by 21 (6 self)
Towards probabilistic formalisms for resolving local inconsistencies under model-theoretic probabilistic entailment, we present probabilistic generalizations of Pearl's entailment in System Z and Lehmann's lexicographic entailment. We then analyze the nonmonotonic and semantic properties of the new notions of entailment. In particular, we show that they satisfy the rationality postulates of System P and the property of Rational Monotonicity. Moreover, we show that model-theoretic probabilistic entailment is stronger than the new notion of lexicographic entailment, which in turn is stronger than the new notion of entailment in System Z. As an important feature of the new notions of entailment in System Z and lexicographic entailment, we show that they coincide with model-theoretic probabilistic entailment whenever there are no local inconsistencies. We also show that the new notions of entailment in System Z and lexicographic entailment are proper generalizations of their classical counterparts. Finally, we present algorithms for reasoning under the new formalisms, and we give a precise picture of their computational complexity.
Focusing vs. belief revision: A fundamental distinction when dealing with generic knowledge
 Proceedings 5th International Joint Conference on Qualitative and Quantitative Practical Reasoning, number 1244 in Lecture Notes in Artificial Intelligence
, 1997
Abstract

Cited by 11 (3 self)
This paper advocates a basic distinction between two epistemic operations called focusing and revision, which can be defined in any representation framework, symbolic or numerical, that is rich enough to acknowledge the difference between factual evidence and generic knowledge. Revision amounts to modifying the generic knowledge when receiving new pieces of generic knowledge (or the factual evidence when obtaining more factual information), while focusing is just applying the generic knowledge to the reference class of situations which exactly corresponds to all the available evidence gathered on the case under consideration. Various settings are considered: upper and lower probabilities, belief functions, numerical possibility measures, ordinal possibility measures, conditional objects, and nonmonotonic consequence relations.
1 Introduction. Some basic modes of belief change have been laid bare by Levi (1980): an expansion corresponds to adding the new piece of information withou...