Results 1–10 of 11
Two views of belief: Belief as generalized probability and belief as evidence, 1992
Cited by 72 (12 self)

Abstract: Belief functions are mathematical objects defined to satisfy three axioms that look somewhat similar to the Kolmogorov axioms defining probability functions. We argue that there are (at least) two useful and quite different ways of understanding belief functions. The first is as a generalized probability function (which technically corresponds to the inner measure induced by a probability function). The second is as a way of representing evidence. Evidence, in turn, can be understood as a mapping from probability functions to probability functions. It makes sense to think of updating a belief if we think of it as a generalized probability. On the other hand, it makes sense to combine two beliefs (using, say, Dempster's rule of combination) only if we think of the belief functions as representing evidence. Many previous papers have pointed out problems with the belief function approach; the claim of this paper is that these problems can be explained as a consequence of confounding the...
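Dempster's rule of combination, mentioned in the abstract above, can be sketched in a few lines: intersect pairs of focal elements, multiply their masses, and renormalize by the non-conflicting mass. A minimal sketch, assuming focal elements are represented as frozensets and mass functions as dicts; the two mass functions in the example are hypothetical:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts: frozenset -> mass) by Dempster's
    rule: intersect focal elements, multiply masses, and renormalize by
    the total non-conflicting mass."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        c = a & b
        if c:
            combined[c] = combined.get(c, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("sources are totally conflicting")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Hypothetical two-hypothesis frame {x, y}.
X, Y, XY = frozenset("x"), frozenset("y"), frozenset("xy")
m1 = {X: 0.6, XY: 0.4}   # evidence mildly favoring x
m2 = {Y: 0.3, XY: 0.7}   # evidence mildly favoring y
m12 = dempster_combine(m1, m2)
```

The conflicting mass m1({x})·m2({y}) = 0.18 is discarded and the remainder rescaled, which is precisely the step that only makes sense under the "belief as evidence" reading discussed above.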
Updating Probabilities, 2002
Cited by 53 (6 self)

Abstract: As examples such as the Monty Hall puzzle show, applying conditioning to update a probability distribution on a "naive space", which does not take into account the protocol used, can often lead to counterintuitive results. Here we examine why. A criterion known as CAR ("coarsening at random") in the statistical literature characterizes when "naive" conditioning in a naive space works. We show that the CAR condition holds rather infrequently, and we provide a procedural characterization of it, by giving a randomized algorithm that generates all and only distributions for which CAR holds. This substantially extends previous characterizations of CAR. We also consider more generalized notions of update such as Jeffrey conditioning and minimizing relative entropy (MRE). We give a generalization of the CAR condition that characterizes when Jeffrey conditioning leads to appropriate answers, and show that there exist some very simple settings in which MRE essentially never gives the right results. This generalizes and interconnects previous results obtained in the literature on CAR and MRE.
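The Monty Hall contrast between a naive space and a protocol-aware space can be made concrete by enumeration: in the refined space that records which door the host opens (and his tie-breaking rule), conditioning gives 2/3 for switching, while naive conditioning on the coarse event "the car is not behind the opened door" gives 1/2. A small sketch; the door numbering and uniform tie-breaking are our own conventions:

```python
from fractions import Fraction

# Car uniformly behind doors 1-3; contestant picks door 1; the host opens
# a goat door other than door 1, choosing uniformly when he has a choice.
# The refined space tracks (car, opened) pairs with protocol probabilities.
refined = {}
for car in (1, 2, 3):
    prior = Fraction(1, 3)
    options = [d for d in (2, 3) if d != car]   # doors the host may open
    for opened in options:
        refined[(car, opened)] = refined.get((car, opened), Fraction(0)) + prior / len(options)

# Condition on the observation "the host opened door 3" in the refined space.
obs = {w: p for w, p in refined.items() if w[1] == 3}
total = sum(obs.values())
p_car1 = sum(p for (car, _), p in obs.items() if car == 1) / total
p_car2 = sum(p for (car, _), p in obs.items() if car == 2) / total

# "Naive" conditioning on the coarse event "car is not behind door 3"
# ignores the protocol and treats doors 1 and 2 symmetrically.
naive = {c: Fraction(1, 3) for c in (1, 2, 3)}
p_naive_car1 = naive[1] / (naive[1] + naive[2])
```

Here p_car1 = 1/3 and p_car2 = 2/3, whereas the naive calculation yields 1/2: the coarsening from (car, opened) down to "car is not behind door 3" is exactly where the CAR condition fails.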
A New Approach to Updating Beliefs. Uncertainty in Artificial Intelligence, 1991
Cited by 47 (6 self)

Abstract: We define a new notion of conditional belief, which plays the same role for Dempster-Shafer belief functions as conditional probability does for probability functions. Our definition is different from the standard definition given by Dempster, and avoids many of the well-known problems of that definition. Just as the conditional probability Pr(·|B) is a probability function which is the result of conditioning on B being true, so too our conditional belief function Bel(·|B) is a belief function which is the result of conditioning on B being true. We define the conditional belief as the lower envelope (that is, the inf) of a family of conditional probability functions, and provide a closed-form expression for it. An alternate way of understanding our definition of conditional belief is provided by considering ideas from an earlier paper [FH91], where we connect belief functions with inner measures. In particular, we show here how to extend the definition of conditional pro...
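The closed-form expression for this lower-envelope conditional belief is commonly stated as Bel(A|B) = Bel(A∩B) / (Bel(A∩B) + Pl(Ā∩B)), where Pl is the plausibility function. A minimal sketch under that assumption; the mass function and events in the example are hypothetical:

```python
def bel(m, event):
    """Belief of an event: total mass of focal elements contained in it."""
    return sum(w for s, w in m.items() if s <= event)

def pl(m, event):
    """Plausibility: total mass of focal elements intersecting the event."""
    return sum(w for s, w in m.items() if s & event)

def conditional_bel(m, a, b):
    """Lower-envelope conditional belief Bel(A|B), via the closed form
    Bel(A∩B) / (Bel(A∩B) + Pl(A^c∩B)); assumes the denominator is nonzero."""
    frame = frozenset().union(*m)          # frame of discernment
    num = bel(m, a & b)
    return num / (num + pl(m, (frame - a) & b))

# Hypothetical example on the frame {a, b, c}.
A, B = frozenset({"a"}), frozenset({"a", "b"})
m = {frozenset({"a"}): 0.5, frozenset({"a", "b", "c"}): 0.5}
bel_a_given_b = conditional_bel(m, A, B)
```

In this example Bel({a}|{a,b}) = 0.5 / (0.5 + 0.5) = 0.5: the mass on the whole frame keeps {b} fully plausible inside B, so conditioning does not inflate belief in {a} the way Dempster's conditioning would.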
Uncertainty, Belief, and Probability. Computational Intelligence, 1989
Cited by 45 (2 self)

Abstract: We introduce a new probabilistic approach to dealing with uncertainty, based on the observation that probability theory does not require that every event be assigned a probability. For a nonmeasurable event (one to which we do not assign a probability), we can talk about only the inner measure and outer measure of the event. In addition to removing the requirement that every event be assigned a probability, our approach circumvents other criticisms of probability-based approaches to uncertainty. For example, the measure of belief in an event turns out to be represented by an interval (defined by the inner and outer measure), rather than by a single number. Further, this approach allows us to assign a belief (inner measure) to an event E without committing to a belief about its negation ¬E (since the inner measure of an event plus the inner measure of its negation is not necessarily one). Interestingly enough, inner measures induced by probability measures turn out to correspond in a ...
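On a finite space the inner and outer measures are easy to compute: fix a probability on the algebra generated by a partition, take the inner measure of an event as the probability of its largest measurable subset, and the outer measure as that of its smallest measurable superset. A small sketch with a hypothetical partition and weights:

```python
from itertools import combinations

def algebra_from_partition(blocks):
    """All unions of partition blocks: the measurable sets."""
    sets = []
    for r in range(len(blocks) + 1):
        for combo in combinations(blocks, r):
            sets.append(frozenset().union(*combo) if combo else frozenset())
    return sets

def inner_outer(event, blocks, p):
    """Inner/outer measure of `event` for a probability `p` defined on the
    algebra generated by `blocks` (p maps each block to its mass)."""
    def measure(s):                        # s is a union of blocks
        return sum(p[b] for b in blocks if b <= s)
    measurable = algebra_from_partition(blocks)
    inner = max(measure(s) for s in measurable if s <= event)
    outer = min(measure(s) for s in measurable if event <= s)
    return inner, outer

blocks = [frozenset({1}), frozenset({2, 3}), frozenset({4})]
p = {blocks[0]: 0.2, blocks[1]: 0.5, blocks[2]: 0.3}
E = frozenset({1, 2})                      # not measurable in this algebra
lo, hi = inner_outer(E, blocks, p)         # belief interval for E
lo_c, hi_c = inner_outer(frozenset({3, 4}), blocks, p)   # for ¬E
```

Here the interval for E is [0.2, 0.7] and for ¬E it is [0.3, 0.8]; the inner measures sum to 0.5 < 1, illustrating the abstract's point that belief in E carries no commitment about ¬E.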
Reasoning with Belief Functions: An Analysis of Compatibility. International Journal of Approximate Reasoning, 1990
Cited by 27 (0 self)

Abstract: This paper examines the applicability of belief-function methodology in three reasoning tasks: (1) representation of incomplete knowledge, (2) belief updating, and (3) evidence pooling. We find that belief functions have difficulties representing incomplete knowledge, primarily knowledge expressed in conditional sentences. In this context, we also show that the prevailing practices of encoding if-then rules as belief-function expressions are inadequate, as they lead to counterintuitive conclusions under chaining, contraposition, and reasoning by cases. Next, we examine the role of belief functions in updating states of belief and find that, if partial knowledge is encoded and updated by belief-function methods, the updating process violates basic patterns of plausibility and the resulting beliefs cannot serve as a basis for rational decisions. Finally, assessing their role in evidence pooling, we find that belief functions offer a rich language for describing the evidence gathered, highly compatible with the way people summarize observations. However, the methods available for integrating evidence into a coherent state of belief ca...
Ploxoma: Testbed for Uncertain Inference, 1994
Abstract: This paper compares two formalisms for uncertain inference, Kyburg's Combinatorial Semantics and Dempster-Shafer belief function theory, on the basis of an example from the domain of medical diagnosis. I review Shafer's example about the imaginary disease ploxoma and show how it would be represented in Combinatorial Semantics. I conclude that belief function theory has a qualitative advantage because it offers greater flexibility of expression, and provides results about more specific classes of patients. Nevertheless, a quantitative comparison reveals that the inferences sanctioned by Combinatorial Semantics are more reliable than those of belief function theory.

5.1 Introduction. This paper compares Kyburg's Combinatorial Semantics (CS) and Dempster-Shafer belief function theory (BFT), two formalisms for representing uncertain inference from statistical data. I reconsider a medical diagnosis example introduced by Shafer [Shafer82] concerning the imaginary disease ploxoma. In Section ...
In The Blackwell Guide to the Philosophy of Science, eds. Peter Machamer and Michael
Abstract: We will discuss induction and probability in that order, aiming to bring out the deep interconnections between the two topics; we will close with a brief overview of cutting-edge research that combines them. 1. Induction: some preliminaries. Arguably, Hume's greatest single contribution to contemporary philosophy of science has been the problem of induction (1739). Before attempting its statement, we need to spend a few words identifying the subject matter of this corner of epistemology. At a first pass, induction concerns ampliative inferences drawn on the basis of evidence (presumably, evidence acquired more or less directly from experience)—that is, inferences whose conclusions are not (validly) entailed by the premises. Philosophers have historically drawn further distinctions, often appropriating the term “induction” to mark them; since we will not be concerned with the philosophical issues for which these distinctions are relevant, we will use the word “inductive” in a catch-all sense synonymous with “ampliative”. But we will follow the usual practice of choosing, as our paradigm example of inductive inferences, inferences about the future based on evidence drawn from the past and present.
CEMAGREF Information Fusion Systems
Abstract: In this paper, we present an extension of multi-criteria decision making based on the Analytic Hierarchy Process (AHP) which incorporates uncertain knowledge matrices for generating basic belief assignments (bba’s). The combination of priority vectors corresponding to bba’s related to each (sub-)criterion is performed using the Proportional Conflict Redistribution rule no. 5 proposed in the Dezert-Smarandache Theory (DSmT) of plausible and paradoxical reasoning. The method presented here, called DSmT-AHP, is illustrated on very simple examples.
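For two sources, the Proportional Conflict Redistribution rule no. 5 (PCR5) can be sketched as follows: combine conjunctively as in Dempster's rule, but instead of discarding each partial conflict m1(X)·m2(Y) (with X∩Y = ∅), give it back to X and Y in proportion to the masses that produced it. A minimal sketch of the two-source form as we understand it; the mass functions in the example are hypothetical:

```python
from itertools import product

def pcr5_combine(m1, m2):
    """Two-source PCR5: conjunctive combination, with each partial conflict
    m1(X)*m2(Y) (X ∩ Y = ∅) redistributed back to X and Y proportionally
    to m1(X) and m2(Y). No renormalization step is needed."""
    out = {}
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        c = a & b
        if c:
            out[c] = out.get(c, 0.0) + wa * wb
        else:
            # redistribute the conflicting mass wa*wb proportionally
            out[a] = out.get(a, 0.0) + wa * wa * wb / (wa + wb)
            out[b] = out.get(b, 0.0) + wb * wb * wa / (wa + wb)
    return out

# Hypothetical example on the frame {a, b}.
A, B, AB = frozenset("a"), frozenset("b"), frozenset("ab")
m1 = {A: 0.6, AB: 0.4}
m2 = {B: 0.3, AB: 0.7}
m12 = pcr5_combine(m1, m2)
```

Since each conflicting product wa·wb is returned in full (wa²wb/(wa+wb) + wb²wa/(wa+wb) = wa·wb), the output masses sum to 1 without the global renormalization that makes Dempster's rule unstable under high conflict.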
Two views of belief: belief as generalized probability and belief as evidence. Elsevier, 1990