Results 1–10 of 789
Measuring Expectations, 2004
Abstract

Cited by 134 (8 self)
This article discusses the history underlying the new literature, describes some of what has been learned thus far, and looks ahead towards making further progress.
Game Theory, Maximum Entropy, Minimum Discrepancy and Robust Bayesian Decision Theory
Annals of Statistics, 2004
Towards a unified theory of imprecise probability
Int. J. Approx. Reasoning, 2000
Abstract

Cited by 41 (0 self)
Belief functions, possibility measures and Choquet capacities of order 2, which are special kinds of coherent upper or lower probability, are amongst the most popular mathematical models for uncertainty and partial ignorance. I give examples to show that these models are not sufficiently general to represent some common types of uncertainty. Coherent lower previsions and sets of probability measures are considerably more general but they may not be sufficiently informative for some purposes. I discuss two other models for uncertainty, involving sets of desirable gambles and partial preference orderings. These are more informative and more general than the previous models, and they may provide a suitable mathematical setting for a unified theory of imprecise probability.
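The sets-of-probability-measures model mentioned in this abstract can be illustrated with a minimal sketch: given a finite credal set, the lower and upper probabilities of an event are the minimum and maximum of the event's probability over the set. The sample space, the two measures, and the event below are illustrative assumptions, not taken from the paper.

```python
# Lower/upper probabilities induced by a finite set of probability
# measures (a simple credal set). All numbers are illustrative.

def lower_upper(event, measures):
    """Return (lower, upper) probability of `event` over a set of measures.

    Each measure maps every outcome to its probability; the lower
    (upper) probability is the min (max) of P(event) over the set.
    """
    probs = [sum(p[w] for w in event) for p in measures]
    return min(probs), max(probs)

# Two measures on the space {a, b, c}:
m1 = {"a": 0.5, "b": 0.3, "c": 0.2}
m2 = {"a": 0.2, "b": 0.3, "c": 0.5}

lo, up = lower_upper({"a", "b"}, [m1, m2])
print(round(lo, 3), round(up, 3))  # 0.5 0.8
```

A vacuous model (the credal set of all measures) would give lower probability 0 and upper probability 1 for every non-trivial event, which is the extreme case of partial ignorance the abstract contrasts with the more informative models.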
Belief Functions and Default Reasoning, 2000
Abstract

Cited by 38 (3 self)
We present a new approach to deal with default information based on the theory of belief functions. Our semantic structures, inspired by Adams' epsilon semantics, are epsilon-belief assignments, where mass values are either close to 0 or close to 1. In the first part of this paper, we show that these structures can be used to give a uniform semantics to several popular nonmonotonic systems, including Kraus, Lehmann and Magidor's system P, Pearl's system Z, Brewka's preferred subtheories, Geffner's conditional entailment, Pinkas' penalty logic, possibilistic logic and the lexicographic approach. In the second part, we use epsilon-belief assignments to build a new system, called LCD, and show that this system correctly addresses the well-known problems of specificity, irrelevance, blocking of inheritance, ambiguity, and redundancy.
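The building block behind the epsilon-belief assignments described above is the ordinary basic belief assignment, whose belief function sums the mass committed to subsets of an event. The frame, the focal sets, and the mass values below are illustrative assumptions (a mass close to 1 standing in for a default rule), not the paper's own structures.

```python
# A basic belief assignment and its belief function Bel(A).
# Focal sets are frozensets; masses here are illustrative.

def belief(event, masses):
    """Bel(A): total mass committed to subsets of A."""
    return sum(m for focal, m in masses.items() if set(focal) <= event)

# Mass function on the frame {flies, grounded}:
masses = {
    frozenset({"flies"}): 0.9,               # mass close to 1 ("default" rule)
    frozenset({"flies", "grounded"}): 0.1,   # residual ignorance
}

print(belief({"flies"}, masses))              # 0.9
print(belief({"flies", "grounded"}, masses))  # 1.0
```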
Supremum Preserving Upper Probabilities, 1998
Abstract

Cited by 37 (12 self)
We study the relation between possibility measures and the theory of imprecise probabilities, and argue that possibility measures have an important part in this theory. It is shown that a possibility measure is a coherent upper probability if and only if it is normal. A detailed comparison is given between the possibilistic and natural extension of an upper probability, both in the general case and for upper probabilities defined on a class of nested sets. We prove in particular that a possibility measure is the restriction to events of the natural extension of a special kind of upper probability, defined on a class of nested sets. We show that possibilistic extension can be interpreted in terms of natural extension. We also prove that when either the upper or the lower cumulative distribution function of a random quantity is specified, possibility measures very naturally emerge as the corresponding natural extensions. Next, we go from upper probabilities to upper previsions...
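The supremum-preserving measure this abstract studies can be sketched directly: a possibility measure assigns to each event the maximum of a possibility distribution over that event, and it is normal when the distribution reaches 1. The distribution values below are illustrative assumptions.

```python
# A possibility measure Pi(A) = max_{w in A} pi(w) built from a
# possibility distribution pi; the values of pi are illustrative.

def possibility(event, pi):
    """Supremum-preserving upper probability of `event`."""
    return max(pi[w] for w in event) if event else 0.0

pi = {"a": 1.0, "b": 0.6, "c": 0.2}

# Normality (max pi = 1): per the abstract, this is exactly the
# condition for Pi to be a coherent upper probability.
assert max(pi.values()) == 1.0

print(possibility({"b", "c"}, pi))  # 0.6
```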
Data Fusion in the Transferable Belief Model, 2000
Abstract

Cited by 33 (0 self)
When Shafer introduced his theory of evidence based on the use of belief functions, he proposed a rule to combine belief functions induced by distinct pieces of evidence. Since then, theoretical justifications of this so-called Dempster's rule of combination have been produced and the meaning of distinctness has been assessed. We will present practical applications where the fusion of uncertain data is well achieved by Dempster's rule of combination. It is essential that the meaning of the belief functions used to represent uncertainty be well fixed, as the adequacy of the rule depends strongly on a correct understanding of the context in which they are applied. Failing to distinguish between the upper and lower probabilities theory and the transferable belief model can lead to serious confusion, as Dempster's rule of combination is central in the transferable belief model whereas it hardly fits with the upper and lower probabilities theory. Keywords: belief function, transferable belief model...
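Dempster's rule of combination, central to the abstract above, combines two mass functions by intersecting their focal sets, multiplying masses, and renormalizing by the mass assigned to the empty set (the conflict). The two mass functions below are illustrative assumptions, not data from the paper.

```python
# Dempster's rule of combination for two mass functions whose focal
# elements are frozensets; the input masses are illustrative.

def dempster_combine(m1, m2):
    """Conjunctive combination with normalization by 1 - conflict."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    k = 1.0 - conflict
    return {s: v / k for s, v in combined.items()}

m1 = {frozenset({"x"}): 0.8, frozenset({"x", "y"}): 0.2}
m2 = {frozenset({"y"}): 0.5, frozenset({"x", "y"}): 0.5}

# Conflict is 0.8 * 0.5 = 0.4, so the surviving masses are rescaled by 1/0.6.
m12 = dempster_combine(m1, m2)
```

Note that in the unnormalized (conjunctive) variant used in parts of the transferable belief model, the conflict would instead remain as mass on the empty set rather than being divided out.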
Updating Beliefs with Incomplete Observations
Abstract

Cited by 33 (12 self)
Currently, there is renewed interest in the problem, raised by Shafer in 1985, of updating probabilities when observations are incomplete (or set-valued). This is a fundamental problem in general, and of particular interest for Bayesian networks. Recently, Grünwald and Halpern have shown that commonly used updating strategies fail in this case, except under very special assumptions. In this paper we propose a new method for updating probabilities with incomplete observations. Our approach is deliberately conservative: we make no assumptions about the so-called incompleteness mechanism that associates complete with incomplete observations. We model our ignorance about this mechanism by a vacuous lower prevision, a tool from the theory of imprecise probabilities, and we use only coherence arguments to turn prior into posterior (updated) probabilities. In general, this new approach to updating produces lower and upper posterior probabilities and previsions (expectations), as well as partially determinate decisions. This is a logical consequence of the existing ignorance about the incompleteness mechanism. As an example, we use the new updating method to properly address the apparent paradox in the 'Monty Hall' puzzle. More importantly, we apply it to the problem of classification of new evidence in probabilistic expert systems, where it leads to a new, so-called conservative updating rule.
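The Monty Hall treatment mentioned in this abstract can be sketched numerically: leave the host's door-opening protocol (the incompleteness mechanism) completely unknown and compute the posterior probability that switching wins under every protocol, keeping the minimum and maximum. The parameterization by a single bias q and the grid sweep below are illustrative assumptions standing in for the paper's vacuous-lower-prevision argument.

```python
# Conservative Monty Hall: the contestant holds door 1, the host
# opens door 3. q is the unknown chance the host opens door 3 when
# the car is behind door 1 (he must open door 3 if it is behind 2).

def posterior_switch_wins(q):
    """P(car behind door 2 | host opened door 3), for host bias q."""
    # P(opens 3 and car at 2) = 1/3;  P(opens 3 and car at 1) = q/3.
    return (1 / 3) / (1 / 3 + q / 3)

# Vacuous model of the mechanism: q ranges over all of [0, 1].
qs = [i / 100 for i in range(101)]
post = [posterior_switch_wins(q) for q in qs]

print(min(post), max(post))  # 0.5 1.0
```

The interval [0.5, 1] for the switching posterior, rather than the usual point value 2/3 (which corresponds to the special assumption q = 1/2), reflects the deliberate ignorance about the mechanism that the abstract describes.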