Results 1–10 of 606
Measuring Expectations
2004
"... This article discusses the history underlying the new literature, describes some of what has been learned thus far, and looks ahead towards making further progress ..."
Abstract

Cited by 128 (8 self)
This article discusses the history underlying the new literature, describes some of what has been learned thus far, and looks ahead towards making further progress.
Game Theory, Maximum Entropy, Minimum Discrepancy And Robust Bayesian Decision Theory
Annals of Statistics, 2004
"... ..."
Robust Learning with Missing Data
1996
"... Bayesian methods are becoming increasingly popular in the development of intelligent machines. Bayesian Belief Networks (bbns) are nowaday a prominent reasoning method and, during the past few years, several efforts have been addressed to develop methods able to learn bbns directly from databases. H ..."
Abstract

Cited by 48 (5 self)
Bayesian methods are becoming increasingly popular in the development of intelligent machines. Bayesian Belief Networks (BBNs) are nowadays a prominent reasoning method and, during the past few years, several efforts have been devoted to developing methods able to learn BBNs directly from databases. However, all these methods assume that the database is complete or, at least, that unreported data are missing at random. Unfortunately, real-world databases are rarely complete and the "Missing at Random" assumption is often unrealistic. This paper shows that this assumption can dramatically affect the reliability of the learned BBN and introduces a robust method to learn conditional probabilities in a BBN which does not rely on this assumption. In order to drop this assumption, we have to change the overall learning strategy used by traditional Bayesian methods: our method bounds the set of all posterior probabilities consistent with the database and proceeds by refining this set as more i...
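As a rough sketch of that bounding idea (our own illustration, not the paper's algorithm; the function name and the counts are hypothetical), the two extreme completions of the missing records bound a conditional probability:

    def prob_interval(n_xy, n_y, n_missing):
        """Interval for P(X=x | Y=y) consistent with every completion of
        the missing records: the lower bound assigns all missing cases to
        Y=y but not X=x; the upper bound assigns them all to X=x and Y=y.
        Completions that avoid Y=y leave the ratio n_xy/n_y, which always
        lies inside this interval."""
        lower = n_xy / (n_y + n_missing)
        upper = (n_xy + n_missing) / (n_y + n_missing)
        return lower, upper

    # 30 of 40 complete cases with Y=y also have X=x; 10 records are missing.
    print(prob_interval(30, 40, 10))  # (0.6, 0.8)

As more complete records arrive, n_y grows relative to n_missing and the interval tightens, which matches the refinement strategy the abstract describes.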
Towards a unified theory of imprecise probability
Int. J. Approx. Reasoning, 2000
"... Belief functions, possibility measures and Choquet capacities of order 2, which are special kinds of coherent upper or lower probability, are amongst the most popular mathematical models for uncertainty and partial ignorance. I give examples to show that these models are not sufficiently general to ..."
Abstract

Cited by 40 (0 self)
Belief functions, possibility measures and Choquet capacities of order 2, which are special kinds of coherent upper or lower probability, are amongst the most popular mathematical models for uncertainty and partial ignorance. I give examples to show that these models are not sufficiently general to represent some common types of uncertainty. Coherent lower previsions and sets of probability measures are considerably more general but they may not be sufficiently informative for some purposes. I discuss two other models for uncertainty, involving sets of desirable gambles and partial preference orderings. These are more informative and more general than the previous models, and they may provide a suitable mathematical setting for a unified theory of imprecise probability.
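For concreteness, a set of probability measures induces lower and upper probabilities as the lower and upper envelopes over the set; a minimal sketch, with a made-up two-element credal set:

    # Lower/upper probability of an event as the envelope of a credal set.
    credal_set = [
        {"a": 0.5, "b": 0.3, "c": 0.2},
        {"a": 0.2, "b": 0.5, "c": 0.3},
    ]
    event = {"a", "c"}
    values = [sum(p[w] for w in event) for p in credal_set]
    lower, upper = min(values), max(values)  # 0.5 and 0.7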
Belief Functions and Default Reasoning
2000
"... We present a new approach to deal with default information based on the theory of belief functions. Our semantic structures, inspired by Adams' epsilon semantics, are epsilonbelief assignments, where mass values are either close to 0 or close to 1. In the first part of this paper, we show that t ..."
Abstract

Cited by 38 (3 self)
We present a new approach to deal with default information based on the theory of belief functions. Our semantic structures, inspired by Adams' epsilon semantics, are epsilon-belief assignments, where mass values are either close to 0 or close to 1. In the first part of this paper, we show that these structures can be used to give a uniform semantics to several popular nonmonotonic systems, including Kraus, Lehmann and Magidor's system P, Pearl's system Z, Brewka's preferred subtheories, Geffner's conditional entailment, Pinkas' penalty logic, possibilistic logic and the lexicographic approach. In the second part, we use epsilon-belief assignments to build a new system, called LCD, and show that this system correctly addresses the well-known problems of specificity, irrelevance, blocking of inheritance, ambiguity, and redundancy.
Supremum Preserving Upper Probabilities
1998
"... We study the relation between possibility measures and the theory of imprecise probabilities, and argue that possibility measures have an important part in this theory. It is shown that a possibility measure is a coherent upper probability if and only if it is normal. A detailed comparison is giv ..."
Abstract

Cited by 38 (12 self)
We study the relation between possibility measures and the theory of imprecise probabilities, and argue that possibility measures have an important part in this theory. It is shown that a possibility measure is a coherent upper probability if and only if it is normal. A detailed comparison is given between the possibilistic and natural extension of an upper probability, both in the general case and for upper probabilities defined on a class of nested sets. We prove in particular that a possibility measure is the restriction to events of the natural extension of a special kind of upper probability, defined on a class of nested sets. We show that possibilistic extension can be interpreted in terms of natural extension. We also prove that when either the upper or the lower cumulative distribution function of a random quantity is specified, possibility measures very naturally emerge as the corresponding natural extensions. Next, we go from upper probabilities to upper previsions...
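To make the supremum-preserving property concrete: on a finite space, a possibility measure takes the maximum of a possibility distribution over an event, and normality means the distribution reaches 1 somewhere. A minimal sketch (the distribution values are invented):

    def possibility(pi, event):
        """Supremum-preserving set function: Pi(A) = max of pi over A
        (the supremum, on a finite space)."""
        return max(pi[x] for x in event)

    pi = {"x1": 0.4, "x2": 1.0, "x3": 0.7}   # normal: the maximum value is 1
    print(possibility(pi, {"x1", "x3"}))      # 0.7
    print(possibility(pi, set(pi)))           # 1.0, as normality requires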
The case for objective Bayesian analysis
Bayesian Analysis, 2006
"... Abstract. Bayesian statistical practice makes extensive use of versions of objective Bayesian analysis. We discuss why this is so, and address some of the criticisms that have been raised concerning objective Bayesian analysis. The dangers of treating the issue too casually are also considered. In p ..."
Abstract

Cited by 35 (3 self)
Bayesian statistical practice makes extensive use of versions of objective Bayesian analysis. We discuss why this is so, and address some of the criticisms that have been raised concerning objective Bayesian analysis. The dangers of treating the issue too casually are also considered. In particular, we suggest that the statistical community should accept formal objective Bayesian techniques with confidence, but should be more cautious about casual objective Bayesian techniques.
Data Fusion in the Transferable Belief Model.
2000
"... When Shafer introduced his theory of evidence based on the use of belief functions, he proposed a rule to combine belief functions induced by distinct pieces of evidence. Since then, theoretical justifications of this socalled Dempster's rule of combination have been produced and the meaning of dist ..."
Abstract

Cited by 33 (0 self)
When Shafer introduced his theory of evidence based on the use of belief functions, he proposed a rule to combine belief functions induced by distinct pieces of evidence. Since then, theoretical justifications of this so-called Dempster's rule of combination have been produced and the meaning of distinctness has been assessed. We will present practical applications where the fusion of uncertain data is well achieved by Dempster's rule of combination. It is essential that the meaning of the belief functions used to represent uncertainty be well fixed, as the adequacy of the rule depends strongly on a correct understanding of the context in which they are applied. Failing to distinguish between the theory of upper and lower probabilities and the transferable belief model can lead to serious confusion, as Dempster's rule of combination is central in the transferable belief model whereas it hardly fits with the theory of upper and lower probabilities. Keywords: belief function, transferable beli...
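Dempster's rule itself is standard; a compact Python sketch over a finite frame, with focal sets encoded as frozensets (the helper name is ours):

    from itertools import product

    def dempster_combine(m1, m2):
        """Dempster's rule of combination for two mass functions m1, m2
        (dicts mapping frozenset focal elements to masses): multiply the
        masses of intersecting focal sets and renormalize by 1 - K, where
        K is the total mass of conflicting (disjoint) pairs. Undefined
        when K = 1 (total conflict)."""
        combined, conflict = {}, 0.0
        for (b, mb), (c, mc) in product(m1.items(), m2.items()):
            a = b & c
            if a:
                combined[a] = combined.get(a, 0.0) + mb * mc
            else:
                conflict += mb * mc
        return {a: v / (1.0 - conflict) for a, v in combined.items()}

    m1 = {frozenset("ab"): 0.8, frozenset("abc"): 0.2}
    m2 = {frozenset("bc"): 0.6, frozenset("abc"): 0.4}
    print(dempster_combine(m1, m2))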
Updating Beliefs with Incomplete Observations
"... Currently, there is renewed interest in the problem, raised by Shafer in 1985, of updating probabilities when observations are incomplete (or setvalued). This is a fundamental problem in general, and of particular interest for Bayesian networks. Recently, Gr unwald and Halpern have shown that co ..."
Abstract

Cited by 32 (10 self)
Currently, there is renewed interest in the problem, raised by Shafer in 1985, of updating probabilities when observations are incomplete (or set-valued). This is a fundamental problem in general, and of particular interest for Bayesian networks. Recently, Grünwald and Halpern have shown that commonly used updating strategies fail in this case, except under very special assumptions. In this paper we propose a new method for updating probabilities with incomplete observations. Our approach is deliberately conservative: we make no assumptions about the so-called incompleteness mechanism that associates complete with incomplete observations. We model our ignorance about this mechanism by a vacuous lower prevision, a tool from the theory of imprecise probabilities, and we use only coherence arguments to turn prior into posterior (updated) probabilities. In general, this new approach to updating produces lower and upper posterior probabilities and previsions (expectations), as well as partially determinate decisions. This is a logical consequence of the existing ignorance about the incompleteness mechanism. As an example, we use the new updating method to properly address the apparent paradox in the 'Monty Hall' puzzle. More importantly, we apply it to the problem of classification of new evidence in probabilistic expert systems, where it leads to a new, so-called conservative updating rule.
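The paper's conservative updating rule is more general than this, but a back-of-the-envelope sweep over the unknown host protocol in the Monty Hall puzzle (our own illustration) shows the kind of lower and upper posteriors such ignorance produces:

    # Monty Hall with the host's protocol left unspecified: if the car is
    # behind the contestant's door 1, the host opens door 3 with unknown
    # probability q. Sweeping q over [0, 1] traces the set of posteriors
    # for "car behind door 2" given that door 3 was opened.
    posteriors = []
    for q in (i / 100 for i in range(101)):
        p_open3_and_car1 = (1 / 3) * q   # host was free to choose door 3
        p_open3_and_car2 = 1 / 3         # host was forced to open door 3
        posteriors.append(p_open3_and_car2 / (p_open3_and_car1 + p_open3_and_car2))
    print(min(posteriors), max(posteriors))  # 0.5 and 1.0: switching never hurts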
Learning under Ambiguity
Review of Economic Studies, 2002
"... This paper considers learning when the distinction between risk and ambiguity matters. It first describes thought experiments, dynamic variants of those provided by Ellsberg, that highlight a sense in which the Bayesian learning model is extremeit models agents who are implausibly ambitious about w ..."
Abstract

Cited by 31 (3 self)
This paper considers learning when the distinction between risk and ambiguity matters. It first describes thought experiments, dynamic variants of those provided by Ellsberg, that highlight a sense in which the Bayesian learning model is extreme: it models agents who are implausibly ambitious about what they can learn in complicated environments. The paper then provides a generalization of the Bayesian model that accommodates the intuitive choices in the thought experiments. In particular, the model allows decision-makers' confidence about the environment to change, along with beliefs, as they learn. A portfolio choice application compares the effect of changes in confidence under ambiguity versus changes in estimation risk under Bayesian learning. The former is shown to induce a trend towards more stock market participation and investment even when the latter does not.