Results 1–10 of 101
Bayesian Data Analysis, 1995
Cited by 1230 (47 self)
Abstract:
I actually own a copy of Harold Jeffreys’s Theory of Probability but have only read small bits of it, most recently over a decade ago to confirm that, indeed, Jeffreys was not too proud to use a classical chi-squared p-value when he wanted to check the misfit of a model to data (Gelman, Meng and Stern, 2006). I do, however, feel that it is important to understand where our probability models come from, and I welcome the opportunity to use the present article by Robert, Chopin and Rousseau as a platform for further discussion of foundational issues. In this brief discussion I will argue the following: (1) in thinking about prior distributions, we should go beyond Jeffreys’s principles and move toward weakly informative priors; (2) it is natural for those of us who work in the social and computational sciences to favor complex models, contra Jeffreys’s preference for simplicity; and (3) a key generalization of Jeffreys’s ideas is to explicitly include model checking in the process of data analysis.
Current Approaches to Handling Imperfect Information in Data and Knowledge Bases, 1996
Cited by 52 (1 self)
Abstract:
This paper surveys methods for representing and reasoning with imperfect information. It opens with an attempt to classify the different types of imperfection that may pervade data, and a discussion of the sources of such imperfections. The classification is then used as a framework for considering work that explicitly concerns the representation of imperfect information, and related work on how imperfect information may be used as a basis for reasoning. The work that is surveyed is drawn from both the field of databases and the field of artificial intelligence. Both of these areas have long been concerned with the problems caused by imperfect information, and this paper stresses the relationships between the approaches developed in each.
Probabilistic Deduction with Conditional Constraints over Basic Events
J. Artif. Intell. Res., 1999
Cited by 44 (30 self)
Abstract:
We study the problem of probabilistic deduction with conditional constraints over basic events. We show that globally complete probabilistic deduction with conditional constraints over basic events is NP-hard. We then concentrate on the special case of probabilistic deduction in conditional constraint trees. We elaborate very efficient techniques for globally complete probabilistic deduction. In detail, for conditional constraint trees with point probabilities, we present a local approach to globally complete probabilistic deduction, which runs in linear time in the size of the conditional constraint trees. For conditional constraint trees with interval probabilities, we show that globally complete probabilistic deduction can be done in a global approach by solving nonlinear programs. We show how these nonlinear programs can be transformed into equivalent linear programs, which are solvable in polynomial time in the size of the conditional constraint trees.
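The interval case described in this abstract can be illustrated with a toy instance of probabilistic deduction. The sketch below uses hypothetical constraint intervals (the numbers are illustrative, not from the paper) and brute-forces tight bounds on P(B) over the corners of the constraint box rather than solving the paper's linear programs; since P(B) is linear in each parameter with the others fixed, the corner values suffice here.

```python
from itertools import product

# Hypothetical conditional constraints (illustrative numbers, not from the paper):
#   P(A)       in [0.6, 0.8]
#   P(B | A)   in [0.7, 0.9]
#   P(B | ~A)  unconstrained, i.e. in [0.0, 1.0]
A_RANGE = (0.6, 0.8)
B_GIVEN_A = (0.7, 0.9)
B_GIVEN_NOT_A = (0.0, 1.0)

def entailed_bounds():
    """Tight bounds on P(B) = P(B|A)P(A) + P(B|~A)(1 - P(A)).

    P(B) is linear in each parameter when the others are held fixed,
    so its extrema over the box of constraints sit at the corners."""
    corners = [q * a + r * (1.0 - a)
               for a, q, r in product(A_RANGE, B_GIVEN_A, B_GIVEN_NOT_A)]
    return min(corners), max(corners)

lo, hi = entailed_bounds()
print(f"P(B) is entailed to lie in [{lo:.2f}, {hi:.2f}]")  # [0.42, 0.94]
```

Note that an interval constraint P(B|A) ∈ [l, u] can be rewritten as the pair of inequalities l·P(A) ≤ P(A∧B) ≤ u·P(A), which are linear in the probabilities of the four atomic events; this is the kind of rewriting that makes the linear-programming reformulation mentioned in the abstract possible.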
Probabilistic Default Reasoning with Conditional Constraints
Ann. Math. Artif. Intell., 2000
Cited by 35 (20 self)
Abstract:
We present an approach to reasoning from statistical and subjective knowledge, which is based on a combination of probabilistic reasoning from conditional constraints with approaches to default reasoning from conditional knowledge bases. More precisely, we introduce the notions of z-, lexicographic, and conditional entailment for conditional constraints, which are probabilistic generalizations of Pearl's entailment in system Z, Lehmann's lexicographic entailment, and Geffner's conditional entailment, respectively. We show that the new formalisms have nice properties. In particular, they show a similar behavior as reference-class reasoning in a number of uncontroversial examples. The new formalisms, however, also avoid many drawbacks of reference-class reasoning. More precisely, they can handle complex scenarios and even purely probabilistic subjective knowledge as input. Moreover, conclusions are drawn in a global way from all the available knowledge as a whole. We then show that the new formalisms also have nice general nonmonotonic properties. In detail, the new notions of z-, lexicographic, and conditional entailment have similar properties as their classical counterparts. In particular, they all satisfy the rationality postulates proposed by Kraus, Lehmann, and Magidor, and they have some general irrelevance and direct inference properties. Moreover, the new notions of z- and lexicographic entailment satisfy the property of rational monotonicity. Furthermore, the new notions of z-, lexicographic, and conditional entailment are proper generalizations of both their classical counterparts and the classical notion of logical entailment for conditional constraints. Finally, we provide algorithms for reasoning under the new formalisms, and we analyze their computational com...
Some Properties of Joint Probability Distributions
In Proceedings of the 10th Conference on Uncertainty in Artificial Intelligence (UAI–94), 1994
Cited by 29 (7 self)
Abstract:
Several Artificial Intelligence schemes for reasoning under uncertainty explore, either explicitly or implicitly, asymmetries among probabilities of various states of their uncertain domain models. Even though the correct working of these schemes is practically contingent upon the existence of a small number of probable states, no formal justification has been proposed of why this should be the case. This paper attempts to fill this apparent gap by studying asymmetries among probabilities of various states of uncertain models. By rewriting the joint probability distribution over a model's variables into a product of individual variables' prior and conditional probability distributions and applying the central limit theorem to this product, we can demonstrate that the probabilities of individual states of the model can be expected to be drawn from highly skewed lognormal distributions. With sufficient asymmetry in individual prior and conditional probability distributions, a small fraction of states can be expected to cover a large portion of the total probability space, with the remaining states having practically negligible probability. The theoretical discussion is supplemented by simulation results and an illustrative real-world example.
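The lognormal argument in this abstract is easy to reproduce in a few lines of simulation. The sketch below builds a hypothetical chain of binary variables with randomly drawn conditional probability tables (the construction and all parameters are illustrative, not the paper's model), takes each joint state's probability as the product of its prior/conditional terms, and counts how few of the most probable states already cover 90% of the mass.

```python
import random

random.seed(0)
N_VARS = 12  # a chain of 12 binary variables -> 4096 joint states

# Hypothetical chain X1 -> X2 -> ... -> X12 with random, mildly asymmetric CPTs.
p1 = random.uniform(0.05, 0.95)  # P(X1 = 1)
cpts = [(random.uniform(0.05, 0.95), random.uniform(0.05, 0.95))
        for _ in range(N_VARS - 1)]  # (P(X_{i+1}=1 | X_i=0), P(X_{i+1}=1 | X_i=1))

def joint(state):
    """Probability of one joint state as a product of prior/conditional terms."""
    p = p1 if state[0] else 1.0 - p1
    for i, (p_if0, p_if1) in enumerate(cpts):
        cond = p_if1 if state[i] else p_if0
        p *= cond if state[i + 1] else 1.0 - cond
    return p

states = [[(s >> i) & 1 for i in range(N_VARS)] for s in range(2 ** N_VARS)]
probs = sorted((joint(s) for s in states), reverse=True)

# Count how many of the most probable states cover 90% of the total mass.
mass, k = 0.0, 0
while mass < 0.9:
    mass += probs[k]
    k += 1
print(f"{k} of {len(probs)} states ({100 * k / len(probs):.1f}%) cover 90% of the mass")
```

Because the log of each state's probability is a sum of many log-factors, the central limit theorem makes the state probabilities approximately lognormal, which is why a small fraction of the states dominates the total mass.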
Set-Based Bayesianism, 1992
Cited by 26 (1 self)
Abstract:
Problems for strict and convex Bayesianism are discussed. A set-based Bayesianism generalizing convex Bayesianism and intervalism is proposed. This approach abandons not only the strict Bayesian requirement of a unique real-valued probability function in any decision-making context but also the requirement of convexity for a set-based representation of uncertainty. Levi's E-admissibility decision criterion is retained and is shown to be applicable in the non-convex case. Keywords: Uncertainty, decision-making, maximum entropy, Bayesian methods. 1. Introduction. The reigning philosophy of uncertainty representation is strict Bayesianism. One of its central principles is that an agent must adopt a single, real-valued probability function over the events recognized as relevant to a given problem. Prescriptions for defining such a function for a given agent in a given situation range from the extreme personalism of de Finetti (1964, 1974) and Savage (1972) to the objective Bayesianism of...
Decision Making in a Context where Uncertainty is Represented by Belief Functions, 2000
Cited by 26 (2 self)
Abstract:
A quantified model to represent uncertainty is incomplete if its use in a decision environment is not explained. When belief functions were first introduced to represent quantified uncertainty, no associated decision model was proposed. Since then, it has become clear that belief functions admit multiple meanings. The models based on belief functions can be understood as an upper and lower probabilities model, as the hints model, as the transferable belief model, and as a probability model extended to modal propositions. These models are mathematically identical at the static level, but their behaviors diverge at the dynamic level (under conditioning and/or revision). For decision making, some authors argue that decisions must be based on expected utilities, in which case a probability function must be determined. When uncertainty is represented by belief functions, the choice of the appropriate probability function must be explained and justified. This probability function doe...
Possibility Theory III: Possibilistic Independence
International Journal of General Systems, 1997
Cited by 22 (15 self)
Abstract:
In this paper, I intend to construct a more general theory of possibilistic independence, and at the same time provide a solution for the above-mentioned difficulties. My guiding principle in doing so will be the (formal) analogy with probability theory. My tools will be the measure- and integral-theoretic treatment of possibility theory, developed in Parts I and II of this series. At the same time, it will be shown that the definition of possibilistic independence given here has an interesting interpretation when classical possibility is considered. In two interesting and important papers [Benferhat et al., 1994] [Dubois et al., 1994], Benferhat, Dubois, Fariñas del Cerro, Herzig and Prade discuss the independence of events (or propositions) in possibility theory in a logical setting. In all, they discuss three types of independence. The first is based upon Zadeh's notion of noninteractivity for variables (see also Part II, Section 1). In [Dubois et al., 1994] two events A and B are called unrelated in Zadeh's sense iff Π(A ∩ B) = min(Π(A), Π(B)) (5). This definition is essentially the same as the one given by Nahmias; see Eq. (1). In [Benferhat
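Zadeh's unrelatedness condition quoted in this abstract is easy to check numerically. The sketch below uses a small hypothetical possibility distribution (the worlds and values are illustrative, not from the paper), computes the possibility of an event as the maximum of the distribution over its elements, and tests the min-rule for one unrelated pair of events and one pair for which it fails.

```python
# Hypothetical possibility distribution over four worlds (illustrative values).
pi = {"w1": 1.0, "w2": 0.7, "w3": 0.4, "w4": 0.2}

def poss(event):
    """Possibility of an event: the max of pi over its elements (0 for the empty set)."""
    return max((pi[w] for w in event), default=0.0)

A, B = {"w2", "w3"}, {"w3", "w4"}
# The min-rule holds: Poss(A & B) = Poss({w3}) = 0.4 = min(0.7, 0.4)
print(poss(A & B) == min(poss(A), poss(B)))  # True -> A and B are unrelated in Zadeh's sense

C, D = {"w2"}, {"w4"}
# Here Poss(C & D) = Poss(empty set) = 0.0 < min(0.7, 0.2) = 0.2
print(poss(C & D) == min(poss(C), poss(D)))  # False -> C and D are not unrelated
```

The second pair shows why unrelatedness is a genuine condition rather than an identity: for disjoint events with positive possibility, the left-hand side of the min-rule is zero while the right-hand side is not.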
Subjective Bayesian Analysis: Principles and Practice
Bayesian Analysis, 2006
Cited by 20 (0 self)
Abstract:
We address the position of subjectivism within Bayesian statistics. We argue, first, that the subjectivist Bayes approach is the only feasible method for tackling many important practical problems. Second, we describe the essential role of the subjectivist approach in scientific analysis. Third, we consider possible modifications to the Bayesian approach from a subjectivist viewpoint. Finally, we address the issue of pragmatism in implementing the subjectivist approach.