Results 11–20 of 66
Conservative inference rule for uncertain reasoning under incompleteness
 Journal of Artificial Intelligence Research
Abstract

Cited by 19 (8 self)
In this paper we formulate the problem of inference under incomplete information in very general terms. This includes modelling the process responsible for the incompleteness, which we call the incompleteness process. We allow the process' behaviour to be partly unknown. Then we use Walley's theory of coherent lower previsions, a generalisation of the Bayesian theory to imprecision, to derive the rule for updating beliefs under incompleteness that logically follows from our assumptions, which we call the conservative inference rule. This rule has some remarkable properties: it is an abstract rule for updating beliefs that can be applied in any situation or domain; it gives us the opportunity to be neither too optimistic nor too pessimistic about the incompleteness process, which is a necessary condition for drawing conclusions that are reliable and yet strong enough; and it is a coherent rule, in the sense that it cannot lead to inconsistencies. We give examples to show how the new rule can be applied in expert systems, in parametric statistical inference, and in pattern classification, and discuss more generally the view of incompleteness processes defended here as well as some of its consequences.
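The idea of being neither too optimistic nor too pessimistic about the incompleteness process can be illustrated with a toy sketch. This is a hypothetical example with made-up numbers, not the paper's formal rule: when an observation is reported as missing, instead of assuming it is missing at random, we condition on every value it could have had and report the resulting interval.

```python
# Toy sketch of conservative updating under an unknown incompleteness
# process (hypothetical numbers, not the paper's actual rule): the value
# of Y is reported missing, so we compute the posterior of X under every
# possible completion of Y and take the envelope.

# Joint distribution p(x, y) over X in {0, 1}, Y in {0, 1}.
joint = {(0, 0): 0.3, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.4}

def posterior_x_given_y(x, y):
    """P(X = x | Y = y) by elementary conditioning."""
    py = sum(p for (xi, yi), p in joint.items() if yi == y)
    return joint[(x, y)] / py

# Y is missing: condition on each completion y and report bounds.
posts = [posterior_x_given_y(1, y) for y in (0, 1)]
lower, upper = min(posts), max(posts)
print(f"P(X=1 | Y missing) lies in [{lower:.3f}, {upper:.3f}]")
```

A missing-at-random analysis would instead return the single marginal value; the interval makes the dependence on the unknown incompleteness process explicit.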
A Logic for Reasoning about Upper Probabilities
, 2002
Abstract

Cited by 17 (4 self)
We present a propositional logic to reason about the uncertainty of events, where the uncertainty is modeled by a set of probability measures assigning an interval of probability to each event. We give a sound and complete axiomatization for the logic, and show that the satisfiability problem is NP-complete, no harder than satisfiability for propositional logic.
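The semantic idea of a set of measures assigning an interval to each event can be sketched in a few lines. This is an illustrative example with invented numbers, not the paper's logic: the lower (upper) probability of an event is the minimum (maximum) it receives across the set.

```python
# Minimal sketch (illustrative only): lower and upper probabilities of an
# event induced by a finite set of probability measures over {a, b, c}.
measures = [
    {"a": 0.2, "b": 0.5, "c": 0.3},
    {"a": 0.4, "b": 0.4, "c": 0.2},
    {"a": 0.1, "b": 0.6, "c": 0.3},
]

def prob(measure, event):
    """Probability of an event (a set of outcomes) under one measure."""
    return sum(measure[w] for w in event)

def lower_upper(event):
    """Interval [lower, upper] assigned to the event by the whole set."""
    vals = [prob(m, event) for m in measures]
    return min(vals), max(vals)

print(lower_upper({"a", "b"}))  # interval for the event {a, b}
```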
Ignorability for categorical data
 The Annals of Statistics
Abstract

Cited by 17 (3 self)
We study the problem of ignorability in likelihood-based inference from incomplete categorical data. Two versions of the coarsened at random assumption (car) are distinguished, their compatibility with the parameter distinctness assumption is investigated, and several conditions for ignorability that do not require an extra parameter distinctness assumption are established. It is shown that car assumptions have quite different implications depending on whether the underlying complete-data model is saturated or parametric. In the latter case, car assumptions can become inconsistent with observed data. 1. Introduction. In a sequence of papers, Rubin [15], Heitjan and Rubin [11] and Heitjan [9, 10] have investigated the question of under what conditions a mechanism that causes observed data to be incomplete or, more generally, coarse, can be ignored in the statistical analysis of the data. The key condition that has been identified is that the data should be missing at ...
A New Understanding of Subjective Probability and Its Generalization to Lower and Upper Prevision
, 2002
Abstract

Cited by 16 (6 self)
This article introduces a new way of understanding subjective probability and its generalization to lower and upper prevision.
Conservative rules for predictive inference with incomplete data
 ISIPTA ’05, Proceedings of the Fourth International Symposium on Imprecise Probabilities and Their Applications, pages 406–415. SIPTA
, 2005
Abstract

Cited by 15 (8 self)
This paper addresses the following question: how should we update our beliefs after observing some incomplete data, in order to make credible predictions about new, and possibly incomplete, data? There may be several answers to this question, according to the model of the process that creates the incompleteness. This paper develops a rigorous modelling framework that makes clear the conditions justifying the different answers; on this basis, it derives a new conditioning rule for predictive inference to be used in a wide range of states of knowledge about the incompleteness process, including near-ignorance, which, surprisingly, does not seem to have received attention so far. This case is particularly important, as modelling incompleteness processes can be highly impractical, and because there are limitations to statistical inference with incomplete data: it is generally not possible to learn how incompleteness processes work from the available data; and it may not be possible, as the paper shows, to measure empirically the quality of the predictions. Yet these depend heavily on the assumptions made.
Revision Rules for Convex Sets of Probabilities
, 1995
Abstract

Cited by 15 (2 self)
The best understood and most highly developed theory of uncertainty is Bayesian probability. There is a large literature on its foundations and there are many different justifications of the theory; however, all of these assume that for any proposition a, the beliefs in a and ¬a are strongly tied together. Without compelling justification, this assumption greatly restricts the type of information that can be satisfactorily represented; e.g., it makes it impossible to represent adequately partial information about an unknown chance distribution P such as 0.6 ≤ P(a) ≤ 0.8. The strict Bayesian requirement that an epistemic state be a single probability function seems unreasonable. A natural extension of the Bayesian theory is thus to allow sets of probability functions, to consider constraints and bounds on these, and to calculate ...
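One simple way to revise such a set, among the rules this kind of work compares, is element-wise Bayesian conditioning. The sketch below uses invented numbers: the constraint 0.6 ≤ P(a) ≤ 0.8 is represented by the two extreme points of a convex set over {a, ¬a}, each of which is updated by Bayes' rule against a hypothetical likelihood.

```python
# Sketch (hypothetical numbers): partial knowledge 0.6 <= P(a) <= 0.8
# represented by the extreme points of a convex set of distributions
# over {a, not_a}; each element is revised by Bayes' rule.
extreme_points = [
    {"a": 0.6, "not_a": 0.4},
    {"a": 0.8, "not_a": 0.2},
]
likelihood = {"a": 0.9, "not_a": 0.5}  # P(evidence | state), made up

def bayes(prior):
    """Bayes' rule applied to a single prior in the set."""
    unnorm = {s: prior[s] * likelihood[s] for s in prior}
    z = sum(unnorm.values())
    return {s: v / z for s, v in unnorm.items()}

# Revising the set element-wise yields new bounds on P(a).
posterior_set = [bayes(p) for p in extreme_points]
bounds = [p["a"] for p in posterior_set]
print(f"revised P(a) lies in [{min(bounds):.3f}, {max(bounds):.3f}]")
```

Since the posterior probability of a is monotone in its prior probability here, conditioning the two extreme points suffices to recover the revised bounds.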
Hybrid Probabilistic Logic Programs
 Journal of Logic Programming
, 2000
Abstract

Cited by 13 (2 self)
There are many applications where the precise time at which an event will occur (or has occurred) is uncertain. Temporal probabilistic logic programs (TPLPs) allow a programmer to express knowledge about such events. In this paper, we develop a model theory, fixpoint theory, and proof theory for TPLPs, and show that the fixpoint theory may be used to enumerate consequences of a TPLP in a sound and complete manner. Likewise, the proof theory provides a sound and complete inference system. Last, but not least, we provide complexity results for TPLPs, showing, in particular, that reasonable classes of TPLPs have polynomial data complexity. 1 Introduction. There are a vast number of applications where uncertainty and time are indelibly intertwined. For example, the US Postal Service (USPS) as well as most commercial shippers have detailed statistics on how long shipments take to reach their destinations. Likewise, we are working on a Viennese historical land deed application where the precise time at which certain properties passed from one owner to another is also highly uncertain. Historical radiocarbon dating methods are yet another source of uncertainty, providing approximate information about when a piece was created. Logical reasoning in situations involving temporal uncertainty is definitely important. For example, an individual querying the USPS express mail tracking system may want to know when he can expect his package to be delivered today; he may then choose to stay home during the period when the probability of delivery seems very high, and leave a note authorizing the delivery official to leave the package by the door at other times.
Interpretations of belief functions in the theory of rough sets
 Information Sciences
, 1998
Abstract

Cited by 12 (3 self)
This paper reviews and examines interpretations of belief functions in the theory of rough sets with a finite universe. The concept of standard rough set algebras is generalized in two directions. One is based on the use of non-equivalence relations. The other is based on relations over two universes, which leads to the notion of interval algebras. Pawlak rough set algebras may be used to interpret belief functions whose focal elements form a partition of the universe. Generalized rough set algebras using non-equivalence relations may be used to interpret belief functions which have fewer than |U| focal elements, where |U| is the cardinality of the universe U on which the belief functions are defined. Interval algebras may be used to interpret any belief functions.
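The Pawlak case can be sketched concretely. In this illustrative example (universe, partition, and masses all invented), the blocks of an equivalence-relation partition act as focal elements; belief of a set X is the mass of blocks inside X (its lower approximation), plausibility the mass of blocks meeting X (its upper approximation).

```python
# Sketch (illustrative numbers): the rough-set reading of belief
# functions when focal elements partition a finite universe.
universe = {1, 2, 3, 4, 5, 6}
partition = [frozenset({1, 2}), frozenset({3, 4}), frozenset({5, 6})]
m = {partition[0]: 0.5, partition[1]: 0.3, partition[2]: 0.2}  # block masses

def bel(X):
    """Belief: total mass of blocks contained in X (lower approximation)."""
    return sum(m[b] for b in partition if b <= X)

def pl(X):
    """Plausibility: total mass of blocks meeting X (upper approximation)."""
    return sum(m[b] for b in partition if b & X)

X = {1, 2, 3}
print(bel(X), pl(X))  # belief never exceeds plausibility
```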
Sleeping Beauty Reconsidered: Conditioning and Reflection in Asynchronous Systems
 Proceedings of the Twentieth Conference on Uncertainty in AI, AUAI
, 2004
Abstract

Cited by 12 (2 self)
A careful analysis of conditioning in the Sleeping Beauty problem is carried out, using the formal model for reasoning about knowledge and probability developed by Halpern and Tuttle. While the Sleeping Beauty problem has been viewed as revealing problems with conditioning in the presence of imperfect recall, the analysis done here reveals that the problems are not so much due to imperfect recall as to asynchrony. The implications of this analysis for van Fraassen's Reflection Principle and Savage's Sure-Thing Principle are considered.
The Shape of Incomplete Preferences
, 2006
Abstract

Cited by 12 (2 self)
Incomplete preferences provide the epistemic foundation for models of imprecise subjective probabilities and utilities that are used in robust Bayesian analysis and in theories of bounded rationality. This paper presents a simple axiomatization of incomplete preferences and characterizes the shape of their representing sets of probabilities and utilities. Deletion of the completeness assumption from the axiom system of Anscombe and Aumann yields preferences represented by a convex set of state-dependent expected utilities, of which at least one must be a probability/utility pair. A strengthening of the state-independence axiom is needed to obtain a representation purely in terms of a set of probability/utility pairs.