Conservative inference rule for uncertain reasoning under incompleteness
 Journal of Artificial Intelligence Research
"... In this paper we formulate the problem of inference under incomplete information in very general terms. This includes modelling the process responsible for the incompleteness, which we call the incompleteness process. We allow the process ’ behaviour to be partly unknown. Then we use Walley’s theory ..."
Abstract

Cited by 19 (8 self)
 Add to MetaCart
In this paper we formulate the problem of inference under incomplete information in very general terms. This includes modelling the process responsible for the incompleteness, which we call the incompleteness process. We allow the process’s behaviour to be partly unknown. Then we use Walley’s theory of coherent lower previsions, a generalisation of the Bayesian theory to imprecision, to derive the rule to update beliefs under incompleteness that logically follows from our assumptions, and that we call the conservative inference rule. This rule has some remarkable properties: it is an abstract rule to update beliefs that can be applied in any situation or domain; it gives us the opportunity to be neither too optimistic nor too pessimistic about the incompleteness process, which is a necessary condition to draw reliable yet sufficiently strong conclusions; and it is a coherent rule, in the sense that it cannot lead to inconsistencies. We give examples to show how the new rule can be applied in expert systems, in parametric statistical inference, and in pattern classification, and discuss more generally the view of incompleteness processes defended here as well as some of its consequences.
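An illustrative sketch only (all names and numbers below are mine, not the paper's): in a toy finite setting, the conservative attitude described in this abstract amounts to taking the envelope of the Bayes posteriors over every precise outcome that could underlie a coarse observation, so that no particular behaviour of the incompleteness process is assumed.

```python
def bayes_posterior_expectation(prior, likelihood, outcome, gamble):
    """Posterior expectation of `gamble` after observing one precise outcome."""
    joint = {s: prior[s] * likelihood[(s, outcome)] for s in prior}
    evidence = sum(joint.values())
    return sum(joint[s] * gamble[s] for s in prior) / evidence

def conservative_inference(prior, likelihood, coarse_obs, gamble):
    """Lower/upper posterior expectation, vacuous about the incompleteness
    process: envelope over all precise outcomes compatible with the
    coarse observation `coarse_obs`."""
    values = [bayes_posterior_expectation(prior, likelihood, o, gamble)
              for o in coarse_obs]
    return min(values), max(values)
```

With two states and a coarse observation compatible with two precise outcomes, the rule returns the interval spanned by the two Bayes posteriors, assuming neither the most nor the least favourable missingness mechanism.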
Independent natural extension
In: IPMU 2010: Proceedings of the 13th Information Processing and Management of Uncertainty in Knowledge-Based Systems Conference, 2010
"... We introduce a general definition for the independence of a number of finitevalued variables, based on coherent lower previsions. Our definition has an epistemic flavour: it arises from personal judgements that a number of variables are irrelevant to one another. We show that a number of already ..."
Abstract

Cited by 17 (7 self)
 Add to MetaCart
We introduce a general definition for the independence of a number of finite-valued variables, based on coherent lower previsions. Our definition has an epistemic flavour: it arises from personal judgements that a number of variables are irrelevant to one another. We show that a number of already existing notions, such as strong independence, satisfy our definition. Moreover, there always is a least-committal independent model, for which we provide an explicit formula: the independent natural extension. Our central result is that the independent natural extension satisfies so-called marginalisation, associativity and strong factorisation properties. These allow us to relate our research to more traditional ways of defining independence based on factorisation.
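A small numerical check of the factorisation idea mentioned in this abstract, under assumptions of mine: each marginal credal set is given by finitely many mass functions, and the joint is the strong product (lower envelope of all product mass functions), one of the models the paper shows to satisfy its definition. For non-negative gambles f(x) and g(y), the lower expectation then factorises as the product of the marginal lower expectations.

```python
def lower_exp(credal_set, gamble):
    """Lower expectation: minimum over the mass functions in the set."""
    return min(sum(p[x] * gamble[x] for x in gamble) for p in credal_set)

def strong_product_lower(credal1, credal2, f, g):
    """Lower envelope over all product mass functions p1 x p2 of
    the expectation of the product gamble f(x) * g(y)."""
    return min(sum(p1[x] * p2[y] * f[x] * g[y] for x in f for y in g)
               for p1 in credal1 for p2 in credal2)
```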
Epistemic irrelevance in credal nets: the case of imprecise Markov trees
2010
"... We focus on credal nets, which are graphical models that generalise Bayesian nets to imprecise probability. We replace the notion of strong independence commonly used in credal nets with the weaker notion of epistemic irrelevance, which is arguably more suited for a behavioural theory of probability ..."
Abstract

Cited by 16 (11 self)
 Add to MetaCart
We focus on credal nets, which are graphical models that generalise Bayesian nets to imprecise probability. We replace the notion of strong independence commonly used in credal nets with the weaker notion of epistemic irrelevance, which is arguably more suited for a behavioural theory of probability. Focusing on directed trees, we show how to combine the given local uncertainty models in the nodes of the graph into a global model, and we use this to construct and justify an exact message-passing algorithm that computes updated beliefs for a variable in the tree. The algorithm, which is linear in the number of nodes, is formulated entirely in terms of coherent lower previsions, and is shown to satisfy a number of rationality requirements. We supply examples of the algorithm’s operation, and report an application to online character recognition that illustrates the advantages of our approach for prediction. We comment on the perspectives opened by the availability, for the first time, of a truly efficient algorithm based on epistemic irrelevance.
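A hedged sketch of the simplest instance of the kind of recursion behind such a message-passing scheme: an imprecise Markov chain, i.e. a tree in which every node has a single child. For illustration only, each row's credal set is represented here by finitely many candidate transition vectors; the paper's algorithm works with coherent lower previsions and epistemic irrelevance in full generality.

```python
def lower_transition(credal_rows, h):
    """Apply the lower transition operator to a gamble h on the states:
    for each state, minimise the expected value of h over the candidate
    transition rows for that state."""
    return {x: min(sum(p[y] * h[y] for y in h) for p in rows)
            for x, rows in credal_rows.items()}

def lower_expectation_after(credal_rows, h, steps):
    """Backward recursion: lower expectation of h(X_n) given the initial
    state, obtained by iterating the lower transition operator.
    The cost is linear in the number of steps (nodes)."""
    for _ in range(steps):
        h = lower_transition(credal_rows, h)
    return h
```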
Updating coherent previsions on finite spaces
 Fuzzy Sets and Systems
"... Abstract. We compare the different notions of conditional coherence within the behavioural theory of imprecise probabilities when all the spaces are finite. We show that the differences between the notions are due to conditioning on sets of (lower, and in some cases upper) probability zero. Next, we ..."
Abstract

Cited by 7 (5 self)
 Add to MetaCart
(Show Context)
We compare the different notions of conditional coherence within the behavioural theory of imprecise probabilities when all the spaces are finite. We show that the differences between the notions are due to conditioning on sets of (lower, and in some cases upper) probability zero. Next, we characterise the range of coherent extensions in the finite case, proving that the greatest coherent extensions can always be calculated using the notion of regular extension, and we discuss the extensions of our results to infinite spaces.
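A sketch of the regular extension in the simplest setting I can state it for (my own encoding, not the paper's formulation): a credal set given by finitely many mass functions on a finite space. Each mass function that gives the conditioning event positive probability is conditioned by Bayes' rule, and the lower envelope is taken; mass functions assigning the event zero probability are simply ignored.

```python
def regular_extension(credal_set, gamble, event):
    """Lower conditional expectation of `gamble` given `event` (a set of
    states), via regular extension over a finite credal set."""
    values = []
    for p in credal_set:
        pB = sum(p[x] for x in event)
        if pB > 0:  # skip mass functions giving the event probability zero
            values.append(sum(p[x] * gamble[x] for x in event) / pB)
    if not values:
        raise ValueError("event has upper probability zero")
    return min(values)
```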
Conglomerable Natural Extension
In: 7th International Symposium on Imprecise Probability: Theories and Applications, Innsbruck, Austria, 2011
"... We study the weakest conglomerable model that is implied by desirability or probability assessments: the conglomerable natural extension. We show that taking the natural extension of the assessments while imposing conglomerability—the procedure adopted in Walley’s theory—does not yield, in general, ..."
Abstract

Cited by 7 (4 self)
 Add to MetaCart
(Show Context)
We study the weakest conglomerable model that is implied by desirability or probability assessments: the conglomerable natural extension. We show that taking the natural extension of the assessments while imposing conglomerability—the procedure adopted in Walley’s theory—does not yield, in general, the conglomerable natural extension (but it does so in the case of the marginal extension). Iterating this process produces a sequence of models that approach the conglomerable natural extension, although it is not known, at this point, whether it is attained in the limit. We give sufficient conditions for this to happen in some special cases, and study the differences between working with coherent sets of desirable gambles and coherent lower previsions. Our results indicate that it might be necessary to rethink the foundations of Walley’s theory of coherent conditional lower previsions for infinite partitions of conditioning events.
On the Connection between Probability Boxes and Possibility Measures
"... ABSTRACT. We explore the relationship between possibility measures (supremum preserving normed measures) and pboxes (pairs of cumulative distribution functions) on totally preordered spaces, extending earlier work in this direction by De Cooman and Aeyels, among others. We start by demonstrating th ..."
Abstract

Cited by 5 (2 self)
 Add to MetaCart
(Show Context)
We explore the relationship between possibility measures (supremum-preserving normed measures) and p-boxes (pairs of cumulative distribution functions) on totally preordered spaces, extending earlier work in this direction by De Cooman and Aeyels, among others. We start by demonstrating that only those p-boxes that have a 0–1-valued lower or upper cumulative distribution function can be possibility measures, and we derive expressions for their natural extension in this case. Next, we establish necessary and sufficient conditions for a p-box to be a possibility measure. Finally, we show that almost every possibility measure can be modelled by a p-box, simply by ordering elements by increasing possibility. Hence, any techniques for p-boxes can be readily applied to possibility measures. We demonstrate this by deriving joint possibility measures from marginals, under varying assumptions of independence, using a technique known for p-boxes. Doing so, we arrive at a new rule of combination for possibility measures for the independent case.
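An illustrative reconstruction of the ordering construction described in this abstract, under my own encoding: a possibility distribution pi on a finite set (with maximum 1) is turned into a p-box by sorting elements by increasing possibility; the upper CDF is then the running maximum of pi, and the lower CDF comes from the conjugate necessity measure. Consistently with the first result quoted above, the lower CDF produced this way is 0–1-valued.

```python
def pbox_from_possibility(pi):
    """Build a p-box (order, lower CDF, upper CDF) from a normalised
    possibility distribution `pi` (a dict mapping elements to [0, 1])."""
    order = sorted(pi, key=pi.get)                    # increasing possibility
    upper, lower = {}, {}
    for i, x in enumerate(order):
        # upper CDF: possibility of the down-set {y : y <= x}
        upper[x] = max(pi[y] for y in order[:i + 1])
        # lower CDF: necessity of the same down-set, 1 - Pi({y : y > x})
        tail = order[i + 1:]
        lower[x] = 1 - max((pi[y] for y in tail), default=0.0)
    return order, lower, upper
```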
Probability boxes on totally preordered spaces for multivariate modelling
 International Journal of Approximate Reasoning
"... ar ..."
(Show Context)
Notes on Desirability and Conditional Lower Previsions
"... Abstract. We detail the relationship between sets of desirable gambles and conditional lower previsions. The former is one the most general models of uncertainty. The latter corresponds to Walley’s celebrated theory of imprecise probability. We consider two avenues: when a collection of conditional ..."
Abstract

Cited by 4 (2 self)
 Add to MetaCart
(Show Context)
We detail the relationship between sets of desirable gambles and conditional lower previsions. The former is one of the most general models of uncertainty. The latter corresponds to Walley’s celebrated theory of imprecise probability. We consider two avenues: when a collection of conditional lower previsions is derived from a set of desirable gambles, and the converse. In either case, we relate the properties of the derived model to those of the originating one. Our results constitute basic tools for moving from one formalism to the other, and thus for taking advantage of work done on the two fronts.
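A sketch of one direction discussed in this paper, in a form I can make executable: the (unconditional) lower prevision derived from a set of desirable gambles is the supremum of the prices mu for which the gamble f - mu remains desirable. The set is given here as a membership oracle, and the supremum is bracketed by bisection; both choices are mine, for illustration.

```python
def lower_prevision(is_desirable, gamble, lo=-1e3, hi=1e3, tol=1e-7):
    """sup { mu : gamble - mu is desirable }, found by bisection on
    [lo, hi]; `is_desirable` is a membership oracle for the set of
    desirable gambles (dicts mapping states to payoffs)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if is_desirable({x: gamble[x] - mid for x in gamble}):
            lo = mid   # mid is an acceptable buying price
        else:
            hi = mid
    return lo
```

For the precise set of gambles with positive expectation under a fixed mass function, this recovers the expectation itself.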
Conditional Models: Coherence and Inference through Sequences of Joint Mass Functions
"... Abstract. We call a conditional model any set of statements made of conditional probabilities or expectations. We take conditional models as primitive compared to unconditional probability, in the sense that conditional statements do not need to be derived from an unconditional probability. We focus ..."
Abstract

Cited by 3 (2 self)
 Add to MetaCart
(Show Context)
We call a conditional model any set of statements made of conditional probabilities or expectations. We take conditional models as primitive compared to unconditional probability, in the sense that conditional statements do not need to be derived from an unconditional probability. We focus on two problems: (coherence) giving conditions to guarantee that a conditional model is self-consistent; (inference) delivering methods to derive new probabilistic statements from a self-consistent conditional model. We address these problems in the case where the probabilistic statements can be specified imprecisely through sets of probabilities, while restricting attention to finite spaces of possibilities. Using Walley’s theory of coherent lower previsions, we fully characterise the question of coherence, and specialise it for the case of precisely specified probabilities, which is the most common case addressed in the literature. This shows that coherent conditional models are equivalent to sequences of (possibly sets of) unconditional mass functions. In turn, this implies that the inferences from a conditional model are the limits of the conditional inferences obtained by applying Bayes’ rule, when possible, to the elements of the sequence. In doing so, we unveil the tight connection between conditional models and zero-probability events.
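A toy version of the equivalence stated in this abstract, with my own encoding: a precise conditional model represented as a finite sequence of mass functions, one per "layer" of otherwise zero-probability states. The conditional probability of A given B is obtained by applying Bayes' rule to the first element of the sequence that gives B positive mass, so that conditioning stays well-defined on zero-probability events.

```python
def conditional(layers, A, B):
    """P(A | B) from a sequence of mass functions (dicts over a finite
    space); A and B are sets of states. Bayes' rule is applied to the
    first layer under which B has positive probability."""
    for m in layers:
        mB = sum(m[x] for x in B)
        if mB > 0:
            return sum(m[x] for x in A & B) / mB
    raise ValueError("B has zero probability at every layer")
```

Even though the first layer gives the conditioning event probability zero, the second layer still determines the conditional inference.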