Results 1 - 5 of 5
Soft Evidential Update for Probabilistic Multiagent Systems
International Journal of Approximate Reasoning, 2000
Abstract

Cited by 26 (5 self)
We address the problem of updating a probability distribution represented by a Bayesian network upon presentation of soft evidence. Our motivation ...
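The kind of soft evidential update described in this abstract is commonly illustrated with Jeffrey's rule of conditioning: revise the marginal of the observed variable while keeping the conditionals given it fixed. A minimal sketch on a toy two-variable joint distribution — the variables and numbers below are illustrative assumptions, not taken from the paper:

```python
# Jeffrey's rule on a toy joint distribution P(A, B): soft evidence
# revises the marginal of B to q while holding P(A | B) fixed.
# All names and numbers here are illustrative assumptions.

P = {  # joint P(A, B) as {(a, b): probability}
    ("a0", "b0"): 0.20, ("a0", "b1"): 0.30,
    ("a1", "b0"): 0.10, ("a1", "b1"): 0.40,
}

def marginal_b(P, b):
    """Marginal probability that B takes value b."""
    return sum(p for (_, bb), p in P.items() if bb == b)

def jeffrey_update(P, q):
    """Return the joint revised so that the marginal of B equals q."""
    return {(a, b): p * q[b] / marginal_b(P, b) for (a, b), p in P.items()}

q = {"b0": 0.1, "b1": 0.9}  # soft evidence: B is now believed 90% likely b1
P_new = jeffrey_update(P, q)
print(round(marginal_b(P_new, "b0"), 6))  # 0.1 — the revised marginal matches q
```

Hard evidence is the limiting case where q puts all its mass on one value of B, which reduces the rule to ordinary conditioning.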
A New Criterion for Comparing Fuzzy Logics for Uncertain Reasoning
, 1996
Abstract

Cited by 9 (1 self)
A new criterion is introduced for judging the suitability of various 'fuzzy logics' for practical uncertain reasoning in a probabilistic world, and the relationship of this criterion to several established criteria, and its consequences for truth functional belief, are investigated. Introduction: It is a rather widespread assumption in uncertain reasoning, and one that we shall make for the purpose of this paper, that a piece of uncertain knowledge can be adequately captured by attaching a real number (signifying the degree of uncertainty) on some scale to some unequivocal statement or conditional, and that an intelligent agent's knowledge base consists of a large, but nevertheless finite, set K of such expressions. Whether or not this is the correct picture for animate intelligent agents such as ourselves is, perhaps, questionable, but it is certainly the case that many expert systems (which one might feel should be included under the vague title of 'intelligent agent') have, by design...
On the Emergence of Reasons in Inductive Logic
Journal of the IGPL, 2001
Abstract

Cited by 2 (2 self)
We apply methods of abduction derived from propositional probabilistic reasoning to predicate probabilistic reasoning, in particular inductive logic, by treating finite predicate knowledge bases as potentially infinite propositional knowledge bases. It is shown that for a range of predicate knowledge bases (such as those typically associated with inductive reasoning) and several key propositional inference processes (in particular the Maximum Entropy Inference Process) this procedure is well defined, and furthermore yields an explanation for the validity of the induction in terms of 'reasons'. Keywords: Inductive Logic, Probabilistic Reasoning, Abduction, Maximum Entropy, Uncertain Reasoning. Motivation: Consider the following situation. I am sitting by a bend in a road and I start to wonder how likely it is that the next car which passes will skid on this bend. I have some knowledge which seems relevant; for example, I know that if there is ice on the road then there is a good chance of a skid, and similarly if the bend is unsigned, the camber adverse, etc. I possibly also have some knowledge of how likely it is that there is ice on the road, how likely it is that the bend is unsigned (possibly conditioned on the iciness of the road), etc. Notice that this is generic knowledge which applies equally to any potential passing car.
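The road-and-skid motivation amounts to a one-line application of the law of total probability once numbers are attached to the generic knowledge; the probabilities below are illustrative assumptions, not values from the paper:

```python
# P(skid) for the next car, marginalizing over the unknown road state.
# All probabilities are illustrative assumptions.
p_ice = 0.2                 # how likely the road is icy
p_skid_given_ice = 0.6      # "if there is ice there is a good chance of a skid"
p_skid_given_no_ice = 0.05  # skids are rare on a clear road

# Law of total probability over the two road states.
p_skid = p_skid_given_ice * p_ice + p_skid_given_no_ice * (1 - p_ice)
print(round(p_skid, 3))  # 0.6*0.2 + 0.05*0.8 = 0.16
```

Observing several cars pass (and skid or not) would then revise the belief by conditioning on those outcomes — the inductive step the paper sets out to formalize.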
On the Distribution of Natural Probability Functions
, 1999
Abstract
The purpose of this note is to describe the underlying insights and results obtained by the authors, and others, in a series of papers aimed at modelling the distribution of 'natural' probability functions, more precisely the probability functions on {0,1}^n which we encounter naturally in the real world as subjects for statistical inference, by identifying such functions with large, random sentences of the propositional calculus. We explain how this approach produces a robust parameterised family of priors, J_n, with several of the properties we might have hoped for in the context, for example marginalisation, invariance under (weak) renaming, and an emphasis on multivariate probability functions exhibiting high interdependence between features. Keywords: Prior probability, imprecise probability, random sentences, probabilistic reasoning, uncertain reasoning. Motivation: The motivation for the research described in this paper can, at least partly, be traced back to our experie...
Some Limit Theorems for ME, MD and ...
Abstract
We apply methods of abduction derived from propositional probabilistic reasoning to predicate probabilistic reasoning, in particular inductive logic, by treating finite predicate knowledge bases as potentially infinite propositional knowledge bases. Full and detailed proofs are given to show that for a range of predicate knowledge bases (such as those typically associated with inductive reasoning) and several key propositional inference processes (in particular the Maximum Entropy Inference Process) this procedure is well defined, and furthermore yields an explanation for the validity of the induction in terms of 'reasons'. Motivation: Consider the following situation. I am sitting by a bend in a road and I start to wonder how likely it is that the next car which passes will skid on this bend. I have some knowledge which seems relevant; for example, I know that if there is ice on the road then there is a good chance of a skid, and similarly if the bend is unsigned, the camber adverse, etc. I possibly also have some knowledge of how likely it is that there is ice on the road, how likely it is that the bend is unsigned (possibly conditioned on the iciness of the road), etc. Notice that this is generic knowledge which applies equally to any potential passing car. Armed with this knowledge base I may now form some opinion as to the likely outcome when the next car passes. Subsequently several cars pass by. I note the results and in consequence possibly revise my opinion as to the likelihood of the next car through skidding. Clearly we are all capable of forming opinions, or beliefs, in this way, but is it possible to formalize this inductive process, this pro... [Supported by an EPSRC Research Associateship; supported by an Egyptian Government Scholarship, File No. 7083.]
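The Maximum Entropy Inference Process named in these abstracts can be sketched at the propositional level (this is the general ME idea only, not the paper's predicate-level machinery): among all probability functions satisfying the constraints, ME selects the one of greatest Shannon entropy. With the single constraint P(A) = 0.3 over two atoms A and B, the ME solution leaves B uniform and independent of A, which even a coarse grid search recovers:

```python
import math

# Maximum Entropy over the four states of atoms A, B, subject to the
# single constraint P(A) = 0.3. Parameterize by x = P(A & B) and
# y = P(~A & B); the constraint then fixes the two remaining masses.
def entropy(dist):
    """Shannon entropy of a sequence of probabilities (0 log 0 = 0)."""
    return -sum(p * math.log(p) for p in dist if p > 0)

best_h, best = -1.0, None
n = 200
for i in range(n + 1):
    for j in range(n + 1):
        x = 0.3 * i / n          # P(A & B), at most P(A) = 0.3
        y = 0.7 * j / n          # P(~A & B), at most P(~A) = 0.7
        d = (x, 0.3 - x, y, 0.7 - y)  # (A&B, A&~B, ~A&B, ~A&~B)
        h = entropy(d)
        if h > best_h:
            best_h, best = h, d

print(tuple(round(p, 4) for p in best))  # (0.15, 0.15, 0.35, 0.35)
```

Because entropy is strictly concave, this maximizer is unique; the predicate-level results in the paper concern what happens to such ME answers as the finite knowledge bases grow toward their infinite propositional counterparts.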