Results 1 - 4 of 4
Probabilistic Horn abduction and Bayesian networks
 Artificial Intelligence, 1993
Abstract

Cited by 298 (37 self)
This paper presents a simple framework for Horn-clause abduction, with probabilities associated with hypotheses. The framework incorporates assumptions about the rule base and independence assumptions amongst hypotheses. It is shown how any probabilistic knowledge representable in a discrete Bayesian belief network can be represented in this framework. The main contribution is in finding a relationship between logical and probabilistic notions of evidential reasoning. This provides a useful representation language in its own right, providing a compromise between heuristic and epistemic adequacy. It also shows how Bayesian networks can be extended beyond a propositional language. This paper also shows how a language with only (unconditionally) independent hypotheses can represent any probabilistic knowledge, and argues that it is better to invent new hypotheses to explain dependence rather than having to worry about dependence in the language.
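To make the framework concrete, here is a minimal Python sketch of Horn-clause abduction with independent, probability-weighted hypotheses. The hypothesis names, priors, and rules below are invented toy values, not taken from the paper:

```python
from itertools import combinations

# Hypothetical hypotheses with prior probabilities, assumed
# unconditionally independent (as the framework requires).
hypotheses = {"tampering": 0.02, "fire": 0.01, "leaving_ok": 0.88}

# Horn rules: head <- body; a head may have several alternative bodies.
rules = {
    "alarm": [{"tampering"}, {"fire"}],
    "leaving": [{"alarm", "leaving_ok"}],
}

def provable(goal, assumed):
    """True if `goal` follows from the rules given the assumed hypotheses."""
    if goal in assumed:
        return True
    return any(all(provable(b, assumed) for b in body)
               for body in rules.get(goal, []))

def explanations(goal):
    """Minimal hypothesis sets that entail `goal`, with their probabilities
    (the product of the member priors, by independence)."""
    found = []
    names = list(hypotheses)
    for r in range(1, len(names) + 1):
        for subset in map(set, combinations(names, r)):
            if provable(goal, subset) and not any(f <= subset for f, _ in found):
                prob = 1.0
                for h in subset:
                    prob *= hypotheses[h]
                found.append((subset, prob))
    return found

for expl, p in explanations("leaving"):
    print(sorted(expl), p)  # each minimal explanation with its probability
```

Under the framework's disjointness assumptions, the prior probability of the goal can then be approximated by summing the probabilities of its minimal explanations.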
Representing Diagnosis Knowledge
 Annals of Mathematics and Artificial Intelligence, 1994
Abstract

Cited by 18 (2 self)
This paper considers the representation problem: namely, how to go from an abstract problem to a formal representation of the problem. We consider this for two conceptions of logic-based diagnosis, namely abductive and consistency-based diagnosis. We show how to represent diagnostic problems that can be conceptualised causally in each of the frameworks, and show that both representations of the same problems give the same answers. This is a local transformation that allows for an expressive (albeit propositional) language for giving the constraints on what symptoms and causes can coexist, including non-strict causation. This non-strict causation can be represented in each framework without adding special reasoning constructs to either framework. This is presented as a starting point for a study of the representation problem in diagnosis, rather than as an end in itself.

1 Introduction

This paper defines an abstract "knowledge representation" problem and considers the problem of represe...
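The claimed agreement between the two conceptions can be illustrated on a tiny invented example (the `cold`/`flu`/`cough` theory below is hypothetical, not from the paper): under the completed causal theory cough <-> cold OR flu, the abductive diagnoses of an observed cough and the consistency-based diagnoses coincide:

```python
from itertools import product

causes = ["cold", "flu"]

def causes_cough(a):
    """Causal model: cough is produced iff at least one cause is active."""
    return a["cold"] or a["flu"]

assignments = [dict(zip(causes, vals))
               for vals in product([False, True], repeat=2)]

# Abductive diagnoses: cause assignments whose causal consequences
# entail the observed symptom.
abductive = [a for a in assignments if causes_cough(a)]

# Consistency-based diagnoses: assignments that do not contradict the
# observation under the completed theory (cough <-> cold OR flu).
observed_cough = True
consistent = [a for a in assignments
              if causes_cough(a) == observed_cough]

assert abductive == consistent
```

The equivalence hinges on the completion: without it, consistency-based diagnosis would also admit the assignment where no cause is active.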
Representing Bayesian networks within probabilistic Horn abduction
 In Proc. Seventh Conf. on Uncertainty in Artificial Intelligence, 1991
Abstract

Cited by 13 (4 self)
This paper presents a simple framework for Horn-clause abduction, with probabilities associated with hypotheses. It is shown how this representation can represent any probabilistic knowledge representable in a Bayesian belief network. The main contributions are in finding a relationship between logical and probabilistic notions of evidential reasoning. This can be used as a basis for a new way to implement Bayesian networks that allows for approximations to the value of the posterior probabilities, and also points to a way that Bayesian networks can be extended beyond a propositional language.
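The translation direction the abstract describes can be sketched on an invented two-node network A -> B (the CPT values and hypothesis names below are hypothetical): each conditional probability table row becomes an independent hypothesis, and the probability of an explanation then equals the corresponding joint probability of the network:

```python
# Hypothetical two-node Bayesian network A -> B.
p_a = 0.3                                # P(a)
p_b_given_a = {True: 0.9, False: 0.2}    # P(b | A)

# A probabilistic-Horn-abduction encoding: one independent hypothesis
# per CPT row, with rules such as
#   a <- h_a                 P(h_a)    = P(a)
#   b <- a, h_b_a            P(h_b_a)  = P(b | a)
#   b <- not_a, h_b_na       P(h_b_na) = P(b | not a)
hyp = {
    "h_a": p_a, "h_not_a": 1 - p_a,
    "h_b_a": p_b_given_a[True], "h_b_na": p_b_given_a[False],
}

def joint(a, b):
    """Joint P(A=a, B=b) read directly off the network."""
    pb = p_b_given_a[a]
    return (p_a if a else 1 - p_a) * (pb if b else 1 - pb)

# The explanation {h_a, h_b_a} of observing a and b has probability
# equal to the network's joint probability P(a, b).
assert abs(joint(True, True) - hyp["h_a"] * hyp["h_b_a"]) < 1e-12
```

Summing explanation probabilities then recovers marginals and, after normalisation, posteriors; pruning low-probability explanations gives the approximation scheme the abstract mentions.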
Learning, Bayesian Probability, Graphical Models, and Abduction
 Abduction and Induction: Essays on their Relation and Integration, Chapter 10, 1998
Abstract

Cited by 7 (0 self)
In this chapter I review Bayesian statistics as used for induction and relate it to logic-based abduction. Much reasoning under uncertainty, including induction, is based on Bayes' rule. Bayes' rule is interesting precisely because it provides a mechanism for abduction. I review work of Buntine that argues that much of the work on Bayesian learning can be best viewed in terms of graphical models such as Bayesian networks, and review previous work of Poole that relates Bayesian networks to logic-based abduction. This lets us see how much of the work on induction can be viewed in terms of logic-based abduction. I then explore what this means for extending logic-based abduction to richer representations, such as learning decision trees with probabilities at the leaves. Much of this paper is tutorial in nature; both the probabilistic and logic-based notions of abduction and induction are introduced and motivated.

1 Introduction

This paper explores the relationship between learning (induct...
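The mechanism the abstract refers to is Bayes' rule itself: for observed evidence e and a candidate hypothesis (explanation) h,

```latex
P(h \mid e) \;=\; \frac{P(e \mid h)\,P(h)}{P(e)},
\qquad\text{where}\qquad
P(e) \;=\; \sum_{h'} P(e \mid h')\,P(h').
```

The posterior ranks competing explanations of e by how well each predicts the evidence, weighted by its prior plausibility; abduction, on this reading, is the search for hypotheses of high posterior probability.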