Results 1–10 of 23
Abduction in Logic Programming
Abstract

Cited by 616 (76 self)
Abduction in Logic Programming started in the late 80s, early 90s, in an attempt to extend logic programming into a framework suitable for a variety of problems in Artificial Intelligence and other areas of Computer Science. This paper aims to chart out the main developments of the field over the last ten years and to take a critical view of these developments from several perspectives: logical, epistemological, computational and suitability to application. The paper attempts to expose some of the challenges and prospects for the further development of the field.
Lifted first-order probabilistic inference
 In Proceedings of IJCAI-05, 19th International Joint Conference on Artificial Intelligence, 2005
Abstract

Cited by 125 (8 self)
Most probabilistic inference algorithms are specified and processed on a propositional level. In the last decade, many proposals for algorithms accepting first-order specifications have been presented, but in the inference stage they still operate on a mostly propositional representation level. [Poole, 2003] presented a method to perform inference directly on the first-order level, but this method is limited to special cases. In this paper we present the first exact inference algorithm that operates directly on a first-order level, and that can be applied to any first-order model (specified in a language that generalizes undirected graphical models). Our experiments show superior performance in comparison with propositional exact inference.
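The counting idea that makes lifted inference pay off can be shown with a minimal sketch (this is an illustration of the general principle, not the paper's first-order variable elimination algorithm; the model and numbers are made up): when n individuals are interchangeable, worlds can be grouped by counts instead of enumerated one by one.

```python
from itertools import product
from math import comb

# Hypothetical model: n interchangeable individuals, each has a property
# independently with probability p; query P(at least k individuals have it).
p, n, k = 0.3, 10, 3

def propositional(p, n, k):
    """Enumerate all 2**n propositional worlds explicitly."""
    total = 0.0
    for world in product([0, 1], repeat=n):
        weight = 1.0
        for x in world:
            weight *= p if x else (1 - p)
        if sum(world) >= k:
            total += weight
    return total

def lifted(p, n, k):
    """Individuals are interchangeable, so group worlds by the *count*
    of positives: n + 1 cases instead of 2**n worlds."""
    return sum(comb(n, m) * p**m * (1 - p)**(n - m) for m in range(k, n + 1))

print(propositional(p, n, k))
print(lifted(p, n, k))
```

Both computations return the same probability, but the lifted one scales linearly in n where the propositional one is exponential.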
Mini-buckets: A general scheme for bounded inference
 Journal of the ACM (JACM)
Abstract

Cited by 72 (24 self)
This article presents a class of approximation algorithms that extend the idea of bounded-complexity inference, inspired by successful constraint propagation algorithms, to probabilistic inference and combinatorial optimization. The idea is to bound the dimensionality of dependencies created by inference algorithms. This yields a parameterized scheme, called mini-buckets, that offers an adjustable tradeoff between accuracy and efficiency. The mini-bucket approach to optimization problems, such as finding the most probable explanation (MPE) in Bayesian networks, generates both an approximate solution and bounds on the solution quality. We present empirical results demonstrating successful performance of the proposed approximation scheme for the MPE task, both on randomly generated problems and on realistic domains such as medical diagnosis and probabilistic decoding.
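The core inequality behind the mini-bucket bound, max_x f(x)g(x) <= (max_x f(x))(max_x g(x)), can be sketched on a single bucket (a toy illustration with hypothetical binary factors, not the full algorithm):

```python
# Two hypothetical binary factors that both mention variable X:
# phi1(X, A) and phi2(X, B), stored as dicts keyed by assignments.
phi1 = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.4, (1, 1): 0.6}  # keyed (x, a)
phi2 = {(0, 0): 0.2, (0, 1): 0.8, (1, 0): 0.7, (1, 1): 0.3}  # keyed (x, b)

def eliminate_exact(a, b):
    """Exact MPE message: max over x of the joint product (touches 3 variables)."""
    return max(phi1[(x, a)] * phi2[(x, b)] for x in (0, 1))

def eliminate_mini(a, b):
    """Mini-bucket message: split the bucket into {phi1} and {phi2} and
    max out X in each mini-bucket separately (arity bounded at 2).
    Since max_x f(x)g(x) <= (max_x f(x)) * (max_x g(x)),
    this upper-bounds the exact message."""
    return max(phi1[(x, a)] for x in (0, 1)) * max(phi2[(x, b)] for x in (0, 1))

for a in (0, 1):
    for b in (0, 1):
        assert eliminate_mini(a, b) >= eliminate_exact(a, b)
```

The exact message requires reasoning about all three variables jointly; the mini-bucket version never builds a function over more than two, which is the dimensionality bound the abstract refers to.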
Value Elimination: Bayesian Inference via Backtracking Search
 In UAI-03, 2003
Abstract

Cited by 49 (2 self)
We present Value Elimination, a new algorithm for Bayesian inference. Given the same variable ordering information, Value Elimination can achieve performance that is within a constant factor of variable elimination or recursive conditioning, and on some problems it can perform exponentially better, irrespective of the variable ordering used by these algorithms.
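The idea of computing a Bayesian network marginal by backtracking search over variable values can be sketched on a two-node network (a minimal illustration with made-up CPT numbers; it omits the pruning, caching, and dependency-directed backtracking that make Value Elimination competitive):

```python
# Toy Bayesian network A -> B with CPTs as dicts (hypothetical numbers).
p_a = {0: 0.6, 1: 0.4}                            # P(A)
p_b_given_a = {(0, 0): 0.9, (1, 0): 0.1,          # P(B | A), keyed (b, a)
               (0, 1): 0.3, (1, 1): 0.7}

def prob_b(b_query):
    """P(B = b_query) by depth-first search over variable values:
    branch on A, then on B, multiplying in CPT entries along each
    branch and summing the weights of the consistent leaves."""
    total = 0.0
    for a in (0, 1):                  # branch on A
        for b in (0, 1):              # branch on B
            if b != b_query:
                continue              # prune branches inconsistent with the query
            total += p_a[a] * p_b_given_a[(b, a)]
    return total
```

Each leaf of the search tree is a full assignment whose weight is a product of CPT entries; summing consistent leaves recovers the marginal, e.g. `prob_b(1)` is 0.6·0.1 + 0.4·0.7.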
A Survey of Algorithms for Real-Time Bayesian Network Inference
 In the joint AAAI-02/KDD-02/UAI-02 Workshop on Real-Time Decision Support and Diagnosis Systems, 2002
Abstract

Cited by 48 (2 self)
As Bayesian networks are applied to more complex and realistic real-world applications, the development of more efficient inference algorithms working under real-time constraints is becoming more and more important. This paper presents a survey of various exact and approximate Bayesian network inference algorithms. In particular, previous research on real-time inference is reviewed. It provides a framework for understanding these algorithms and the relationships between them. Some important issues in real-time Bayesian network inference are also discussed.
The Independent Choice Logic and Beyond
Abstract

Cited by 31 (5 self)
The Independent Choice Logic began in the early 90s as a way to combine logic programming and probability into a coherent framework. The idea of the Independent Choice Logic is straightforward: there is a set of independent choices with a probability distribution over each choice, and a logic program that gives the consequences of the choices. There is a measure over possible worlds that is defined by the probabilities of the independent choices, and what is true in each possible world is given by the choices made in that world and the logic program. ICL is interesting because it is a simple, natural and expressive representation of rich probabilistic models. This paper gives an overview of the work done over the last decade and a half, and points towards the considerable work ahead, particularly in the areas of lifted inference and the problems of existence and identity.
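The semantics just described can be sketched directly for a tiny propositional instance (the choices, atoms, and probabilities here are made up for illustration): independent choices define a measure over total selections, and a rule determines what holds in each resulting world.

```python
from itertools import product

# Two hypothetical independent choices; each is a set of alternatives
# with a probability distribution over them.
choices = {
    "c1": [("a", 0.7), ("not_a", 0.3)],
    "c2": [("b", 0.4), ("not_b", 0.6)],
}

def holds_q(selection):
    """Stand-in for the logic program rule  q <- a & b."""
    return selection["c1"] == "a" and selection["c2"] == "b"

def prob_q():
    """P(q): sum the measures of the possible worlds (one world per
    total selection of alternatives) in which the program makes q true."""
    total = 0.0
    names = list(choices)
    for picks in product(*(choices[n] for n in names)):
        selection = {n: atom for n, (atom, _) in zip(names, picks)}
        weight = 1.0
        for _, p in picks:
            weight *= p              # choices are independent
        if holds_q(selection):
            total += weight
    return total
```

Here `prob_q()` is 0.7 · 0.4, the measure of the single world in which both `a` and `b` were selected.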
Context-specific approximation in probabilistic inference
 In Proceedings of Uncertainty in Artificial Intelligence (UAI), 1998
Abstract

Cited by 13 (0 self)
There is evidence that the numbers in probabilistic inference don’t really matter. This paper considers the idea that we can make a probabilistic model simpler by making fewer distinctions. Unfortunately, the level of a Bayesian network seems too coarse; it is unlikely that a parent will make little difference for all values of the other parents. In this paper we consider an approximation scheme where distinctions can be ignored in some contexts, but not in other contexts. We elaborate on a notion of a parent context that allows a structured context-specific decomposition of a probability distribution and the associated probabilistic inference scheme called probabilistic partial evaluation (Poole 1997). This paper shows a way to simplify a probabilistic model by ignoring distinctions which have similar probabilities, a method to exploit the simpler model, a bound on the resulting errors, and some preliminary empirical results on simple networks.
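The idea of ignoring a distinction only in some contexts can be sketched with a single CPT (the numbers are hypothetical, and this shows only the merge-and-bound step, not the full probabilistic partial evaluation scheme): in the context A=1 the parent B makes little difference, so its two rows are merged, with a simple bound on the error introduced.

```python
# Hypothetical CPT for P(X=1 | A, B), keyed by (a, b).
cpt = {(0, 0): 0.20, (0, 1): 0.70,   # in context A=0, B matters a lot
       (1, 0): 0.58, (1, 1): 0.62}   # in context A=1, B barely matters

# Context-specific approximation: ignore B only in the context A=1,
# replacing both rows by their midpoint.  The A=0 rows are kept exact.
approx = dict(cpt)
mid = (cpt[(1, 0)] + cpt[(1, 1)]) / 2
approx[(1, 0)] = approx[(1, 1)] = mid

# The per-entry error is bounded by half the spread of the merged rows.
bound = abs(cpt[(1, 0)] - cpt[(1, 1)]) / 2
error = max(abs(approx[key] - cpt[key]) for key in cpt)
assert error <= bound + 1e-12
```

The merged table needs fewer distinct entries while the worst-case error in any merged context stays within the stated bound.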
Learning, Bayesian Probability, Graphical Models, and Abduction
 In Abduction and Induction: Essays on their Relation and Integration, Chapter 10, 1998
Abstract

Cited by 12 (0 self)
In this chapter I review Bayesian statistics as used for induction and relate it to logic-based abduction. Much reasoning under uncertainty, including induction, is based on Bayes' rule. Bayes' rule is interesting precisely because it provides a mechanism for abduction. I review work of Buntine that argues that much of the work on Bayesian learning can be best viewed in terms of graphical models such as Bayesian networks, and review previous work of Poole that relates Bayesian networks to logic-based abduction. This lets us see how much of the work on induction can be viewed in terms of logic-based abduction. I then explore what this means for extending logic-based abduction to richer representations, such as learning decision trees with probabilities at the leaves. Much of this paper is tutorial in nature; both the probabilistic and logic-based notions of abduction and induction are introduced and motivated.
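The central observation, that Bayes' rule mechanizes abduction by weighing hypotheses that would explain the evidence, fits in a few lines (the hypotheses and numbers here are made up for illustration):

```python
# Hypothetical hypotheses with priors, and likelihoods P(evidence | h).
priors = {"h1": 0.3, "h2": 0.7}
likelihood = {"h1": 0.9, "h2": 0.2}

# Bayes' rule: posterior(h) is proportional to P(evidence | h) * P(h).
unnormalized = {h: priors[h] * likelihood[h] for h in priors}
z = sum(unnormalized.values())
posterior = {h: unnormalized[h] / z for h in unnormalized}

# Abduction picks out the hypothesis that best explains the evidence;
# here h1's high likelihood outweighs its lower prior.
best = max(posterior, key=posterior.get)
```

Inverting from evidence back to the hypotheses that would produce it is exactly the abductive step the chapter relates to logic-based abduction.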
Logical argumentation, abduction and Bayesian decision theory: a Bayesian approach to logical arguments and its application to legal evidential reasoning
 Cardozo Law Review
Abstract

Cited by 11 (0 self)
There are good normative arguments for using Bayesian decision theory for deciding what to do. However, there are also good arguments for using logic, where we want to have a formal semantics for a language and use the structure of logical argumentation with logical variables to represent multiple individuals (things). This paper shows how decision theory and logical argumentation can be combined into a coherent framework. The Independent Choice Logic can be viewed as a first-order representation of belief networks with conditional probability tables represented as first-order rules, or as an abductive/argument-based logic with probabilities over assumables. Intuitively, we can use logic to model causally (in terms of logic programs with assumables). Given evidence, we abduce to the explanations, and then can predict what follows from these explanations. As well as abduction to the best explanation(s), from which we can bound probabilities, we can also do marginalization to reduce the detail of arguments. An example of Tillers' is used to show how the framework could be used for legal reasoning. The code to run this example is available from the author's web site.
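The claim that abduction to the best explanation(s) lets us bound probabilities can be sketched on a toy diagnosis theory (the assumables, rules, and numbers are made up for illustration): any single explanation's prior is a lower bound on the query's probability, which exact inference over all worlds confirms.

```python
from itertools import product

# Hypothetical independent assumables with prior probabilities.
p_assume = {"flu": 0.1, "cold": 0.2}

def entails_fever(world):
    """Stand-in for the causal rules  fever <- flu  and  fever <- cold."""
    return world["flu"] or world["cold"]

def exact_prob_fever():
    """Exact P(fever): sum over all truth assignments to the assumables."""
    total = 0.0
    names = list(p_assume)
    for vals in product([True, False], repeat=len(names)):
        world = dict(zip(names, vals))
        weight = 1.0
        for name, v in world.items():
            weight *= p_assume[name] if v else 1 - p_assume[name]
        if entails_fever(world):
            total += weight
    return total

# {flu} and {cold} are both explanations of fever; abducing only the
# most probable one already yields a lower bound on P(fever).
best_explanation_prob = max(p_assume.values())
assert best_explanation_prob <= exact_prob_fever()
```

Collecting more explanations tightens the bound; with all of them (and inclusion-exclusion for overlapping worlds) the exact probability is recovered.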