Results 11 - 20 of 33
A Probabilistic Logic of Qualitative Time
Reasoning With Different Time Granularities in Industrial Applications: A Case Study Using CP-logic
TP-Compilation for Inference in Probabilistic Logic Programs
Abstract
We propose TP-compilation, a new inference technique for probabilistic logic programs that is based on forward reasoning. TP-compilation proceeds incrementally in that it interleaves the knowledge compilation step for weighted model counting with forward reasoning on the logic program. This leads to a novel anytime algorithm that provides hard bounds on the inferred probabilities. The main difference from existing inference techniques for probabilistic logic programs is that those proceed as a sequence of isolated transformations: typically, conversion of the ground program into an equivalent propositional formula, followed by compilation of this formula into a more tractable target representation for weighted model counting. An empirical evaluation shows that TP-compilation effectively handles larger instances of complex or cyclic real-world problems than current sequential approaches, both for exact and anytime approximate inference. Furthermore, we show that TP-compilation is conducive to inference in dynamic domains, as it supports efficient updates to the compiled model.
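The forward-reasoning step this abstract refers to is the classical TP (immediate consequence) operator. A minimal sketch of that operator alone, for a definite propositional program, might look as follows; the (head, body) pair encoding is an assumption made for this sketch, and the paper's actual contribution (interleaving this loop with knowledge compilation) is not reproduced here:

```python
def tp_fixpoint(program):
    """Iterate the immediate consequence operator T_P from the empty
    interpretation until a fixpoint is reached.

    `program` is a list of (head, body) pairs, where `body` is a list of
    atoms that must all be true for `head` to be derived; facts have an
    empty body.
    """
    interpretation = set()
    while True:
        derived = {head for head, body in program
                   if all(atom in interpretation for atom in body)}
        if derived == interpretation:       # least fixpoint reached
            return interpretation
        interpretation = derived

# A small example program: two edge facts and two derived path atoms.
program = [
    ("edge_ab", []),                # fact
    ("edge_bc", []),                # fact
    ("path_ab", ["edge_ab"]),
    ("path_ac", ["path_ab", "edge_bc"]),
]
```

Starting from the empty set, each pass derives every head whose body atoms are already known, so the example reaches its least fixpoint after three productive iterations.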
cplint and PITA
Abstract
programs [9, 10]. They share the following syntax for input programs.

1 Syntax

Disjunction in the head is represented with a semicolon and atoms in the head are separated from probabilities by a colon. For the rest, the usual syntax of Prolog is used. For example, the LPAD clause

h1:p1 ∨ ... ∨ hn:pn ← b1, ..., bm, ¬c1, ..., ¬cl

is represented by

h1:p1 ; ... ; hn:pn :- b1, ..., bm, \+ c1, ..., \+ cl

No parentheses are necessary. The pi are numeric expressions. It is up to the user to ensure that the numeric expressions are legal, i.e. that they sum up to less than one. If the clause has a single head with probability 1, the annotation can be omitted and the clause takes the form of a normal Prolog clause. The coin example of [12] is represented as

heads(Coin):1/2 ; tails(Coin):1/2 :- toss(Coin), \+ biased(Coin).
heads(Coin):0.6 ; tails(Coin):0.4 :- toss(Coin), biased(Coin).
fair(Coin):0.9 ; biased(Coin):0.1.
toss(coin).

2 cplint

cplint consists of three Prolog modules for answering queries using goal-oriented procedures.

lpadsld.pl: computes the probability of a query using the top-down procedure described in [5] and [6]. It is based on SLDNF resolution and is an adaptation of the interpreter for ProbLog [2].

lpad.pl: computes the probability of a query using a top-down procedure based on SLG resolution [1]. As a consequence, it works for any sound LPAD, i.e., any LPAD such that each of its instances has a two-valued well-founded model.

cpl.pl: computes the probability of a query using a top-down procedure based on SLG resolution and moreover checks that the CP-logic program is valid, i.e., that it has at least one execution model.

2.1 Installation

cplint is distributed in source code in the git version of Yap. It includes Prolog and C files. Download it by following the instruction in
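As a loose illustration of the selection semantics behind the coin example (each applicable annotated disjunctive clause independently chooses exactly one of its head atoms according to the annotations), one world of the program can be sampled in Python; the encoding and function name below are invented for this sketch and are not cplint's machinery:

```python
import random

def sample_world(seed=None):
    """Sample one possible world of the coin LPAD: every applicable
    annotated disjunctive clause independently selects a single head
    atom according to its probability annotations."""
    rng = random.Random(seed)
    world = {"toss(coin)"}                      # certain fact: toss(coin).
    # fair(Coin):0.9 ; biased(Coin):0.1.
    world.add("fair(coin)" if rng.random() < 0.9 else "biased(coin)")
    if "biased(coin)" in world:
        # heads(Coin):0.6 ; tails(Coin):0.4 :- toss(Coin), biased(Coin).
        world.add("heads(coin)" if rng.random() < 0.6 else "tails(coin)")
    else:
        # heads(Coin):1/2 ; tails(Coin):1/2 :- toss(Coin), \+ biased(Coin).
        world.add("heads(coin)" if rng.random() < 0.5 else "tails(coin)")
    return world

# By construction P(heads(coin)) = 0.9 * 0.5 + 0.1 * 0.6 = 0.51.
```

Averaging the truth of heads(coin) over many sampled worlds approaches 0.51, the probability a goal-oriented procedure would compute for this query.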
Inference for a New Probabilistic Constraint Logic (Proceedings of the Twenty-Third International Joint Conference on Artificial Intelligence)
Abstract
Probabilistic logics combine the expressive power of logic with the ability to reason with uncertainty. Several probabilistic logic languages have been proposed in the past, each with its own features. In this paper, we propose a new probabilistic constraint logic programming language, which combines constraint logic programming with probabilistic reasoning. The language supports modeling of discrete as well as continuous probability distributions by expressing constraints on random variables. We introduce the declarative semantics of this language, present an exact inference algorithm to derive bounds on the joint probability distributions consistent with the specified constraints, and give experimental results. The results obtained are encouraging, indicating that inference in our language is feasible for solving challenging problems.
EM over Binary Decision Diagrams for Probabilistic Logic Programs
Abstract
Recently, much work in Machine Learning has concentrated on representation languages able to combine aspects of logic and probability, leading to the birth of a whole field called Statistical Relational Learning. In this paper we present a technique for parameter learning targeted at a family of formalisms where uncertainty is represented using Logic Programming techniques, the so-called Probabilistic Logic Programs such as ICL, PRISM, ProbLog and LPADs. Since their equivalent Bayesian networks contain hidden variables, an EM algorithm is adopted. To speed up the computation, expectations are computed directly on the Binary Decision Diagrams that are built for inference. The resulting system, called EMBLEM for “EM over BDDs for probabilistic Logic programs Efficient Mining”, has been applied to a number of datasets and showed good performance both in terms of speed and memory usage.
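Since EMBLEM's core ingredient is EM in the presence of hidden variables, the E/M iteration can be sketched on a toy model: two coins of unknown bias, where the identity of the coin used in each session is hidden. The model and all names below are illustrative assumptions; this does not reproduce EMBLEM's computation of expectations on Binary Decision Diagrams:

```python
def em_two_coins(head_counts, n_tosses, theta_a=0.6, theta_b=0.5, iters=50):
    """Estimate the two coin biases by EM.

    `head_counts[i]` is the number of heads observed in session i of
    `n_tosses` tosses; which coin produced each session is hidden.
    """
    for _ in range(iters):
        num_a = den_a = num_b = den_b = 0.0
        for heads in head_counts:
            tails = n_tosses - heads
            like_a = theta_a ** heads * (1 - theta_a) ** tails
            like_b = theta_b ** heads * (1 - theta_b) ** tails
            resp_a = like_a / (like_a + like_b)   # E-step: P(coin A | session)
            resp_b = 1.0 - resp_a
            num_a += resp_a * heads
            den_a += resp_a * n_tosses
            num_b += resp_b * heads
            den_b += resp_b * n_tosses
        theta_a, theta_b = num_a / den_a, num_b / den_b   # M-step
    return theta_a, theta_b
```

On sessions [9, 8, 2, 1, 9] of ten tosses each, the estimates separate into a heads-biased and a tails-biased coin, even though no session is labeled with the coin that produced it.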
MCINTYRE: A Monte Carlo System for Probabilistic Logic Programming (IOS Press)
Abstract
Probabilistic Logic Programming is receiving increasing attention for its ability to model domains with complex and uncertain relations among entities. In this paper we concentrate on the problem of approximate inference in probabilistic logic programming languages based on the distribution semantics. A successful approximate approach is based on Monte Carlo sampling, which consists in verifying the truth of the query in a normal program sampled from the probabilistic program. The ProbLog system includes such an algorithm, and so does the cplint suite. In this paper we propose an approach for Monte Carlo inference that is based on a program transformation translating a probabilistic program into a normal program to which the query can be posed. The current sample is stored in the internal database of the Yap Prolog engine. The resulting system, called MCINTYRE for Monte Carlo INference wiTh Yap REcord, is evaluated on various problems: biological networks, artificial datasets and a hidden Markov model. MCINTYRE is compared with the Monte Carlo algorithms of ProbLog and cplint and with the exact inference of the PITA system. The results show that MCINTYRE is faster than the other Monte Carlo systems.
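The sampling scheme described here (check the query in a normal program sampled from the probabilistic one, repeat, and average) can be sketched generically. The probabilistic graph, the reachability query, and all names below are invented for illustration and do not reproduce MCINTYRE's program transformation or its use of Yap's internal database:

```python
import random

# Each edge is an independent probabilistic fact (a tiny "biological
# network"-style example: does a path from a to c exist?).
EDGES = {("a", "b"): 0.8, ("b", "c"): 0.7, ("a", "c"): 0.2}

def reachable(edges, src, dst):
    """Plain reachability in one sampled (non-probabilistic) edge set."""
    seen, stack = set(), [src]
    while stack:
        node = stack.pop()
        if node == dst:
            return True
        if node in seen:
            continue
        seen.add(node)
        stack.extend(v for (u, v) in edges if u == node)
    return False

def monte_carlo_prob(n_samples=100_000, seed=42):
    """Estimate P(reachable(a, c)): each trial samples one normal program
    (here, one concrete edge set) and checks the query in it."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        sampled = {edge for edge, p in EDGES.items() if rng.random() < p}
        hits += reachable(sampled, "a", "c")
    return hits / n_samples

# Exact value for comparison: 1 - (1 - 0.8 * 0.7) * (1 - 0.2) = 0.648.
```

With 100,000 samples the estimate lands close to the exact 0.648, illustrating why Monte Carlo approaches scale to queries for which exact weighted model counting is too expensive.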
Sequences of Mechanisms for Causal Reasoning in Artificial Intelligence (Proceedings of the Twenty-Third International Joint Conference on Artificial Intelligence)
Abstract
We present a new approach to token-level causal reasoning that we call Sequences Of Mechanisms (SoMs), which models causality as a dynamic sequence of active mechanisms that chain together to propagate causal influence through time. We motivate this approach with examples from AI and robotics and show why existing approaches are inadequate. We present an algorithm for causal reasoning based on SoMs, which takes as input a knowledge base of first-order mechanisms and a set of observations, and hypothesizes which mechanisms are active at what time. We show empirically that our algorithm produces plausible causal explanations of simulated observations generated from a causal model. We argue that the SoMs approach is qualitatively closer to the human causal reasoning process; for example, it includes only relevant variables in explanations. We present new insights about causal reasoning that become apparent with this view. One such insight is that observation and manipulation do not commute in causal models, a fact which we show to be a generalization of the Equilibration-Manipulation Commutability of [Dash(2005)].