Results 1–10 of 29
Lifted Aggregation in Directed First-order Probabilistic Models
Abstract

Cited by 24 (1 self)
As exact inference for first-order probabilistic graphical models at the propositional level can be formidably expensive, there is an ongoing effort to design efficient lifted inference algorithms for such models. This paper discusses directed first-order models that require an aggregation operator when a parent random variable is parameterized by logical variables that are not present in a child random variable. We introduce a new data structure, aggregation parfactors, to describe aggregation in directed first-order models. We show how to extend Milch et al.'s C-FOVE algorithm to perform lifted inference in the presence of aggregation parfactors. We also show that there are cases where the polynomial time complexity (in the domain size of logical variables) of the C-FOVE algorithm can be reduced to logarithmic time complexity using aggregation parfactors.
ILP turns 20: Biography and future challenges
 MACH LEARN
, 2011
Abstract

Cited by 13 (9 self)
Inductive Logic Programming (ILP) is an area of Machine Learning which has now reached its twentieth year. Using the analogy of a human biography, this paper recalls the development of the subject from its infancy through childhood and teenage years. We show how in each phase ILP has been characterised by an attempt to extend theory and implementations in tandem with the development of novel and challenging real-world applications. Lastly, by projection we suggest directions for research which will help the subject come of age.
Log-Linear Description Logics
 PROCEEDINGS OF THE TWENTY-SECOND INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE
, 2011
Abstract

Cited by 11 (7 self)
Log-linear description logics are a family of probabilistic logics integrating various concepts and methods from the areas of knowledge representation and reasoning and statistical relational AI. We define the syntax and semantics of log-linear description logics, describe a convenient representation as sets of first-order formulas, and discuss computational and algorithmic aspects of probabilistic queries in the language. The paper concludes with an experimental evaluation of an implementation of a log-linear DL reasoner.
Belief logic programming: Uncertainty reasoning with correlation of evidence
 In Intl. Conf. on Logic Programming and Nonmonotonic Reasoning (LPNMR)
, 2009
Abstract

Cited by 2 (2 self)
Belief Logic Programming (BLP) is a novel form of quantitative logic programming in the presence of uncertain and inconsistent information, which was designed to be able to combine and correlate evidence obtained from non-independent information sources. BLP has nonmonotonic semantics based on the concepts of belief combination functions and is inspired by the Dempster-Shafer theory of evidence. Most importantly, unlike previous efforts to integrate uncertainty and logic programming, BLP can correlate structural information contained in rules and provides more accurate certainty estimates. The results are illustrated via simple, yet realistic examples of rule-based Web service integration.
Query Answering in Belief Logic Programming
Abstract

Cited by 2 (2 self)
In this paper we introduce a fixpoint semantics for quantitative logic programming, which is able to both combine and correlate evidence from different sources of information. Based on this semantics, we develop efficient algorithms that can answer queries for non-ground programs with the help of an SLD-like procedure. We also analyze the computational complexity of the algorithms and illustrate their uses.
Structured Probabilistic Modelling for Dialogue Management
, 2014
Cited by 2 (1 self)
Belief Logic Programming and its Extensions
, 2009
Abstract

Cited by 1 (0 self)
Belief Logic Programming (BLP) is a novel form of quantitative logic programming in the presence of uncertain and inconsistent information, which was designed to be able to combine and correlate evidence obtained from non-independent information sources. BLP has nonmonotonic semantics based on the concepts of belief combination functions and the Dempster-Shafer theory of evidence. Most importantly, unlike previous efforts to integrate uncertainty and logic programming, BLP can correlate structural information contained in rules and provides more accurate certainty estimates. Declarative semantics is provided, as well as query evaluation algorithms. BLP is also extended to programs with cycles and to correlated base facts. The results are illustrated via simple, yet realistic examples of rule-based Web service integration.
Parameter learning in PRISM programs with continuous random variables. arXiv preprint arXiv:1203.4287
, 2012
Inference in probabilistic logic programs with continuous random variables
 Theory Pract. Log. Program.
Model Checking with Probabilistic Tabled Logic Programming
Abstract

Cited by 1 (0 self)
We present a formulation of the problem of probabilistic model checking as one of query evaluation over probabilistic logic programs. To the best of our knowledge, our formulation is the first of its kind, and it covers a rich class of probabilistic models and probabilistic temporal logics. The inference algorithms of existing probabilistic logic-programming systems are well defined only for queries with a finite number of explanations. This restriction prohibits the encoding of probabilistic model checkers, where explanations correspond to executions of the system being model checked. To overcome this restriction, we propose a more general inference algorithm that uses finite generative structures (similar to automata) to represent families of explanations. The inference algorithm computes the probability of a possibly infinite set of explanations directly from the finite generative structure. We have implemented our inference algorithm in XSB Prolog, and use this implementation to encode probabilistic model checkers for a variety of temporal logics, including PCTL and GPL (which subsumes PCTL∗). Our experimental results show that, despite the highly declarative nature of their encodings, the model checkers constructed in this manner are competitive with their native implementations.