Results 1 – 9 of 9
Expectation Maximization over Binary Decision Diagrams for Probabilistic Logic Programs
Abstract

Cited by 20 (13 self)
Recently much work in Machine Learning has concentrated on using expressive representation languages that combine aspects of logic and probability. A whole field has emerged, called Statistical Relational Learning, rich in successful applications in a variety of domains. In this paper we present a Machine Learning technique targeted to Probabilistic Logic Programs, a family of formalisms where uncertainty is represented using Logic Programming tools. Among various proposals for Probabilistic Logic Programming, the one based on the distribution semantics is gaining popularity and is the basis for languages such as ICL, PRISM, ProbLog, and Logic Programs with Annotated Disjunctions. This paper proposes a technique for learning the parameters of these languages. Since their equivalent Bayesian networks contain hidden variables, an Expectation Maximization (EM) algorithm is adopted. In order to speed up the computation, expectations are computed directly on the Binary Decision Diagrams that are built for inference. The resulting system, called EMBLEM for “EM over Bdds for probabilistic Logic programs Efficient Mining”, has been applied to a number of datasets and showed good performance both in terms of speed and memory usage. In particular, its speed allows the execution of a high number of restarts, resulting in good-quality solutions.
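The core idea the abstract describes — performing inference bottom-up on a Binary Decision Diagram whose internal nodes correspond to probabilistic facts — can be illustrated with a minimal sketch. This is not the EMBLEM implementation; the `Node` class and the tiny example diagram are hypothetical, and memoization of shared subdiagrams (essential for real BDDs) is omitted for brevity.

```python
# Hedged sketch: computing the probability of a query bottom-up on a BDD
# whose internal nodes are probabilistic facts. Not the actual EMBLEM code;
# Node, prob, and the example diagram are illustrative assumptions.

class Node:
    """A BDD node: a probabilistic fact with its high/low children."""
    def __init__(self, p, high, low):
        self.p = p        # probability that this fact is true
        self.high = high  # child followed when the fact is true
        self.low = low    # child followed when the fact is false

TRUE, FALSE = "1", "0"  # terminal nodes

def prob(node):
    """Backward pass: P(query) as a weighted sum over both branches."""
    if node == TRUE:
        return 1.0
    if node == FALSE:
        return 0.0
    return node.p * prob(node.high) + (1 - node.p) * prob(node.low)

# Tiny example: the query holds if fact a (p=0.4) is true, or else if
# fact b (p=0.3) is true.
b = Node(0.3, TRUE, FALSE)
a = Node(0.4, TRUE, b)
print(prob(a))  # 0.4 + 0.6*0.3 = 0.58
```

EMBLEM's E-step combines a backward pass like this one with a forward pass over the same diagram to obtain the expected counts of each probabilistic fact.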
Structure learning of probabilistic logic programs by searching the clause space
CoRR/arXiv:1309.2080, 2013
EM over Binary Decision Diagrams for Probabilistic Logic Programs
Abstract
Abstract. Recently much work in Machine Learning has concentrated on representation languages able to combine aspects of logic and probability, leading to the birth of a whole field called Statistical Relational Learning. In this paper we present a technique for parameter learning targeted to a family of formalisms where uncertainty is represented using Logic Programming techniques, the so-called Probabilistic Logic Programs, such as ICL, PRISM, ProbLog, and LPAD. Since their equivalent Bayesian networks contain hidden variables, an EM algorithm is adopted. In order to speed up the computation, expectations are computed directly on the Binary Decision Diagrams that are built for inference. The resulting system, called EMBLEM for “EM over Bdds for probabilistic Logic programs Efficient Mining”, has been applied to a number of datasets and showed good performance both in terms of speed and memory usage.
Trading Expressivity for Efficiency in Statistical Relational Learning
Abstract
Statistical Relational Learning (SRL) is concerned with building statistical models for relational data. While SRL approaches have shown much potential in complex real-world application domains, their computational complexity remains a major issue and often limits their practical applicability. This thesis is concerned with relatively simple yet efficient SRL techniques. We show how expressivity and generality can be traded for efficiency by restricting model complexity and developing special-purpose inference and learning algorithms that take advantage of such restrictions, as well as by tailoring models to specific application domains.
Programs from Interpretations
, 2010
Abstract
ProbLog is a recently introduced probabilistic extension of the logic programming language Prolog, in which facts can be annotated with the probability that they hold. The advantage of this probabilistic language is that it naturally expresses a generative process over interpretations using a declarative model. Interpretations are relational descriptions, or possible worlds. In this paper, a novel parameter estimation algorithm, CoPrEM, for learning ProbLog programs from partial interpretations is introduced. The algorithm is essentially a Soft-EM algorithm that computes a binary decision diagram for each interpretation, allowing a dynamic programming approach to be implemented. The CoPrEM algorithm has been experimentally evaluated on a number of data sets, which justify the approach and show its effectiveness.
Integrating Planning, Execution and Learning to Improve Plan Execution
Abstract
Algorithms for planning under uncertainty require accurate action models that explicitly capture the uncertainty of the environment. Unfortunately, obtaining these models is usually complex. In environments with uncertainty, actions may produce countless outcomes and hence, specifying them and their probabilities is a hard task. As a consequence, when implementing agents with planning capabilities, practitioners frequently opt for architectures that interleave classical planning and execution monitoring following a replanning-when-failure paradigm. Though this approach is more practical, it may produce fragile plans that need continuous replanning episodes or, even worse, that result in execution dead-ends. In this paper, we propose a new architecture to relieve these shortcomings. The architecture is based on the integration of a relational learning component with the traditional planning and execution monitoring components. The new component allows the architecture to learn probabilistic rules of the success of actions from the execution of plans and to automatically upgrade the planning model with these rules. The upgraded models can be used by any classical planner that handles metric functions or, alternatively, by any probabilistic planner. This architecture is designed to integrate off-the-shelf interchangeable planning and learning components, so it can profit from the latest advances in both fields without modifying the architecture.
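The learning step the abstract describes — estimating how likely each action is to succeed from execution traces — can be reduced to a simple counting sketch. This is a hedged illustration, not the paper's relational learner: the trace format, function name, and Laplace-style prior are assumptions made here for the example.

```python
# Hedged sketch of estimating per-action success probabilities from
# execution traces, with Laplace smoothing so rarely seen actions do not
# get extreme estimates. The trace format is a hypothetical simplification
# of the relational rules the paper actually learns.

from collections import Counter

def learn_success_rates(traces, prior=1.0):
    """traces: iterable of (action_name, succeeded: bool) pairs."""
    attempts, successes = Counter(), Counter()
    for action, ok in traces:
        attempts[action] += 1
        if ok:
            successes[action] += 1
    # Laplace-smoothed estimate: (s + prior) / (n + 2 * prior)
    return {a: (successes[a] + prior) / (attempts[a] + 2 * prior)
            for a in attempts}

traces = [("move", True), ("move", True), ("move", False), ("grasp", True)]
rates = learn_success_rates(traces)
print(rates["move"])   # (2 + 1) / (3 + 2) = 0.6
```

Such estimates could then annotate actions as metric costs for a classical planner or as outcome probabilities for a probabilistic one, matching the two usage modes the abstract mentions.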
AciForager: Incrementally Discovering Regions of Correlated Change in Evolving Graphs
"... components, fault detection ..."
Preface (Barcelona, Spain)
, 2010
Abstract
Many real-world data collections and information structures are dynamic and rich in information. Representing them as complex networks, meshes of interconnected information entities, allows one to get the most out of them. Analysing complex networks with respect to non-trivial aspects such as information propagation, evolution, or community discovery, and discovering or even creating new connections, requires powerful tools supporting visualisation, interaction, and mining of and with the underlying data. While many graph mining tools have been developed, there is a lack of techniques able to tackle real-world networks, which are typically multidimensional, heterogeneous, and/or dynamic. Furthermore, sophisticated tools supporting interaction with huge networks are virtually nonexistent. Besides refined mining and aggregation methods, there is a need for modelling and sophisticated visualisation methods which take into account the limited amount of information a user can process. The aim of the workshop is to bring together pioneering researchers focussing on the analysis of complex networks, and thus to intensify the exchange of ideas between different research communities and foster the development of tools for the creation, analysis, and visualisation of complex networks. The workshop focusses especially on researchers working on methods for mining, learning, and analysis of (dynamic) networks, the representation and semantics of complex networks, and visualisation methods as well as user interface design. Out of the submitted papers, we have selected two extended abstracts and eight full papers. Each of these papers deals with some aspect of the analysis of complex networks. All the papers and presentations will be available on the workshop website.
Learning Probabilistic Relational Models from Sequential Video Data with Applications in Tabletop and Card Games
Abstract
Being able to understand complex dynamic scenes of real-world activities from low-level sensor data is of central importance for truly intelligent systems. The main difficulty lies in the fact that complex scenes are best described in high-level, logical formalisms, whereas sensor data – for example, images derived from a video camera – usually consist of many low-level, numerical, and presumably noisy feature values. In this work, we consider the problem of learning high-level, logical descriptions of dynamic scenes based on only the input video stream. As an example, consider a surveillance camera in a metro station. Whereas the video data will consist of many noisy images, a high-level ...