Results 11–20 of 45
Ontology-based reasoning techniques for multimedia interpretation and retrieval
, 2007
"... ..."
(Show Context)
K.: Parameter learning in probabilistic databases: A least squares approach
, 2008
"... Abstract. We introduce the problem of learning the parameters of the probabilistic database ProbLog. Given the observed success probabilities of a set of queries, we compute the probabilities attached to facts that have a low approximation error on the training examples as well as on unseen examples ..."
Abstract

Cited by 22 (7 self)
 Add to MetaCart
(Show Context)
Abstract. We introduce the problem of learning the parameters of the probabilistic database ProbLog. Given the observed success probabilities of a set of queries, we compute the probabilities attached to facts that have a low approximation error on the training examples as well as on unseen examples. Assuming Gaussian error terms on the observed success probabilities, this naturally leads to a least squares optimization problem. Our approach, called LeProbLog, is able to learn both from queries and from proofs, and even from both simultaneously. This makes it flexible and allows faster training in domains where proofs are available. Experiments on real-world data show the usefulness and effectiveness of this least squares calibration of probabilistic databases.
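The least-squares idea in the abstract can be sketched in miniature: take a toy program whose single query succeeds if either of two independent probabilistic facts holds, and fit the fact probabilities by gradient descent on the squared error between the predicted and observed success probability. This is a hypothetical Python illustration of the idea only, not LeProbLog's actual implementation (which operates on ProbLog programs and uses BDD-based inference); the program, target value, and learning rate are invented for the example.

```python
# Toy sketch of least-squares parameter calibration (LeProbLog-style idea):
# fit the probabilities of two independent probabilistic facts so that the
# predicted success probability of one query matches an observed value.

def query_prob(p1, p2):
    # The query succeeds if at least one of the two facts holds.
    return 1.0 - (1.0 - p1) * (1.0 - p2)

def fit(target, p1=0.5, p2=0.5, lr=0.5, steps=2000):
    for _ in range(steps):
        err = query_prob(p1, p2) - target
        # Analytic gradients of the squared error 0.5 * err**2.
        p1 -= lr * err * (1.0 - p2)
        p2 -= lr * err * (1.0 - p1)
        # Keep the parameters valid probabilities.
        p1 = min(max(p1, 0.0), 1.0)
        p2 = min(max(p2, 0.0), 1.0)
    return p1, p2

p1, p2 = fit(0.7)
print(round(query_prob(p1, p2), 3))  # → 0.7
```

With several observed queries, the same loop would sum the gradients of the per-query squared errors, which is the least-squares objective the abstract describes.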
Expectation Maximization over Binary Decision Diagrams for Probabilistic Logic Programs
"... Recently much work in Machine Learning has concentrated on using expressive representation languages that combine aspects of logic and probability. A whole field has emerged, called Statistical Relational Learning, rich of successful applications in a variety of domains. In this paper we present a M ..."
Abstract

Cited by 17 (10 self)
 Add to MetaCart
(Show Context)
Recently, much work in Machine Learning has concentrated on using expressive representation languages that combine aspects of logic and probability. A whole field has emerged, called Statistical Relational Learning, rich in successful applications in a variety of domains. In this paper we present a Machine Learning technique targeted at Probabilistic Logic Programs, a family of formalisms where uncertainty is represented using Logic Programming tools. Among the various proposals for Probabilistic Logic Programming, the one based on the distribution semantics is gaining popularity and is the basis for languages such as ICL, PRISM, ProbLog and Logic Programs with Annotated Disjunctions. This paper proposes a technique for learning the parameters of these languages. Since their equivalent Bayesian networks contain hidden variables, an Expectation Maximization (EM) algorithm is adopted. In order to speed up the computation, expectations are computed directly on the Binary Decision Diagrams that are built for inference. The resulting system, called EMBLEM for “EM over BDDs for probabilistic Logic programs Efficient Mining”, has been applied to a number of datasets and showed good performance in terms of both speed and memory usage. In particular, its speed allows the execution of a high number of restarts, resulting in good-quality solutions.
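The distribution semantics the abstract refers to can be illustrated with a minimal sketch: probabilistic facts define independent binary choices, and a query's probability is the total probability of the worlds (truth assignments) in which it succeeds. The following hypothetical Python toy enumerates all worlds explicitly; the fact names and probabilities are invented, and real systems such as ProbLog avoid this exponential enumeration precisely by compiling inference to Binary Decision Diagrams.

```python
from itertools import product

# Two independent probabilistic facts, as in a tiny ProbLog-like program:
#   0.1::burglary.  0.2::earthquake.
facts = {"burglary": 0.1, "earthquake": 0.2}

def alarm(world):
    # Deterministic rules: alarm :- burglary.  alarm :- earthquake.
    return world["burglary"] or world["earthquake"]

def success_prob(query):
    # Sum the probabilities of all worlds in which the query succeeds.
    total = 0.0
    for bits in product([True, False], repeat=len(facts)):
        world = dict(zip(facts, bits))
        weight = 1.0
        for fact, p in facts.items():
            weight *= p if world[fact] else 1.0 - p
        if query(world):
            total += weight
    return total

print(round(success_prob(alarm), 2))  # → 0.28, i.e. 1 - 0.9 * 0.8
```

The learning problem the paper addresses is the inverse of this computation: the world weights are unknown and must be estimated from observed queries, which is why EM over the hidden choices is needed.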
Exploiting the Rule Structure for Decision Making within the Independent Choice Logic
, 1995
"... This paper introduces the independent choice logic, and in particular the "single agent with nature " instance of the independent choice logic, namely ICL DT . This is a logical framework for decision making uncertainty that extends both logic programming and stochastic models such as infl ..."
Abstract

Cited by 14 (4 self)
 Add to MetaCart
This paper introduces the independent choice logic, and in particular the "single agent with nature" instance of the independent choice logic, namely ICL_DT. This is a logical framework for decision making under uncertainty that extends both logic programming and stochastic models such as influence diagrams. This paper shows how the representation of a decision problem within the independent choice logic can be exploited to cut down the combinatorics of dynamic programming. One of the main problems with influence diagram evaluation techniques is the need to optimise a decision for all values of the 'parents' of a decision variable. In this paper we show how the rule-based nature of ICL_DT can be exploited so that we only make distinctions in the values of the information available for a decision that will make a difference to utility.
Logical argumentation, abduction and Bayesian decision theory: a Bayesian approach to logical arguments and its application to legal evidential reasoning
 Cardozo Law Review
"... There are good normative arguments for using Bayesian decision theory for deciding what to do. However, there are also good arguments for using logic, where we want have a formal semantics for a language and use the structure of logical argumentation with logical variables to represent multiple indi ..."
Abstract

Cited by 11 (0 self)
 Add to MetaCart
(Show Context)
There are good normative arguments for using Bayesian decision theory for deciding what to do. However, there are also good arguments for using logic, where we want to have a formal semantics for a language and use the structure of logical argumentation, with logical variables to represent multiple individuals (things). This paper shows how decision theory and logical argumentation can be combined into a coherent framework. The Independent Choice Logic can be viewed as a first-order representation of belief networks with conditional probability tables represented as first-order rules, or as an abductive/argument-based logic with probabilities over assumables. Intuitively, we can use logic to model causally (in terms of logic programs with assumables). Given evidence, we abduce the explanations, and then can predict what follows from these explanations. As well as abduction to the best explanation(s), from which we can bound probabilities, we can also do marginalization to reduce the detail of arguments. An example due to Tillers is used to show how the framework could be used for legal reasoning. The code to run this example is available from the author's web site.
F.: Learning the structure of probabilistic logic programs
 ILP 2011. LNCS
, 2012
"... Abstract. There is a growing interest in the field of Probabilistic Inductive Logic Programming, which uses languages that integrate logic programming and probability. Many of these languages are based on the distribution semantics and recently various authors have proposed systems for learning the ..."
Abstract

Cited by 8 (4 self)
 Add to MetaCart
(Show Context)
Abstract. There is a growing interest in the field of Probabilistic Inductive Logic Programming, which uses languages that integrate logic programming and probability. Many of these languages are based on the distribution semantics, and recently various authors have proposed systems for learning the parameters (PRISM, LeProbLog, LFI-ProbLog and EMBLEM) or both the structure and the parameters (SEM-CP-logic) of these languages. EMBLEM, for example, uses an Expectation Maximization approach in which the expectations are computed on Binary Decision Diagrams. In this paper we present the algorithm SLIPCASE, for “Structure LearnIng of ProbabilistiC logic progrAmS with Em over bdds”. It performs a beam search in the space of the language of Logic Programs with Annotated Disjunctions (LPADs), using the log likelihood of the data as the guiding heuristic. To estimate the log likelihood of theory refinements it performs a limited number of Expectation Maximization iterations of EMBLEM. SLIPCASE has been tested on three real-world datasets and compared with SEM-CP-logic and Learning using Structural Motifs, an algorithm for Markov Logic Networks. The results show that SLIPCASE achieves higher areas under the precision-recall and ROC curves and is more scalable.
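The beam-search loop underlying structure learners of this kind can be sketched generically: keep the k best theories by score, expand each with candidate refinements, and repeat. The sketch below is hypothetical Python, not SLIPCASE itself; `refine` and `score` are placeholders (in SLIPCASE the refinements act on LPAD clauses and the score is the log likelihood estimated by a few EMBLEM iterations), and the toy usage at the bottom is invented.

```python
# Generic beam search over theories, scored by a user-supplied heuristic.

def beam_search(initial, refine, score, beam_width=3, iterations=5):
    beam = [(score(initial), initial)]
    best = beam[0]
    for _ in range(iterations):
        candidates = []
        for _, theory in beam:
            for refined in refine(theory):
                candidates.append((score(refined), refined))
        if not candidates:
            break
        # Keep only the beam_width highest-scoring refinements.
        candidates.sort(key=lambda pair: pair[0], reverse=True)
        beam = candidates[:beam_width]
        if beam[0][0] > best[0]:
            best = beam[0]
    return best

# Toy usage: "theories" are integers, refinements add 1 or 2,
# and the score peaks at 7.
score = lambda t: -(t - 7) ** 2
refine = lambda t: [t + 1, t + 2]
print(beam_search(0, refine, score))  # → (0, 7)
```

The beam width trades search breadth against the cost of scoring, which matters here because each score evaluation runs EM iterations.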
Logic, Knowledge Representation and Bayesian Decision Theory
 In Proceedings CL2000, vol. 1861 of LNCS
, 2000
"... In this paper I give a brief overview of recent work on uncertainty in AI, and relate it to logical representations. Bayesian decision theory and logic are both normative frameworks for reasoning that emphasize different aspects of intelligent reasoning. Belief networks (Bayesian networks) are re ..."
Abstract

Cited by 6 (0 self)
 Add to MetaCart
In this paper I give a brief overview of recent work on uncertainty in AI and relate it to logical representations. Bayesian decision theory and logic are both normative frameworks for reasoning that emphasize different aspects of intelligent reasoning. Belief networks (Bayesian networks) are representations of independence that form the basis for understanding much of the recent work on reasoning under uncertainty, evidential and causal reasoning, decision analysis, dynamical systems, optimal control, reinforcement learning and Bayesian learning. The independent choice logic provides a bridge between logical representations and belief networks that lets us understand these other representations and their relationship to logic, and shows how they can be extended to first-order rule-based representations.
Distributed Medical Diagnosis with Abductive Logic Agents
"... We describe the application of a multiagent system for the distributed diagnosis of infections within an hospital. Diagnosing infections within an hospital is a complex task that may require to collect data (e.g. analysis results, details on patient's clinical history, diagnosis hypotheses) fr ..."
Abstract

Cited by 4 (0 self)
 Add to MetaCart
We describe the application of a multi-agent system for the distributed diagnosis of infections within a hospital. Diagnosing infections within a hospital is a complex task that may require collecting data (e.g. analysis results, details of a patient's clinical history, diagnosis hypotheses) from several information sources (such as, for example, analysis laboratories and hospital wards). These sources act autonomously and often have only partial knowledge about patients' health, their clinical history, and medical information in general. As a natural consequence, this may lead a single entity (e.g., a specialist) to formulate an incorrect diagnosis. In such a context, to obtain a correct diagnosis on the basis of information coming from different sources, a coordination mechanism is needed for integrating the collected data into a final diagnosis which should be compatible both with the patient's anamnesis and with other knowledge (possibly distributed over the system) related to the clinical case. In the paper we address this problem by using abduction, a reasoning mechanism for formulating hypotheses in the case of incomplete knowledge, suitably extended to a multi-agent setting. In particular, we first apply ALIAS abductive agents to distributed diagnosis and show how the coordination mechanisms provided by that system are well suited to composing several (possibly partial) diagnoses into a final response that is consistent with the knowledge of the involved agents (i.e., hospital entities or specialist doctors). In the second part of the paper, we extend the basic ALIAS coordination mechanisms towards probabilistic abduction. In this way, several (possibly partial) diagnoses obtained by probabilistic abductive reasoning can be merged into a final set of abductive diagnoses, ...
Integrating by Separating: Combining Probability and Logic with ICL, PRISM and SLPs
, 2005
"... This report describes the close relationship that obtains between the ICL, PRISM and SLP frameworks. The common feature of these frameworks is that a purely probabilistic component and a purely logical component are connected to produce a hybrid model. A hidden Markov model (HMM) is used as a runnin ..."
Abstract

Cited by 4 (0 self)
 Add to MetaCart
(Show Context)
This report describes the close relationship that obtains between the ICL, PRISM and SLP frameworks. The common feature of these frameworks is that a purely probabilistic component and a purely logical component are connected to produce a hybrid model. A hidden Markov model (HMM) is used as a running example. The Uniqueness Condition, which allows these frameworks to represent statistical models, is discussed, and the consequences of using a weakened version of the Uniqueness Condition are briefly explored. 'Lazy' sampling, based on SLD-resolution, is discussed.
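The running example can be made concrete with a small sketch: an HMM written as a sequence of independent probabilistic choices, where each choice is drawn only when the derivation reaches it, in the spirit of the lazy sampling the report discusses. This is hypothetical Python, not ICL, PRISM or SLP notation; the two states, the transition table, and the emission table are invented for illustration.

```python
import random

# A two-state HMM as tables of probabilistic choices.
TRANS = {"rainy": {"rainy": 0.7, "sunny": 0.3},
         "sunny": {"rainy": 0.4, "sunny": 0.6}}
EMIT = {"rainy": {"umbrella": 0.9, "none": 0.1},
        "sunny": {"umbrella": 0.2, "none": 0.8}}

def choose(dist, rng):
    # Draw one value from a discrete distribution {value: probability}.
    r, acc = rng.random(), 0.0
    for value, p in dist.items():
        acc += p
        if r < acc:
            return value
    return value  # guard against floating-point rounding

def sample_hmm(length, start="rainy", seed=0):
    # Each probabilistic choice is made only when this loop reaches it.
    rng = random.Random(seed)
    state, observations = start, []
    for _ in range(length):
        observations.append(choose(EMIT[state], rng))
        state = choose(TRANS[state], rng)
    return observations

obs = sample_hmm(5)
print(len(obs), all(o in {"umbrella", "none"} for o in obs))
```

In the frameworks the report compares, the probabilistic component supplies the `choose` steps and the logical component supplies the control structure of the loop.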
Aggregates for Constraint Handling Rules
 In Djelloul, Duck et
, 2007
"... Abstract. We extend the Constraint Handling Rules language with aggregates such as sum, count, findall, and min. The proposed extension features nested aggregate expressions over guarded conjunctions of constraints, a series of predefined aggregates, and applicationtailored userdefined aggregates. ..."
Abstract

Cited by 3 (2 self)
 Add to MetaCart
Abstract. We extend the Constraint Handling Rules language with aggregates such as sum, count, findall, and min. The proposed extension features nested aggregate expressions over guarded conjunctions of constraints, a series of predefined aggregates, and application-tailored user-defined aggregates. We formally define the operational semantics of aggregates, and show how incremental aggregate computation facilitates efficient implementations. Case studies demonstrate that language support for aggregates significantly reduces program size, thus improving readability and maintainability considerably.