Results 1 – 10 of 13
Answering Queries from Context-Sensitive Probabilistic Knowledge Bases
Theoretical Computer Science, 1996
"... We define a language for representing contextsensitive probabilistic knowledge. A knowledge base consists of a set of universally quantified probability sentences that include context constraints, which allow inference to be focused on only the relevant portions of the probabilistic knowledge. We p ..."
Abstract

Cited by 93 (0 self)
We define a language for representing context-sensitive probabilistic knowledge. A knowledge base consists of a set of universally quantified probability sentences that include context constraints, which allow inference to be focused on only the relevant portions of the probabilistic knowledge. We provide a declarative semantics for our language. We present a query-answering procedure that takes a query Q and a set of evidence E and constructs a Bayesian network to compute P(Q | E). The posterior probability is then computed using any of a number of Bayesian network inference algorithms. We use the declarative semantics to prove the query procedure sound and complete. We use concepts from logic programming to justify our approach. Keywords: reasoning under uncertainty, Bayesian networks, probability model construction, logic programming. Submitted to Theoretical Computer Science special issue on Uncertainty in Databases and Deductive Systems. This work was partially supported by NSF g...
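The role of context constraints in focusing inference can be sketched in a few lines of Python. The sentence representation, predicate names, and probabilities below are illustrative assumptions, not the paper's actual syntax:

```python
# Hypothetical context-constrained probability sentences: each entry pairs a
# context constraint (a dict of required context values) with a conditional
# probability spec (head, parents, probability). All names are made up.
SENTENCES = [
    ({"domain": "medical"}, ("flu", ["fever"], 0.6)),
    ({"domain": "medical"}, ("fever", [], 0.1)),
    ({"domain": "automotive"}, ("flat_tire", [], 0.05)),
]

def relevant_sentences(context):
    """Keep only sentences whose context constraints hold in the given
    context, so inference touches just the relevant part of the KB."""
    return [spec for ctx, spec in SENTENCES
            if all(context.get(k) == v for k, v in ctx.items())]

print(relevant_sentences({"domain": "medical"}))
# only the two medical sentences survive; the automotive one is filtered out
```

Network construction then proceeds from the filtered sentence set rather than the whole knowledge base.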
Logic programs with annotated disjunctions
 In Proc. Int'l Conf. on Logic Programming, 2004
"... Abstract. Current literature offers a number of different approaches to what could generally be called "probabilistic logic programming". These are usually based on Horn clauses. Here, we introduce a new formalism, Logic Programs with Annotated Disjunctions, based on disjunctive logic prog ..."
Abstract

Cited by 58 (5 self)
Current literature offers a number of different approaches to what could generally be called "probabilistic logic programming". These are usually based on Horn clauses. Here, we introduce a new formalism, Logic Programs with Annotated Disjunctions, based on disjunctive logic programs. In this formalism, each of the disjuncts in the head of a clause is annotated with a probability. Viewing such a set of probabilistic disjunctive clauses as a probabilistic disjunction of normal logic programs allows us to derive a possible world semantics, more precisely, a probability distribution on the set of all Herbrand interpretations. We demonstrate the strength of this formalism by some examples and compare it to related work.
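The possible-world semantics described here can be illustrated with a tiny Python sketch. The two annotated-disjunction facts and their probabilities are invented for the example; real LPADs also allow clause bodies and variables, which this sketch omits:

```python
from itertools import product

# Two annotated-disjunction facts (illustrative, ground, body-free):
#   heads:0.5 ; tails:0.5.      -- a fair coin
#   red:0.3 ; green:0.7.        -- a biased pick
CLAUSES = [
    [("heads", 0.5), ("tails", 0.5)],
    [("red", 0.3), ("green", 0.7)],
]

def query_probability(atom):
    """Sum the probabilities of all selections (one disjunct chosen per
    clause) whose chosen atoms include `atom` -- each selection is a
    possible world, weighted by the product of its annotations."""
    total = 0.0
    for selection in product(*CLAUSES):
        weight = 1.0
        atoms = set()
        for head, p in selection:
            weight *= p
            atoms.add(head)
        if atom in atoms:
            total += weight
    return total

print(query_probability("heads"))  # roughly 0.5, up to floating point
```

Enumerating all selections is exponential in the number of clauses; it is shown here only to make the distribution over worlds concrete.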
Generating Bayesian Networks from Probability Logic Knowledge Bases
 In Proceedings of the Tenth Conference on Uncertainty in Artificial Intelligence, 1994
"... We present a method for dynamically generating Bayesian networks from knowledge bases consisting of firstorder probability logic sentences. We present a subset of probability logic sufficient for representing the class of Bayesian networks with discretevalued nodes. We impose constraints on the fo ..."
Abstract

Cited by 52 (8 self)
We present a method for dynamically generating Bayesian networks from knowledge bases consisting of first-order probability logic sentences. We present a subset of probability logic sufficient for representing the class of Bayesian networks with discrete-valued nodes. We impose constraints on the form of the sentences that guarantee that the knowledge base contains all the probabilistic information necessary to generate a network. We define the concept of d-separation for knowledge bases and prove that a knowledge base with independence conditions defined by d-separation is a complete specification of a probability distribution. We present a network generation algorithm that, given an inference problem in the form of a query Q and a set of evidence E, generates a network to compute P(Q | E). We prove the algorithm to be correct.
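The core of query-driven network generation can be sketched as an ancestor chase: to compute P(Q | E) one only needs the ancestors of Q and E. The parent sets below are a made-up example, and the simple backward chaining is our illustration of the idea, not the paper's algorithm:

```python
# Illustrative dependency structure (the classic burglary-alarm story).
PARENTS = {
    "burglary": [],
    "earthquake": [],
    "alarm": ["burglary", "earthquake"],
    "call": ["alarm"],
    "report": ["earthquake"],
}

def construct_network(query, evidence):
    """Backward-chain from the query and evidence nodes, pulling in all
    their ancestors; nodes outside this set are irrelevant to P(Q | E)."""
    needed, stack = set(), [query, *evidence]
    while stack:
        v = stack.pop()
        if v not in needed:
            needed.add(v)
            stack.extend(PARENTS[v])
    return needed

print(sorted(construct_network("call", [])))
# the barren node 'report' is never pulled in when there is no evidence on it
```

Adding "report" as evidence would drag it (and nothing new besides) into the constructed network.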
Probabilistic Logic Learning
 ACM SIGKDD Explorations: Special Issue on Multi-Relational Data Mining, 2004
"... The past few years have witnessed an significant interest in probabilistic logic learning, i.e. in research lying at the intersection of probabilistic reasoning, logical representations, and machine learning. A rich variety of di#erent formalisms and learning techniques have been developed. This pap ..."
Abstract

Cited by 34 (8 self)
The past few years have witnessed a significant interest in probabilistic logic learning, i.e., in research lying at the intersection of probabilistic reasoning, logical representations, and machine learning. A rich variety of different formalisms and learning techniques have been developed. This paper provides an introductory survey and overview of the state-of-the-art in probabilistic logic learning through the identification of a number of important probabilistic, logical and learning concepts.
Probabilistic Logic Programming and Bayesian Networks
 In Asian Computing Science Conference, 1995
"... We present a probabilistic logic programming framework that allows the representation of conditional probabilities. While conditional probabilities are the most commonly used method for representing uncertainty in probabilistic expert systems, they have been largely neglected by work in quantitative ..."
Abstract

Cited by 30 (0 self)
We present a probabilistic logic programming framework that allows the representation of conditional probabilities. While conditional probabilities are the most commonly used method for representing uncertainty in probabilistic expert systems, they have been largely neglected by work in quantitative logic programming. We define a fixpoint theory, declarative semantics, and proof procedure for the new class of probabilistic logic programs. Compared to other approaches to quantitative logic programming, we provide a true probabilistic framework with potential applications in probabilistic expert systems and decision support systems. We also discuss the relationship between such programs and Bayesian networks, thus moving toward a unification of two major approaches to automated reasoning. To appear in Proceedings of the 1995 Asian Computing Science Conference, Pathumthani, Thailand, December 1995. This work was partially supported by NSF grant IRI-9509165.
Logical Bayesian Networks and their relation to other probabilistic logical models
 In Proceedings of the 15th International Conference on Inductive Logic Programming (ILP-05), volume 3625 of Lecture Notes in Artificial Intelligence, 2005
"... We review Logical Bayesian Networks, a language for probabilistic logical modelling, and discuss its relation to Probabilistic Relational Models and Bayesian Logic Programs. 1 Probabilistic Logical Models Probabilistic logical models are models combining aspects of probability theory with aspects of ..."
Abstract

Cited by 26 (7 self)
We review Logical Bayesian Networks, a language for probabilistic logical modelling, and discuss its relation to Probabilistic Relational Models and Bayesian Logic Programs.
1 Probabilistic Logical Models
Probabilistic logical models are models combining aspects of probability theory with aspects of Logic Programming, first-order logic or relational languages. Recently a variety of languages to describe such models has been introduced. For some languages, techniques exist to learn such models from data. Two examples are Probabilistic Relational Models (PRMs) [4] and Bayesian Logic Programs (BLPs) [5]. These two languages are probably the most popular and well-known in the Relational Data Mining community. We introduce a new language, Logical Bayesian Networks (LBNs) [2], that is strongly related to PRMs and BLPs yet solves some of their problems with respect to knowledge representation (related to expressiveness and intuitiveness). PRMs, BLPs and LBNs all follow the principle of Knowledge-Based Model Construction: they offer a language that can be used to specify general probabilistic logical knowledge, and they provide a methodology to construct a propositional model based on this knowledge when given a specific ...
CP-logic: A Language of Causal Probabilistic Events and Its Relation to Logic Programming
"... We examine the relation between constructive processes and the concept of causality. We observe that causality has an inherent dynamic aspect, i.e., that, in essence, causal information concerns the evolution of a domain over time. Motivated by this observation, we construct a new representation lan ..."
Abstract

Cited by 11 (0 self)
We examine the relation between constructive processes and the concept of causality. We observe that causality has an inherent dynamic aspect, i.e., that, in essence, causal information concerns the evolution of a domain over time. Motivated by this observation, we construct a new representation language for causal knowledge, whose semantics is defined explicitly in terms of constructive processes. This is done in a probabilistic context, where the basic steps that make up the process are allowed to have nondeterministic effects. We then show that a theory in this language defines a unique probability distribution over the possible outcomes of such a process. This result offers an appealing explanation for the usefulness of causal information and links our explicitly dynamic approach to more static causal probabilistic modeling languages, such as Bayesian networks. We also show that this language, which we have constructed to be a natural formalization of a certain kind of causal statements, is closely related to logic programming. This result demonstrates that, under an appropriate formal semantics, a rule of a normal, a disjunctive or a certain kind of probabilistic logic program can be interpreted as a description of a causal event.
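The constructive-process reading of causal information can be made concrete with a small simulation. The event representation below (a precondition plus a distribution over caused atoms) and the spark/fire story are our illustration in the spirit of the abstract, not the paper's formal syntax:

```python
# Each causal event: (precondition atoms, [(caused_atom_or_None, prob), ...]).
# `None` means the event happens but causes nothing. Names are made up.
EVENTS = [
    (set(), [("spark", 0.2), (None, 0.8)]),        # a spark may occur
    ({"spark"}, [("fire", 0.9), (None, 0.1)]),     # a spark may cause fire
]

def outcome_distribution():
    """Fire each event in order whenever its precondition holds; return the
    probability distribution over final sets of caused atoms."""
    worlds = [(frozenset(), 1.0)]
    for pre, effects in EVENTS:
        nxt = []
        for state, p in worlds:
            if pre <= state:
                for atom, q in effects:
                    new = state | {atom} if atom else state
                    nxt.append((frozenset(new), p * q))
            else:
                nxt.append((state, p))  # precondition fails; event is idle
        worlds = nxt
    dist = {}
    for state, p in worlds:
        dist[state] = dist.get(state, 0.0) + p
    return dist

dist = outcome_distribution()
print(dist[frozenset({"spark", "fire"})])  # 0.2 * 0.9, up to floating point
```

The weights over final states form a probability distribution, matching the abstract's claim that a theory defines a unique distribution over process outcomes.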
Statistical abduction with tabulation
2000
"... We propose statistical abduction as a rstorder logical framework for representing, inferring and learning probabilistic knowledge. It semantically integrates logical abduction with a parameterized distribution over abducibles. We show that statistical abduction combined with tabulated search provid ..."
Abstract

Cited by 7 (2 self)
We propose statistical abduction as a first-order logical framework for representing, inferring and learning probabilistic knowledge. It semantically integrates logical abduction with a parameterized distribution over abducibles. We show that statistical abduction combined with tabulated search provides an efficient algorithm for probability computation, a Viterbi-like algorithm for finding the most likely explanation, and an EM learning algorithm (the graphical EM algorithm) for learning parameters associated with the distribution which achieve the same computational complexity as those specialized algorithms for HMMs (hidden Markov models), PCFGs (probabilistic context-free grammars) and sc-BNs (singly connected Bayesian networks).
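Why tabulation recovers HMM-style efficiency can be seen in a toy forward computation: memoizing shared subgoals collapses the exponential sum over state paths into one entry per (time, state) pair. The model and its numbers are invented for the sketch; this is our illustration of the tabling idea, not the paper's algorithm:

```python
from functools import lru_cache

# A tiny 2-state HMM (all parameters are made-up examples).
TRANS = {"s0": {"s0": 0.7, "s1": 0.3}, "s1": {"s0": 0.4, "s1": 0.6}}
EMIT = {"s0": {"a": 0.9, "b": 0.1}, "s1": {"a": 0.2, "b": 0.8}}
INIT = {"s0": 0.5, "s1": 0.5}
OBS = ("a", "b", "a")

@lru_cache(maxsize=None)
def forward(t, state):
    """P(observations up to time t, state at time t) -- the lru_cache
    plays the role of the table: each (t, state) subgoal is solved once."""
    if t == 0:
        return INIT[state] * EMIT[state][OBS[0]]
    return EMIT[state][OBS[t]] * sum(
        forward(t - 1, prev) * TRANS[prev][state] for prev in TRANS)

prob = sum(forward(len(OBS) - 1, s) for s in TRANS)
print(prob)  # probability of observing the sequence "a b a"
```

Replacing the sum with a max (and tracking argmax choices) over the same table yields the Viterbi-like most-likely-explanation computation the abstract mentions.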
Principled Construction of Minimal Bayesian Networks from Probability Logic Knowledge Bases
 Journal of AI Research
"... We present a method for dynamically constructing Bayesian networks from knowledge bases consisting of firstorder probability logic sentences. We present a subset of probability logic sufficient for representing the class of Bayesian networks with discretevalued nodes. We impose constraints on the f ..."
Abstract

Cited by 2 (1 self)
We present a method for dynamically constructing Bayesian networks from knowledge bases consisting of first-order probability logic sentences. We present a subset of probability logic sufficient for representing the class of Bayesian networks with discrete-valued nodes. We impose constraints on the form of the sentences that guarantee that the knowledge base contains all the probabilistic information necessary to construct a network. We define the concept of d-separation for knowledge bases and prove that a knowledge base with independence conditions defined by d-separation is a complete specification of a probability distribution. We present a network construction algorithm that, given an inference problem in the form of a query Q and a set of evidence E, constructs the smallest network to compute P(Q | E). We prove the algorithm to be correct. Submitted to Journal of AI Research.