Results 1 – 10 of 20
Compiling relational Bayesian networks for exact inference
International Journal of Approximate Reasoning, 2004
Abstract

Cited by 54 (11 self)
We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available Primula tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating and differentiating these circuits in time linear in their size. We report on experimental results showing successful compilation and efficient inference on relational Bayesian networks whose Primula-generated propositional instances have thousands of variables, and whose jointrees have clusters with hundreds of variables.
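The evaluate-and-differentiate scheme described above can be sketched in a few lines. This is a minimal illustration of the idea, not the Primula system: a toy arithmetic circuit for a single binary variable A with P(A=a)=0.3 is evaluated bottom-up and differentiated top-down, each in one pass linear in the circuit size, and the partial derivative with respect to an evidence indicator reads off the marginal P(A=a).

```python
# Minimal sketch of evaluating and differentiating an arithmetic circuit
# in time linear in its size (illustrative only, not the Primula system).

class Node:
    def __init__(self, op, children=(), value=0.0):
        self.op = op                  # 'leaf', '+', or '*'
        self.children = list(children)
        self.value = value            # set for leaves (indicators/parameters)
        self.deriv = 0.0              # partial of the root w.r.t. this node

def topo_order(root):
    order, seen = [], set()
    def visit(n):
        if id(n) in seen:
            return
        seen.add(id(n))
        for c in n.children:
            visit(c)
        order.append(n)
    visit(root)
    return order

def evaluate(root):
    # One upward pass: each node visited once.
    for n in topo_order(root):
        if n.op == '+':
            n.value = sum(c.value for c in n.children)
        elif n.op == '*':
            p = 1.0
            for c in n.children:
                p *= c.value
            n.value = p
    return root.value

def differentiate(root):
    # One downward pass: partials of the root w.r.t. every node.
    order = topo_order(root)
    for n in order:
        n.deriv = 0.0
    root.deriv = 1.0
    for n in reversed(order):
        for c in n.children:
            if n.op == '+':
                c.deriv += n.deriv
            else:  # product node: multiply by the other children's values
                others = 1.0
                for o in n.children:
                    if o is not c:
                        others *= o.value
                c.deriv += n.deriv * others

# Circuit for one binary variable A with P(A=a) = 0.3:
# f = theta_a * ind_a + theta_na * ind_na
theta_a = Node('leaf', value=0.3)
theta_na = Node('leaf', value=0.7)
ind_a = Node('leaf', value=1.0)   # evidence indicator for A = a
ind_na = Node('leaf', value=1.0)  # 1.0 means "not ruled out by evidence"
root = Node('+', [Node('*', [theta_a, ind_a]), Node('*', [theta_na, ind_na])])

print(evaluate(root))   # probability of evidence (1.0: none asserted)
differentiate(root)
print(ind_a.deriv)      # 0.3 = P(A=a), read off as a circuit partial
```

Marginals for every variable come out of the single differentiation pass, which is what makes the linear-time claim useful in practice.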
MEBN: A Language for First-Order Bayesian Knowledge Bases
Abstract

Cited by 45 (18 self)
Although classical first-order logic is the de facto standard logical foundation for artificial intelligence, the lack of a built-in, semantically grounded capability for reasoning under uncertainty renders it inadequate for many important classes of problems. Probability is the best-understood and most widely applied formalism for computational scientific reasoning under uncertainty. Increasingly expressive languages are emerging for which the fundamental logical basis is probability. This paper presents Multi-Entity Bayesian Networks (MEBN), a first-order language for specifying probabilistic knowledge bases as parameterized fragments of Bayesian networks. MEBN fragments (MFrags) can be instantiated and combined to form arbitrarily complex graphical probability models. An MFrag represents probabilistic relationships among a conceptually meaningful group of uncertain hypotheses. Thus, MEBN facilitates representation of knowledge at a natural level of granularity. The semantics of MEBN assigns a probability distribution over interpretations of an associated classical first-order theory on a finite or countably infinite domain. Bayesian inference provides both a proof theory for combining prior knowledge with observations, and a learning theory for refining a representation as evidence accrues. A proof is given that MEBN can represent a probability distribution on interpretations of any finitely axiomatizable first-order theory.
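The instantiate-and-combine idea behind MFrags can be illustrated with a toy grounding routine. The fragment, entity names, and probabilities below are all hypothetical, and real MFrags also carry context constraints that this sketch omits:

```python
# Toy illustration of instantiating a parameterized network fragment
# (MFrag-style) once per entity and combining the results into one
# ground Bayesian network. All names and numbers are made up.

def instantiate_fragment(entity):
    # Hypothetical fragment: HasFlu(e) -> Fever(e), parameterized by e.
    flu = f"HasFlu({entity})"
    fever = f"Fever({entity})"
    return {flu: {'parents': [], 'cpt': {(): 0.1}},
            fever: {'parents': [flu],
                    'cpt': {(True,): 0.8, (False,): 0.05}}}

def ground_network(entities):
    # Combine one instantiation per entity into a single network.
    network = {}
    for e in entities:
        network.update(instantiate_fragment(e))
    return network

bn = ground_network(['alice', 'bob'])
print(sorted(bn))  # four ground nodes, two per entity
```

The same fragment yields arbitrarily large ground models as the entity set grows, which is the sense in which MFrags compose into "arbitrarily complex graphical probability models".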
General-purpose MCMC inference over relational structures
In Proceedings of the Twenty-Second Conference on Uncertainty in Artificial Intelligence (UAI-06)
Abstract

Cited by 22 (6 self)
Tasks such as record linkage and multi-target tracking, which involve reconstructing the set of objects that underlie some observed data, are particularly challenging for probabilistic inference. Recent work has achieved efficient and accurate inference on such problems using Markov chain Monte Carlo (MCMC) techniques with customized proposal distributions. Currently, implementing such a system requires coding MCMC state representations and acceptance probability calculations that are specific to a particular application. An alternative approach, which we pursue in this paper, is to use a general-purpose probabilistic modeling language (such as BLOG) and a generic Metropolis-Hastings MCMC algorithm that supports user-supplied proposal distributions. Our algorithm gains flexibility by using MCMC states that are only partial descriptions of possible worlds; we provide conditions under which MCMC over partial worlds yields correct answers to queries. We also show how to use a context-specific Bayes net to identify the factors in the acceptance probability that need to be computed for a given proposed move. Experimental results on a citation matching task show that our general-purpose MCMC engine compares favorably with an application-specific system.
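The generic Metropolis-Hastings loop with a user-supplied proposal, as described above, can be sketched as follows. The 1-D Gaussian target and random-walk proposal are illustrative stand-ins, not the relational state representation the paper works with:

```python
import math
import random

# Generic Metropolis-Hastings skeleton with a user-supplied proposal,
# shown on a toy 1-D target (not the BLOG engine itself).

def metropolis_hastings(log_p, propose, log_q, x0, n_steps, rng):
    x, samples = x0, []
    for _ in range(n_steps):
        y = propose(x, rng)
        # Log acceptance ratio: target ratio plus proposal correction.
        log_alpha = (log_p(y) - log_p(x)) + (log_q(x, y) - log_q(y, x))
        if math.log(rng.random()) < min(0.0, log_alpha):
            x = y  # accept the proposed move
        samples.append(x)
    return samples

# Toy target: standard normal; symmetric random-walk proposal.
log_p = lambda x: -0.5 * x * x
propose = lambda x, rng: x + rng.gauss(0.0, 1.0)
log_q = lambda a, b: 0.0  # symmetric proposal, so the correction cancels

rng = random.Random(0)
samples = metropolis_hastings(log_p, propose, log_q, 0.0, 20000, rng)
mean = sum(samples) / len(samples)
print(round(mean, 2))  # close to 0 for a standard normal target
```

The paper's contribution sits inside `log_p` and `log_alpha`: with partial-world states and a context-specific Bayes net, only the factors touched by the proposed move need to be recomputed.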
MEBN: A Logic for Open-World Probabilistic Reasoning
Research Paper, 2004
Abstract

Cited by 20 (8 self)
Uncertainty is a fundamental and irreducible aspect of our knowledge about the world. Probability is the most well-understood and widely applied logic for computational scientific reasoning under uncertainty. As theory and practice advance, general-purpose languages are beginning to emerge for which the fundamental logical basis is probability. However, such languages have lacked a logical foundation that fully integrates classical first-order logic with probability theory. This paper presents such an integrated logical foundation. A formal specification is presented for multi-entity Bayesian networks (MEBN), a knowledge representation language based on directed graphical probability models. A proof is given that a probability distribution over interpretations of any consistent, finitely axiomatizable first-order theory can be defined using MEBN. A semantics based on random variables provides a logically coherent foundation for open-world reasoning and a means of analyzing tradeoffs between accuracy and computation cost. Furthermore, the underlying Bayesian logic is inherently open, having the ability to absorb new facts about the world, incorporate them into existing theories, and/or modify theories in the light of evidence. Bayesian inference provides both a proof theory for combining prior knowledge with observations, and a learning theory for refining a representation as evidence accrues. The results of this paper provide a logical foundation for the rapidly evolving literature on first-order Bayesian knowledge representation, and point the way toward Bayesian languages suitable for general-purpose knowledge representation and computing. Because first-order Bayesian logic contains classical first-order logic as a deterministic subset, it is a natural candidate as a universal representation for integrating domain ontologies expressed in languages based on classical first-order logic or subsets thereof.
Parameter learning for relational Bayesian networks
In Proceedings of the International Conference on Machine Learning, 2007
Abstract

Cited by 12 (2 self)
We present a method for parameter learning in relational Bayesian networks (RBNs). Our approach consists of compiling the RBN model into a computation graph for the likelihood function, and then using this likelihood graph to perform the necessary computations for a gradient ascent likelihood optimization procedure. The method can be applied to all RBN models that contain only differentiable combining rules. This includes models with non-decomposable combining rules, as well as models with weighted combinations or nested occurrences of combining rules. Experimental results on artificial random graph data explore the feasibility of the approach for both complete and incomplete data.
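The gradient-ascent part of the approach can be illustrated on the simplest possible likelihood, a single Bernoulli parameter; the combining rules and likelihood-graph compilation that the paper actually contributes are omitted from this sketch:

```python
import math

# Sketch of gradient-ascent parameter learning on a likelihood function.
# A single Bernoulli parameter stands in for the RBN parameters; the
# gradient here is written by hand where the paper uses a compiled
# likelihood graph to compute it.

def log_likelihood(theta, heads, tails):
    return heads * math.log(theta) + tails * math.log(1.0 - theta)

def gradient(theta, heads, tails):
    # d/dtheta of the log-likelihood above.
    return heads / theta - tails / (1.0 - theta)

def fit(heads, tails, lr=0.001, steps=5000):
    theta = 0.5
    for _ in range(steps):
        theta += lr * gradient(theta, heads, tails)
        theta = min(max(theta, 1e-6), 1 - 1e-6)  # keep theta in (0, 1)
    return theta

theta_hat = fit(heads=7, tails=3)
print(round(theta_hat, 3))  # converges to the MLE 7/10 = 0.7
```

For differentiable combining rules the same loop applies unchanged; only the gradient computation, delegated to the likelihood graph, becomes more involved.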
New advances in logic-based probabilistic modeling by PRISM
Probabilistic Inductive Logic Programming, 2008
Abstract

Cited by 11 (6 self)
We review the logic-based modeling language PRISM and report recent developments, including belief propagation by the generalized inside-outside algorithm and generative modeling with constraints. The former implies that PRISM subsumes belief propagation at the algorithmic level. We also compare the performance of PRISM with state-of-the-art systems in statistical natural language processing and in probabilistic inference in Bayesian networks, respectively, and show that PRISM is reasonably competitive.
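The kind of sum-product computation that belief propagation (and the generalized inside-outside algorithm) performs can be illustrated by the forward pass on a two-state hidden Markov chain; all numbers below are made up:

```python
# Minimal forward (inside-style) message passing on a two-state hidden
# Markov chain: the sum-product computations that belief propagation and
# the inside-outside algorithm generalize. All probabilities are made up.

init = [0.6, 0.4]                    # P(s_1)
trans = [[0.7, 0.3], [0.2, 0.8]]     # P(s_t | s_{t-1})
emit = [[0.9, 0.1], [0.3, 0.7]]      # P(o_t | s_t), two observation symbols

def forward(obs):
    # alpha[s] = P(o_1..o_t, s_t = s), updated left to right.
    alpha = [init[s] * emit[s][obs[0]] for s in range(2)]
    for o in obs[1:]:
        alpha = [sum(alpha[sp] * trans[sp][s] for sp in range(2)) * emit[s][o]
                 for s in range(2)]
    return sum(alpha)  # probability of the observation sequence

p = forward([0, 1, 1])
print(p)  # a probability strictly between 0 and 1
```

In PRISM the analogous messages are computed over explanation graphs derived from the logic program rather than over a fixed chain, which is how one language subsumes both this forward pass and general belief propagation.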
First-Order Bayesian Logic
2005
Abstract

Cited by 8 (3 self)
Uncertainty is a fundamental and irreducible aspect of our knowledge about the world. Until recently, classical first-order logic has reigned as the de facto standard logical foundation for artificial intelligence. The lack of a built-in, semantically grounded capability for reasoning under uncertainty renders classical first-order logic inadequate for many important classes of problems. General-purpose languages are beginning to emerge for which the fundamental logical basis is probability. Increasingly expressive probabilistic languages demand a theoretical foundation that fully integrates classical first-order logic and probability. In first-order Bayesian logic (FOBL), probability distributions are defined over interpretations of classical first-order axiom systems. Predicates and functions of a classical first-order theory correspond to random variables in the corresponding first-order Bayesian theory. This is a natural correspondence, given that random variables are formalized in mathematical statistics as measurable functions on a probability space. A formal system called Multi-Entity Bayesian Networks (MEBN) is presented for composing distributions on interpretations by instantiating and combining parameterized fragments of directed graphical models. A construction is given of a MEBN theory that assigns a nonzero
Propositional and relational Bayesian networks associated with imprecise and qualitative probabilistic assessments
In Proceedings of the 20th Annual Conference on Uncertainty in Artificial Intelligence, 2004
Abstract

Cited by 8 (4 self)
This paper investigates a representation language with flexibility inspired by probabilistic logic and compactness inspired by relational Bayesian networks. The goal is to handle propositional and first-order constructs together with precise, imprecise, indeterminate and qualitative probabilistic assessments. The paper shows how this can be achieved through the theory of credal networks. New exact and approximate inference algorithms based on multilinear programming and iterated/loopy propagation of interval probabilities are presented; their performance is shown empirically to be superior to that of existing algorithms.
The Occlusion Calculus
In Proc. Workshop on Cognitive Vision, 2002
Abstract

Cited by 5 (1 self)
A lot of effort in Qualitative Reasoning has been spent on the RCC8 calculus. This paper proposes a calculus named OCC (Occlusion Calculus), closely related to Galton's LOS14 calculus, that is more expressive in a vision context. The OCC relations qualitatively describe configurations of two convex objects in the projective view of a 3D scene. To set OCC on a mathematical ground, an axiomatisation of the derived relation calculus is given. Since OCC focuses on only one qualitative aspect of space, we sketch how and when different calculi can be combined to assemble a knowledge base for a cognitive vision system at a conceptual level.
Generative Modeling by PRISM
Abstract

Cited by 5 (0 self)
PRISM is a probabilistic extension of Prolog. It is a high-level language for probabilistic modeling capable of learning statistical parameters from observed data. After reviewing it from various viewpoints, we examine some technical details related to logic programming, including semantics, search and program synthesis.