Results 1–10 of 25
Optimal Junction Trees
In UAI, 1994
Cited by 81 (0 self)
Abstract: The paper deals with optimality issues in connection with updating beliefs in networks. We address two processes: triangulation and construction of junction trees. In the first part, we give a simple algorithm for constructing an optimal junction tree from a triangulated network. In the second part, we argue that any exact method based on local calculations must either be less efficient than the junction tree method or have an optimality problem equivalent to that of triangulation.
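The junction-tree construction this abstract refers to can be illustrated with the standard maximum-weight spanning tree over the clique graph, where edge weights are separator sizes. This is a sketch of the general technique, not necessarily the paper's specific algorithm; the function name and the three-clique example are illustrative assumptions.

```python
from itertools import combinations

def junction_tree(cliques):
    """Build a junction tree from the cliques of a triangulated graph
    via a maximum-weight spanning tree (Kruskal) on separator sizes."""
    # candidate edges, heaviest separators first
    edges = sorted(
        ((len(set(a) & set(b)), i, j)
         for (i, a), (j, b) in combinations(enumerate(cliques), 2)),
        reverse=True)
    parent = list(range(len(cliques)))   # union-find forest

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    tree = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj and w > 0:           # join components via this separator
            parent[ri] = rj
            tree.append((i, j, w))
    return tree

# illustrative cliques of a triangulated graph on A..E
tree = junction_tree([{'A', 'B', 'C'}, {'B', 'C', 'D'}, {'C', 'D', 'E'}])
```

Picking the heaviest separators first is what makes the resulting tree satisfy the running-intersection property.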
An efficient algorithm for finding the M most probable configurations in probabilistic expert systems
Statistics and Computing, 1998
Cited by 68 (3 self)
Abstract: A probabilistic expert system provides a graphical representation of a joint probability distribution which enables local computations of probabilities. Dawid (1992) provided a `flow-propagation' algorithm for finding the most probable configuration of the joint distribution in such a system. This paper analyses that algorithm in detail and shows how it can be combined with a clever partitioning scheme to formulate an efficient method for finding the M most probable configurations. The algorithm is a divide-and-conquer technique that iteratively identifies the M most probable configurations. The algorithm has been implemented in the experimental shell XBAIES, which is an extension of BAIES (Cowell, 1992). Keywords: Bayesian network, belief revision, most probable explanation, junction tree, maximization, propagation, charge, potential function, conditional independence, flow, evidence, marginalization, divide-and-conquer.
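The partitioning idea behind such M-best methods can be sketched in Lawler-style form: find the best configuration, then split the remaining space into disjoint subsets (agree with the best on a prefix, differ at one position) and keep a priority queue of subset optima. In the paper the inner maximization is done by flow propagation in a junction tree; here a brute-force maximizer over binary variables stands in for it, and all names are illustrative assumptions.

```python
import heapq
from itertools import product

def m_best(p, n, M):
    """Return the M most probable configurations of n binary variables
    under score function p, via divide-and-conquer partitioning."""
    def best(constraints):
        # brute-force max over the subset defined by `constraints`
        # (a junction-tree max-propagation would do this efficiently)
        cands = [x for x in product((0, 1), repeat=n)
                 if all(x[v] == val for v, val in constraints.items())]
        if not cands:
            return None
        x = max(cands, key=p)
        return (-p(x), x, constraints)

    heap = [best({})]
    out = []
    while heap and len(out) < M:
        negp, x, constraints = heapq.heappop(heap)
        out.append((x, -negp))
        # partition the rest of this subset: agree with x up to position i,
        # differ at i -- the subsets are disjoint and cover everything but x
        fixed = dict(constraints)
        for i in (v for v in range(n) if v not in constraints):
            c = dict(fixed)
            c[i] = 1 - x[i]
            item = best(c)
            if item:
                heapq.heappush(heap, item)
            fixed[i] = x[i]
    return out

# illustrative joint distribution over two binary variables
probs = {(0, 0): 0.4, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.1}
top = m_best(lambda x: probs[x], n=2, M=3)
```

Each pop yields the next-best configuration overall, so the loop runs exactly M times regardless of the size of the space.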
Blocking Gibbs Sampling in Very Large Probabilistic Expert Systems
Internat. J. Human–Computer Studies, 1995
Cited by 46 (0 self)
Abstract: We introduce a methodology for performing approximate computations in very complex probabilistic systems (e.g. huge pedigrees). Our approach, called blocking Gibbs, combines exact local computations with Gibbs sampling in a way that complements the strengths of both. The methodology is illustrated on a real-world problem involving a heavily inbred pedigree containing 20,000 individuals. We present results showing that blocking Gibbs sampling converges much faster than plain Gibbs sampling for very complex problems.
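The core idea — resampling whole blocks of variables jointly by exact computation, rather than one variable at a time — can be sketched on a toy three-variable model. The exact block update here is done by enumeration (in the paper it is done by local junction-tree computations on large pedigree blocks); the function name and the uniform example are illustrative assumptions.

```python
import random
from itertools import product

def blocking_gibbs(p, n_iter=16000, seed=0):
    """Blocking Gibbs sketch on three binary variables:
    (x0, x1) are resampled jointly (exactly, by enumeration) given x2,
    then x2 is resampled given the block.
    p: dict mapping (x0, x1, x2) -> unnormalized probability."""
    rng = random.Random(seed)
    x = [0, 0, 0]
    counts = {}
    for _ in range(n_iter):
        # block update: exact joint conditional of (x0, x1) given x2
        w = [(a, b, p[(a, b, x[2])]) for a in (0, 1) for b in (0, 1)]
        r = rng.random() * sum(t[2] for t in w)
        for a, b, wt in w:
            r -= wt
            if r <= 0:
                x[0], x[1] = a, b
                break
        # single-site update of x2 given the block
        w2 = [p[(x[0], x[1], v)] for v in (0, 1)]
        x[2] = 1 if rng.random() * sum(w2) < w2[1] else 0
        counts[tuple(x)] = counts.get(tuple(x), 0) + 1
    return counts

# illustrative uniform target: the sampler should visit all 8 states equally
counts = blocking_gibbs({s: 1.0 for s in product((0, 1), repeat=3)})
```

The benefit in highly correlated models is that a joint block move can cross low-probability barriers that single-site Gibbs crosses only very slowly.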
A Method for Implementing a Probabilistic Model as a Relational Database
In Eleventh Conference on Uncertainty in Artificial Intelligence, 1995
Cited by 29 (19 self)
Abstract: This paper discusses a method for implementing a probabilistic inference system based on an extended relational data model.
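The correspondence such an implementation exploits can be sketched briefly: a potential is a relation with a probability column, factor multiplication is a natural join that multiplies matching probabilities, and marginalization is a group-by projection that sums them. The function names and the two-variable example are illustrative assumptions, not the paper's schema.

```python
def join_product(r, s):
    """Natural join of two potential relations (lists of dicts with a
    'p' column): matching tuples are merged and their p's multiplied."""
    out = []
    for a in r:
        for b in s:
            shared = set(a) & set(b) - {'p'}
            if all(a[k] == b[k] for k in shared):
                t = {**a, **b}
                t['p'] = a['p'] * b['p']
                out.append(t)
    return out

def marginalize(r, keep):
    """Project onto the `keep` columns, summing the p column
    (SQL: GROUP BY keep, SUM(p))."""
    acc = {}
    for t in r:
        key = tuple((k, t[k]) for k in sorted(keep))
        acc[key] = acc.get(key, 0.0) + t['p']
    return [dict(k, p=v) for k, v in acc.items()]

# illustrative factors: P(A) and P(B | A) stored as relations
pA = [{'A': 0, 'p': 0.6}, {'A': 1, 'p': 0.4}]
fAB = [{'A': 0, 'B': 0, 'p': 0.7}, {'A': 0, 'B': 1, 'p': 0.3},
       {'A': 1, 'B': 0, 'p': 0.2}, {'A': 1, 'B': 1, 'p': 0.8}]
pB = marginalize(join_product(pA, fAB), {'B'})
```

Under this reading, a whole propagation pass is just a sequence of joins and aggregations, which is exactly what a relational engine is built to optimize.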
A Computational Scheme For Reasoning In Dynamic Probabilistic Networks
1992
Cited by 23 (0 self)
Abstract: A computational scheme for reasoning about dynamic systems using (causal) probabilistic networks is presented. The scheme is based on the framework of Lauritzen & Spiegelhalter (1988) and may be viewed as a generalization of the inference methods of classical time-series analysis, in the sense that it allows description of nonlinear, multivariate dynamic systems with complex conditional independence structures. Further, the scheme provides a method for efficient backward smoothing and possibilities for efficient, approximate forecasting methods. The scheme has been implemented on top of the HUGIN shell.
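The classical special case that such a scheme generalizes — forward filtering followed by backward smoothing in a discrete-state dynamic model — can be sketched as follows. The function and the symmetric two-state example are illustrative assumptions, not the paper's network-based scheme.

```python
def forward_backward(init, trans, emit, obs):
    """Forward filtering + backward smoothing for a discrete-state model.
    init[i]: prior P(x_0=i); trans[i][j]: P(x_{t+1}=j | x_t=i);
    emit[i][o]: P(o | x=i). Returns P(x_t | all observations) per step."""
    n = len(init)
    # forward pass: normalized filtered distributions
    prev = [init[i] * emit[i][obs[0]] for i in range(n)]
    s = sum(prev)
    alpha = [[p / s for p in prev]]
    for o in obs[1:]:
        cur = [emit[j][o] * sum(alpha[-1][i] * trans[i][j] for i in range(n))
               for j in range(n)]
        s = sum(cur)
        alpha.append([c / s for c in cur])
    # backward pass: beta_t(i) = P(future obs | x_t = i), up to scale
    beta = [[1.0] * n]
    for o in reversed(obs[1:]):
        cur = [sum(trans[i][j] * emit[j][o] * beta[0][j] for j in range(n))
               for i in range(n)]
        beta.insert(0, cur)
    # combine and renormalize: smoothed marginals
    smoothed = []
    for a, b in zip(alpha, beta):
        w = [x * y for x, y in zip(a, b)]
        s = sum(w)
        smoothed.append([x / s for x in w])
    return smoothed

# illustrative sticky two-state chain, observations all favoring state 0
smoothed = forward_backward([0.5, 0.5],
                            [[0.9, 0.1], [0.1, 0.9]],
                            [[0.8, 0.2], [0.2, 0.8]],
                            [0, 0, 0])
```

Smoothing uses evidence from both directions, which is why the abstract highlights efficient backward smoothing as a distinct capability beyond filtering.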
A `Microscopic' Study of Minimum Entropy Search in Learning Decomposable Markov Networks
Machine Learning, 1995
Cited by 23 (18 self)
Abstract: Several scoring metrics are used in different search procedures for learning probabilistic networks. We study the properties of cross entropy in learning a decomposable Markov network. Though entropy and related scoring metrics have been widely used, their `microscopic' properties and asymptotic behavior in a search have not been analyzed. We present such a `microscopic' study of a minimum entropy search algorithm, and show that it learns an I-map of the domain model when the data size is large. Search procedures that modify a network structure one link at a time have been commonly used for efficiency. Our study indicates that a class of domain models cannot be learned by such procedures. This suggests that prior knowledge about the problem domain, together with a multi-link search strategy, would provide an effective way to uncover many domain models.
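The failure mode of single-link search can be seen concretely in a parity (XOR) domain: every pair of variables is marginally independent, so adding any single link yields zero entropy decrement, yet the three variables are jointly deterministic. The helper functions below are an illustrative sketch of that phenomenon, not the paper's algorithm.

```python
from collections import Counter
from math import log2
from itertools import product

def mutual_info(samples, i, j):
    """Empirical mutual information I(X_i; X_j) in bits."""
    n = len(samples)
    pij = Counter((s[i], s[j]) for s in samples)
    pi = Counter(s[i] for s in samples)
    pj = Counter(s[j] for s in samples)
    return sum(c / n * log2((c / n) / ((pi[a] / n) * (pj[b] / n)))
               for (a, b), c in pij.items())

def entropy(samples, idxs):
    """Empirical joint entropy (bits) of the variables at positions idxs."""
    n = len(samples)
    c = Counter(tuple(s[i] for i in idxs) for s in samples)
    return -sum(v / n * log2(v / n) for v in c.values())

# XOR domain: x2 = x0 ^ x1 with x0, x1 uniform (100 copies of each row)
samples = [(a, b, a ^ b) for a, b in product((0, 1), repeat=2)] * 100
```

Here every pairwise mutual information is exactly zero, so a greedy one-link-at-a-time entropy search sees no gain anywhere, even though the joint entropy (2 bits) is well below the sum of the marginal entropies (3 bits); only a multi-link step exposes the dependence.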
Causal Probabilistic Networks With Both Discrete and Continuous Variables
1993
Cited by 18 (0 self)
Abstract: An extension of the expert system shell HUGIN to include continuous variables, in the form of linear additive normally distributed variables, is presented. The ...
A Genetic algorithm to approximate convex sets of probabilities
1996
Cited by 18 (3 self)
Abstract: An Evolution Program is presented to propagate convex sets of probabilities. This algorithm is useful when the number of extreme points in the `a posteriori' convex set for a variable is too high and only a single probabilistic propagation is feasible. We have tested the algorithm on a random causal network with a random number of conditional probabilities for each of the variables. The experimental evaluation shows that the resulting intervals obtained for the cases of a variable are similar to those obtained using an exact method of propagation.
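The evolutionary search ingredient can be sketched generically: binary strings stand in for choices among extreme points, and selection, crossover, and mutation search for a high-scoring configuration without enumerating the whole set. This is a minimal generic GA, not the paper's Evolution Program; all names and parameter values are illustrative assumptions.

```python
import random

def ga_max(score, n_bits, pop_size=30, gens=60, seed=1):
    """Minimal genetic algorithm: truncation selection, one-point
    crossover, occasional single-bit mutation. Maximizes `score`
    over binary strings of length n_bits."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=score, reverse=True)
        parents = pop[:pop_size // 2]        # keep the better half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_bits)   # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.1:           # rare single-bit mutation
                i = rng.randrange(n_bits)
                child[i] = 1 - child[i]
            children.append(child)
        pop = parents + children
    return max(pop, key=score)

# illustrative fitness: one-max (count of 1-bits)
best = ga_max(sum, 12)
```

In the paper's setting, evaluating an individual would amount to one probabilistic propagation, which is why the approach pays off precisely when exhaustive extreme-point enumeration does not.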
Reduction of Computational Complexity in Bayesian Networks through Removal of Weak Dependences
In Proc. Tenth Conf. on Uncertainty in Artificial Intelligence, 1994
Cited by 16 (0 self)
Abstract: The paper presents a method for reducing the computational complexity of Bayesian networks through identification and removal of weak dependences (removal of links from the (moralized) independence graph). The removal of a small number of links may reduce the computational complexity dramatically, since several fill-ins and moral links may be rendered superfluous by the removal. The method is described in terms of its impact on the independence graph, the junction tree, and the potential functions associated with these. An empirical evaluation of the method using large real-world networks demonstrates its applicability. Further, the method, which has been implemented in Hugin, complements the approximation method suggested by Jensen & Andersen (1990).
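A natural way to quantify how "weak" a dependence is — and one standard choice for this kind of method, though not necessarily the paper's exact criterion — is the KL divergence between a pairwise marginal and the product of its margins, which is zero exactly when the link carries no information. The function name and example tables are illustrative assumptions.

```python
from math import log2

def link_weakness(pab):
    """KL divergence D(p(A,B) || p(A) p(B)) in bits: the information
    lost by dropping the direct A-B link. Zero iff A, B independent.
    pab: dict mapping (a, b) -> joint probability."""
    pa, pb = {}, {}
    for (a, b), v in pab.items():          # accumulate the margins
        pa[a] = pa.get(a, 0.0) + v
        pb[b] = pb.get(b, 0.0) + v
    return sum(v * log2(v / (pa[a] * pb[b]))
               for (a, b), v in pab.items() if v > 0)

# illustrative tables: an independent pair and a strongly correlated pair
weak = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
strong = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
```

Links with weakness near zero are the candidates for removal, since deleting them perturbs the joint distribution little while potentially eliminating fill-ins and moral links downstream.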
Optimality issues in constructing a Markov tree from graphical models
Computational and Graphical Statistics, 1991
Cited by 16 (0 self)
Abstract: Several recent papers have described probability models which use graphs and hypergraphs to represent relationships among the variables. Two related computing algorithms are commonly used to manipulate such models: the peeling algorithm, which eliminates variables one at a time to find the marginal distribution of a single variable, and the fusion and propagation algorithm, which simultaneously solves for many marginal distributions by passing messages in a Tree of Cliques whose nodes correspond to subsets of variables. The peeling algorithm requires an elimination order. As demonstrated in this paper, the elimination order can in turn be used to construct a Tree of Cliques for propagation and fusion. This paper addresses three computational issues: (1) The choice of elimination order determines the size of the largest node of the Tree of Cliques, which dominates the computational cost for the probability model using either peeling or fusion and propagation. We review heuristics for choosing an elimination order. (2) Inserting intersection nodes into the Tree of Cliques produces a junction tree which has a lower computational cost. We present an algorithm which produces a junction tree with high computational efficiency. (3) Augmenting the Tree of Cliques with additional nodes can lead to a new tree structure which more clearly expresses the relationship between the original graphical model and the tree model.
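One widely used heuristic of the kind reviewed in point (1) is greedy min-fill: repeatedly eliminate the vertex whose elimination adds the fewest fill-in edges, recording the largest clique created. This is a sketch of the standard heuristic, not the paper's specific review; the function name and the small examples are illustrative assumptions.

```python
def min_fill_order(graph):
    """Greedy min-fill elimination ordering.
    graph: dict vertex -> set of neighbours (undirected).
    Returns (elimination order, size of the largest clique created)."""
    g = {v: set(nbrs) for v, nbrs in graph.items()}   # working copy
    order, width = [], 0
    while g:
        def fill(v):
            # number of fill-in edges eliminating v would add
            nb = list(g[v])
            return sum(1 for i in range(len(nb))
                       for j in range(i + 1, len(nb))
                       if nb[j] not in g[nb[i]])
        v = min(g, key=fill)
        nb = g[v]
        width = max(width, len(nb) + 1)   # clique = v plus its neighbours
        for a in nb:                      # connect neighbours (fill-ins)
            for b in nb:
                if a != b:
                    g[a].add(b)
        for a in nb:                      # remove v from the graph
            g[a].discard(v)
        del g[v]
        order.append(v)
    return order, width

# illustrative graphs: a chain (largest clique 2) and a 4-cycle
# (one fill-in is forced, so the largest clique created has size 3)
chain = {'A': {'B'}, 'B': {'A', 'C'}, 'C': {'B', 'D'}, 'D': {'C'}}
order, width = min_fill_order(chain)
```

The returned width is exactly the size of the largest node of the resulting Tree of Cliques, the quantity point (1) identifies as dominating the cost of both peeling and fusion-and-propagation.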