Results 1–10 of 12
Inference in belief networks: A procedural guide
 International Journal of Approximate Reasoning
, 1996
Abstract

Cited by 149 (6 self)
Belief networks are popular tools for encoding uncertainty in expert systems. These networks rely on inference algorithms to compute beliefs in the context of observed evidence. One established method for exact inference on belief networks is the Probability Propagation in Trees of Clusters (PPTC) algorithm, as developed by Lauritzen and Spiegelhalter and refined by Jensen et al. [1, 2, 3]. PPTC converts the belief network into a secondary structure, then computes probabilities by manipulating the secondary structure. In this document, we provide a self-contained, procedural guide to understanding and implementing PPTC. We synthesize various optimizations to PPTC that are scattered throughout the literature. We articulate undocumented "open secrets" that are vital to producing a robust and efficient implementation of PPTC. We hope that this document makes probabilistic inference more accessible and affordable to those without extensive prior exposure.
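The potential manipulation that junction-tree methods such as PPTC build on can be illustrated in a heavily simplified form on a hypothetical two-node network A → B. The numbers below are made up, and this is only a sketch of the basic operations (multiply, enter evidence, marginalize, normalize), not the PPTC algorithm itself.

```python
import numpy as np

# Hypothetical two-node network A -> B; a sketch of the elementary
# potential operations underlying junction-tree inference, not PPTC.
p_a = np.array([0.6, 0.4])            # P(A), hypothetical numbers
p_b_given_a = np.array([[0.9, 0.1],   # P(B | A=0)
                        [0.2, 0.8]])  # P(B | A=1)

phi = p_a[:, None] * p_b_given_a      # joint potential phi(A, B)
phi_evidence = phi[:, 1]              # enter evidence B = 1
posterior_a = phi_evidence / phi_evidence.sum()  # P(A | B=1)
```

PPTC performs the same kinds of operations, but on clique potentials of a secondary tree structure rather than on the full joint.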
Lazy Propagation in Junction Trees
 In Proc. 14th Conf. on Uncertainty in Artificial Intelligence
, 1998
Abstract

Cited by 36 (8 self)
The efficiency of algorithms using secondary structures for probabilistic inference in Bayesian networks can be improved by exploiting independence relations induced by evidence and the direction of the links in the original network. In this paper we present an algorithm that exploits the independence relations induced by evidence and the direction of the links in the original network to reduce both time and space costs. Instead of multiplying the conditional probability distributions for the various cliques, we store a script specifying which potentials to multiply when a message is to be produced. The performance improvement of the algorithm is emphasized through empirical evaluations involving large real-world Bayesian networks, and we compare the method with the Hugin and Shafer-Shenoy inference algorithms.
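The "script" idea in the abstract above can be sketched as deferred multiplication: a message records which potentials to combine and forms the product only on demand. This is an illustrative sketch under assumed one-variable potentials, not the paper's implementation.

```python
import numpy as np

# Illustrative sketch of lazily combined messages: store the factors,
# multiply only when the message must actually be produced.
class LazyMessage:
    def __init__(self, factors):
        self.factors = factors            # recorded, not yet multiplied

    def materialize(self):
        # multiply the recorded potentials on demand
        out = np.ones_like(self.factors[0])
        for f in self.factors:
            out = out * f
        return out

# hypothetical one-variable potentials
msg = LazyMessage([np.array([0.5, 0.5]), np.array([0.9, 0.1])])
table = msg.materialize()
```

Keeping the factor list around is what lets evidence-induced independences prune some factors before any multiplication happens.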
Efficient computation for the noisy max
 International Journal of Intelligent Systems
, 2002
Abstract

Cited by 21 (3 self)
Díez’s algorithm for the noisy MAX is very efficient for polytrees, but when the network has loops it has to be combined with local conditioning, a suboptimal propagation algorithm. Other algorithms, based on several factorizations of the conditional probability of the noisy MAX, are not as efficient for polytrees, but can be combined with general propagation algorithms, such as clustering or variable elimination, which are more efficient for networks with loops. In this paper we propose a new factorization of the noisy MAX that amounts to Díez’s algorithm in the case of polytrees and at the same time is more efficient than previous factorizations when combined with either variable elimination or clustering.
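As a point of reference for the canonical model these factorizations exploit: the noisy OR is the binary special case of the noisy MAX, and its conditional probability table can be sketched as below. The inhibitor probabilities are hypothetical, and this illustrates the canonical model itself, not the factorization proposed in the paper.

```python
import itertools

def noisy_or_cpt(q):
    """P(Y=1 | x1..xn) for a noisy-OR gate, the binary special case of
    the noisy MAX.  q[i] is parent i's inhibitor probability."""
    cpt = {}
    for x in itertools.product([0, 1], repeat=len(q)):
        p_off = 1.0                      # probability Y stays 0
        for xi, qi in zip(x, q):
            if xi:
                p_off *= qi              # each active parent may be inhibited
        cpt[x] = 1.0 - p_off
    return cpt

# hypothetical inhibitor probabilities for two parents
cpt = noisy_or_cpt([0.1, 0.2])
# cpt[(1, 1)] = 1 - 0.1 * 0.2 = 0.98
```

The product form of `p_off` is exactly the structure that factorized representations decompose so that propagation algorithms never build the full exponential-size table.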
An efficient factorization for the noisy MAX
 International Journal of Intelligent Systems
Abstract

Cited by 9 (0 self)
Díez’s algorithm for the noisy MAX is very efficient for polytrees, but when the network has loops it has to be combined with local conditioning, a suboptimal propagation algorithm. Other algorithms, based on several factorizations of the conditional probability of the noisy MAX, are not as efficient for polytrees, but can be combined with general propagation algorithms, such as clustering or variable elimination, which are more efficient for networks with loops. In this paper we propose a new factorization of the noisy MAX that amounts to Díez’s algorithm in the case of polytrees and at the same time is more efficient than previous factorizations when combined with either variable elimination or clustering.
Parallelization of Inference in Bayesian Networks
, 1999
Abstract

Cited by 4 (1 self)
This report gives a survey of different approaches to parallelization of probabilistic inference in Bayesian networks. Results from preliminary experiments are presented, and the conclusion is that the largest performance improvements are obtained with low-level parallelization of the potential operations performed during inference.
A Bayesian network based framework for multicriteria decision making
 in MCDM 2004
, 2004
Abstract

Cited by 3 (0 self)
Summary: Multi-Criteria Decision Making (MCDM) involves the selection of the best actions from a set of alternatives, each of which is evaluated against multiple, and often conflicting, criteria. Most existing MCDM methods focus only on decisions under certainty, and the criteria are evaluated separately as if they were independent of each other. Complex, often uncertain interactions between criteria, and between criteria and other factors, are not modeled in a coherent and systematic manner. To address these issues, we propose in this paper a decision framework based on Bayesian networks (BN) and influence diagrams (ID) to structure and manage MCDM problems with explicit modeling of uncertain interactions among entities of interest. In this framework, a decision problem is represented by an ID in which each decision node represents the set of alternatives for a decision, a utility node represents the set of objectives (the decision maker’s preferences), and decision criteria and the internal or external factors that may affect them are represented by chance nodes. Interdependencies among these nodes are modeled qualitatively by the links in the diagram and quantitatively by the conditional probability tables (CPT) associated with each chance node and the utility node. The joint probability distribution, which is compactly captured by the network structure and CPTs, encodes the domain expert’s knowledge of the interdependencies between variables. The decision problem is then treated as an optimization problem:
Tools for explanation in Bayesian networks with application to an agricultural problem
 In Proceedings of the
, 1997
Simplifying Explanations in Bayesian Belief Networks
 International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems (IJUFKS)
, 1994
Abstract

Cited by 2 (1 self)
Abductive inference in Bayesian belief networks is intended as the process of generating the K most probable configurations given observed evidence. These configurations are called explanations, and in most of the approaches found in the literature all the explanations have the same number of literals. In this paper we study how to simplify the explanations in such a way that the resulting configurations still account for the observed facts.
Handling Manipulated Evidence
, 2004
Abstract

Cited by 1 (1 self)
Bayesian Networks have been advocated as a useful tool to describe the relations of dependence/independence among random variables and relevant hypotheses in a crime case. Moreover, they have been applied to help the investigator structure the problem and weight the observed evidence, typically with respect to the hypothesis of guilt of a suspect. In this paper we describe a model to handle the possibility that one or more pieces of evidence have been manipulated in order to mislead the investigations. This method is based on causal inference models, although it is developed in a different, specific framework.
A Differential Semantics of Lazy AR Propagation
Abstract

Cited by 1 (1 self)
In this paper we present a differential semantics of Lazy AR Propagation (LARP) in discrete Bayesian networks. We describe how both single- and multi-dimensional partial derivatives of the evidence may easily be calculated from a junction tree in LARP equilibrium. We show that the simplicity of the calculations stems from the nature of LARP. Based on the differential semantics, we describe how variable propagation in the LARP architecture may give access to additional partial derivatives. The cautious LARP (cLARP) scheme is derived to produce a flexible cLARP equilibrium that offers additional opportunities for calculating single- and multi-dimensional partial derivatives of the evidence and subsets of the evidence from a single propagation. The results of an empirical evaluation illustrate how access to a largely increased number of partial derivatives comes at a low computational cost.