Results 1-9 of 9
Context-Specific Independence in Bayesian Networks
, 1996
Abstract

Cited by 296 (30 self)
Bayesian networks provide a language for qualitatively representing the conditional independence properties of a distribution. This allows a natural and compact representation of the distribution, eases knowledge acquisition, and supports effective inference algorithms.
Localized Partial Evaluation of Belief Networks
, 1995
Abstract

Cited by 43 (1 self)
Most algorithms for propagating evidence through belief networks have been exact and exhaustive: they produce an exact (point-valued) marginal probability for every node in the network. Often, however, an application will not need information about every node in the network, nor will it need exact probabilities. We present the localized partial evaluation (LPE) propagation algorithm, which computes interval bounds on the marginal probability of a specified query node by examining a subset of the nodes in the entire network. Conceptually, LPE ignores parts of the network that are "too far away" from the queried node to have much impact on its value. LPE has the "anytime" property of being able to produce better solutions (tighter intervals) given more time to consider more of the network.

1 Introduction

Belief networks provide a way of encoding knowledge about the probabilistic dependencies and independencies of a set of variables in some domain. Variables are encoded as nodes in the ne...
Local Conditioning in Bayesian Networks
 Artificial Intelligence
, 1996
Abstract

Cited by 28 (6 self)
Local conditioning (LC) is an exact algorithm for computing probability in Bayesian networks, developed as an extension of Kim and Pearl's algorithm for singly connected networks. A list of variables associated with each node guarantees that only the nodes inside a loop are conditioned on the variable which breaks it. The main advantage of this algorithm is that it computes the probability directly on the original network instead of building a cluster tree, and this can save time when debugging a model and when the sparsity of evidence allows a pruning of the network. The algorithm is also advantageous when some families in the network interact through AND/OR gates. A parallel implementation of the algorithm, with a processor for each node, is possible even in the case of multiply connected networks.

1 Introduction

A Bayesian network is an acyclic directed graph in which every node represents a random variable, together with a probability distribution such that P(x_1, ..., x_n) = ...
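As a toy illustration of the idea behind conditioning (a minimal sketch, not the paper's LC algorithm itself): in the multiply connected network A -> B, A -> C, B -> D, C -> D, instantiating the loop-cutset variable A breaks the loop A-B-D-C-A, each conditioned sub-problem is singly connected, and the marginal of D is recovered by weighting with P(a). All probabilities below are invented for illustration:

```python
# Loopy network A -> B, A -> C, B -> D, C -> D with invented CPTs.
P_A = {0: 0.6, 1: 0.4}                     # P(A = a)
P_B1 = {0: 0.2, 1: 0.7}                    # P(B = 1 | A = a)
P_C1 = {0: 0.5, 1: 0.9}                    # P(C = 1 | A = a)
P_D1 = {(0, 0): 0.1, (0, 1): 0.8,
        (1, 0): 0.6, (1, 1): 0.95}         # P(D = 1 | B = b, C = c)

def p_d_given_a(a):
    """Exact P(D = 1 | A = a): with A fixed, the network is singly connected."""
    total = 0.0
    for b in (0, 1):
        pb = P_B1[a] if b else 1 - P_B1[a]
        for c in (0, 1):
            pc = P_C1[a] if c else 1 - P_C1[a]
            total += pb * pc * P_D1[(b, c)]
    return total

# Conditioning: weight each cutset instantiation by its prior probability.
p_d = sum(P_A[a] * p_d_given_a(a) for a in (0, 1))
print(round(p_d, 4))                       # 0.6528
```

Each call to p_d_given_a works on a loop-free problem, which is what makes conditioning attractive when the cutset is small.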
Optimization of Pearl's Method of Conditioning and Greedy-Like Approximation Algorithms for the Vertex Feedback Set Problem
 Artificial Intelligence
, 1997
Abstract

Cited by 20 (4 self)
We show how to find a small loop cutset in a Bayesian network.
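One generic greedy heuristic for this problem can be sketched as follows (a hedged sketch, not necessarily the authors' algorithm): repeatedly add a highest-degree vertex of the graph's 2-core to the cutset until no cycles remain. The graph and helper names below are illustrative:

```python
def two_core(adj):
    """Return the vertices that survive repeated leaf stripping.
    adj: dict mapping vertex -> set of neighbours (undirected graph).
    Vertices left over all lie on some cycle."""
    adj = {u: set(vs) for u, vs in adj.items()}
    queue = [u for u in adj if len(adj[u]) <= 1]
    while queue:
        u = queue.pop()
        for v in adj[u]:
            adj[v].discard(u)
            if len(adj[v]) == 1:
                queue.append(v)
        adj[u] = set()
    return {u for u in adj if adj[u]}

def greedy_loop_cutset(adj):
    """Greedy heuristic: break every cycle by removing high-degree vertices."""
    adj = {u: set(vs) for u, vs in adj.items()}
    cutset = []
    while True:
        core = two_core(adj)
        if not core:                       # graph is acyclic: done
            return cutset
        u = max(core, key=lambda v: len(adj[v]))
        cutset.append(u)
        for v in adj[u]:                   # delete u and its edges
            adj[v].discard(u)
        adj[u] = set()

# Square loop A-B-C-D-A with a pendant vertex E attached to A.
graph = {'A': {'B', 'D', 'E'}, 'B': {'A', 'C'},
         'C': {'B', 'D'}, 'D': {'A', 'C'}, 'E': {'A'}}
print(greedy_loop_cutset(graph))           # removing A alone breaks the loop
```

Ties in degree are broken arbitrarily here; the paper's contribution is precisely in doing better than such naive greedy choices.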
Distributed inference in Bayesian networks
 CYBERNETICS AND SYSTEMS
, 1994
Abstract

Cited by 7 (3 self)
Bayesian networks originated as a framework for distributed reasoning. In singly connected networks, there exists an elegant inference algorithm that can be implemented in parallel with a processor for every node. It can be extended to take advantage of the OR gate, a model of interaction among causes which simplifies knowledge acquisition and evidence propagation. We also discuss two exact methods and one approximate method for dealing with general networks. All of these algorithms admit distributed implementations.
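The OR gate mentioned here is commonly formalized as the noisy-OR model: each active cause independently fails to produce the effect with an inhibitor probability q_i, so the effect is absent only if every active cause is inhibited. A minimal sketch, with made-up inhibitor values:

```python
def noisy_or(inhibitors, active):
    """P(effect = 1 | causes) under the noisy-OR model.
    inhibitors: q_i = P(cause i fails to produce the effect when active).
    active: 0/1 states of the parent causes."""
    p_off = 1.0
    for q, x in zip(inhibitors, active):
        if x:
            p_off *= q          # every active cause must be inhibited
    return 1.0 - p_off

# Two active causes with inhibitor probabilities 0.2 and 0.5:
print(noisy_or([0.2, 0.5], [1, 1]))   # 1 - 0.2*0.5 = 0.9
```

The appeal for knowledge acquisition is that n causes need only n parameters instead of a full 2^n-entry conditional table.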
Efficient Reasoning
 Computing Surveys
, 1998
Abstract

Cited by 5 (1 self)
Many tasks require "reasoning", i.e., deriving conclusions from a corpus of explicitly stored information, to solve their range of problems. An ideal reasoning system would produce all and only the correct answers to every possible query, produce answers that are as specific as possible, be expressive enough to permit any possible fact to be stored and any possible query to be asked, and be efficient. Unfortunately, this is provably impossible: as correct and precise systems become more expressive, they become increasingly inefficient, or even undecidable. This tutorial first formalizes these hardness results, in the context of both logic- and probability-based reasoning, then overviews the existing techniques now used to address, or at least sidestep, this dilemma. Throughout, we also include some alternative proposals.

1 Introduction

Many information systems use a corpus of explicitly stored information (a.k.a. a "knowledge base", KB) to solve their range of problems. For exa...
Comparison of Multiagent Inference Methods in Multiply Sectioned Bayesian Networks
 International Journal of Approximate Reasoning
, 2003
Abstract

Cited by 4 (0 self)
problem domains, many applications are found to be more suitably addressed by multiagent systems. Multiply sectioned Bayesian networks (MSBNs) provide one framework for agents to estimate the true state of a domain so that the agents can act accordingly. Existing methods for multiagent inference in MSBNs are based on linked junction forests (LJFs). The methods are extensions of message passing in junction trees for inference in single-agent Bayesian networks (BNs).
Efficient Multiple-Disorder Diagnosis by Strategic Focusing
, 1994
Abstract

Cited by 3 (0 self)
The belief network framework is becoming increasingly popular for building diagnostic knowledge-based systems. The framework is especially suited for the task of diagnosis because it provides for modelling and dealing with multiple interacting disorders. However, this ability is often exploited insufficiently due to the computational complexity involved. In this paper, we present a method for multiple-disorder diagnosis with a belief network that derives its efficiency from focusing on small sets of related disorders, which are constructed by taking advantage of the independencies portrayed by the graphical part of the network.

1 Introduction

Although diagnosing multiple disorders has been a long-standing concern of knowledge-based systems research, it is only recently that fundamental paradigms for dealing with multiple disorders have begun to arise. The paradigms of model-based reasoning [Reiter, 1987] and abductive reasoning [Peng and Reggia, 1990] especially are tuned to mult...
Comparing Alternative Methods for Inference in Multiply Sectioned Bayesian Networks
Abstract
Multiply sectioned Bayesian networks (MSBNs) provide one framework for agents to estimate the state of a domain. Existing methods for multiagent inference in MSBNs are based on linked junction forests (LJFs). The methods are extensions of message passing in junction trees for inference in single-agent Bayesian networks (BNs). We consider extending other inference methods in single-agent BNs to multiagent inference in MSBNs. In particular, we consider distributed versions of loop cutset conditioning and forward sampling. They are compared with the LJF method in terms of offline compilation, inter-agent messages during communication, consistent local inference, and preservation of agent privacy.
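Forward sampling, one of the two methods considered, is easy to sketch for a single-agent BN: sample each node after its parents in ancestral order. The two-node network and probabilities below are invented for illustration:

```python
import random

# Two-node network Rain -> WetGrass with invented probabilities.
P_RAIN = 0.3
P_WET_GIVEN_RAIN = {True: 0.9, False: 0.1}

def sample_once(rng):
    """Ancestral sampling: draw the root first, then its child."""
    rain = rng.random() < P_RAIN
    wet = rng.random() < P_WET_GIVEN_RAIN[rain]
    return rain, wet

def estimate_p_wet(n, seed=0):
    """Monte Carlo estimate of P(WetGrass = true) from n forward samples."""
    rng = random.Random(seed)
    return sum(sample_once(rng)[1] for _ in range(n)) / n

# Exact marginal is 0.3*0.9 + 0.7*0.1 = 0.34; the estimate converges to it.
print(estimate_p_wet(20000))
```

A distributed version, as the abstract suggests, would have each agent sample its own section conditioned on the values received for shared variables.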