Results 1 – 10 of 12
A Guide to the Literature on Learning Probabilistic Networks From Data
, 1996
Abstract

Cited by 172 (0 self)
This literature review discusses different methods under the general rubric of learning Bayesian networks from data, and includes some overlapping work on more general probabilistic networks. Connections are drawn between the statistical, neural network, and uncertainty communities, and between the different methodological communities, such as Bayesian, description length, and classical statistics. Basic concepts for learning and Bayesian networks are introduced and methods are then reviewed. Methods are discussed for learning parameters of a probabilistic network, for learning the structure, and for learning hidden variables. The presentation avoids formal definitions and theorems, as these are plentiful in the literature, and instead illustrates key concepts with simplified examples. Keywords Bayesian networks, graphical models, hidden variables, learning, learning structure, probabilistic networks, knowledge discovery. I. Introduction Probabilistic networks or probabilistic gra...
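The parameter-learning task the review surveys can be sketched in a few lines: with complete data, the maximum-likelihood estimate of a node's conditional probability table is just normalized co-occurrence counts. The two-node "Rain → WetGrass" network and the data below are illustrative assumptions, not taken from the paper.

```python
from collections import Counter

# Hypothetical two-node network Rain -> WetGrass; variable names and data
# are made up for illustration.
data = [
    ("rain", "wet"), ("rain", "wet"), ("rain", "dry"),
    ("dry", "dry"), ("dry", "dry"), ("dry", "wet"),
]

# Maximum-likelihood estimate of P(WetGrass | Rain): count co-occurrences
# and normalize within each parent configuration.
joint = Counter(data)
parent = Counter(r for r, _ in data)
cpt = {(r, w): joint[(r, w)] / parent[r] for (r, w) in joint}

print(cpt[("rain", "wet")])  # 2/3
```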
Graphical Models for Discovering Knowledge
, 1995
Abstract

Cited by 29 (2 self)
There are many different ways of representing knowledge, and for each of these ways there are many different discovery algorithms. How can we compare different representations? How can we mix, match and merge representations and algorithms on new problems with their own unique requirements? This chapter introduces probabilistic modeling as a philosophy for addressing these questions and presents graphical models for representing probabilistic models. Probabilistic graphical models are a unified qualitative and quantitative framework for representing and reasoning with probabilities and independencies. 4.1 Introduction Perhaps one common element of the discovery systems described in this and previous books on knowledge discovery is that they are all different. Since the class of discovery problems is a challenging one, we cannot write a single program to address all of knowledge discovery. The KEFIR discovery system applied to health care by Matheus, Piatetsky-Shapiro, and McNeill (199...
A Forward Monte Carlo Method for Solving Influence Diagrams Using Local Computation
, 2000
Abstract

Cited by 10 (2 self)
The main goal of this paper is to describe a new Monte Carlo method for solving influence diagrams using local computation. We propose a forward Monte Carlo sampling technique that draws independent and identically distributed observations. Methods that have been proposed in this spirit sample from the entire distribution. However, when the number of variables is large, the state space of all variables is exponentially large, and the sample size required for good estimates may be too large to be practical. In this paper, we develop a forward Monte Carlo method, which generates observations from only a small set of chance variables for each decision node in the influence diagram. We use methods developed for exact solution of influence diagrams to limit the number of chance variables sampled at any time. Because influence diagrams model each chance variable with a conditional probability distribution, the forward Monte Carlo solution method lends itself very well to influence-diagram representations.
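The core idea of forward sampling an influence diagram can be illustrated on a toy problem: draw i.i.d. observations of a chance variable, average the resulting utilities for each decision, and pick the best. The one-variable diagram, probabilities, and utility table below are assumptions for illustration only; the paper's local-computation machinery (sampling only a small subset of chance variables per decision node) is not reproduced here.

```python
import random

random.seed(0)

# Toy influence diagram: one chance variable Demand in {low, high} and one
# decision d in {small, large}; numbers are illustrative, not from the paper.
P_DEMAND = {"low": 0.4, "high": 0.6}
UTILITY = {("small", "low"): 5, ("small", "high"): 5,
           ("large", "low"): -2, ("large", "high"): 10}

def sample_demand():
    return "low" if random.random() < P_DEMAND["low"] else "high"

def estimate_utility(decision, n=20000):
    # Forward sampling: draw i.i.d. observations of the chance variable and
    # average the resulting utilities.
    return sum(UTILITY[(decision, sample_demand())] for _ in range(n)) / n

# Exact expected utilities are 5.0 for "small" and 5.2 for "large".
best = max(["small", "large"], key=estimate_utility)
print(best)
```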
Three approaches to probability model selection
 In de Mantaras and Poole [160
, 1994
Abstract

Cited by 4 (0 self)
relative entropy, EM algorithm. This paper compares three approaches to the problem of selecting among probability models to fit data: (1) use of statistical criteria such as Akaike’s information criterion and Schwarz’s “Bayesian information criterion,” (2) maximization of the posterior probability of the model, and (3) maximization of an “effectiveness ratio” trading off accuracy and computational cost. The unifying characteristic of the approaches is that all can be viewed as maximizing a penalized likelihood function. The second approach with suitable prior distributions has been shown to reduce to the first. This paper shows that the third approach reduces to the second for a particular form of the effectiveness ratio, and illustrates all three approaches with the problem of selecting the number of components in a mixture of Gaussian distributions. Unlike the first two approaches, the third can be used even when the candidate models are chosen for computational efficiency, without regard to physical interpretation, so that the likelihoods and the prior distribution over models cannot be interpreted literally. As the most general and computationally oriented of the approaches, it is especially useful for artificial intelligence applications. 1
riso: An Implementation of Distributed Belief Networks
 In Proc. AAAI Symposium on AI in Equipment Service
, 1999
Abstract

Cited by 1 (1 self)
This paper describes riso, an implementation of distributed belief network software. Distributed belief networks are a natural extension of ordinary belief networks in which the belief network is composed of subnetworks running on separate processors. In keeping with the distributed computational model, no single processor has information about the structure of the entire distributed belief network, and inferences are to be computed using only local quantities. A general policy is proposed for publishing information as belief networks. A modeling language for the representation of distributed belief networks has been devised, and software has been implemented to compile the modeling language and carry out inferences. Belief networks may contain arbitrary conditional distributions, and new types of distributions can be defined without modifying the existing inference software. In inference, an exact result is computed if a rule is known for combining incoming partial results, and if an ...
Tools for Unified Prediction and Diagnosis in HVAC Systems: The RISO Project
Abstract

Cited by 1 (1 self)
This report describes the riso project, a system for unified prediction and diagnosis in HVAC systems based on a class of probabilistic models called belief networks. Progress has been made in both theoretical and practical problems: a scheme for the representation of belief networks with heterogeneous conditional distributions has been devised, an algorithm for inference in a polytree network with arbitrary distributions has been implemented, and software for distributed belief networks has been implemented. After reviewing the motivation for the use of belief networks, the heterogeneous polytree algorithm is described and several important details are discussed. Distributed belief networks as a framework for reasoning under uncertainty in functionally and geographically distributed systems are then described. Several interesting questions arise in connection with distributed belief networks, such as the control of communication, publishing information in probabilistic form, and copin...
Solving Hybrid Influence Diagrams with Deterministic Variables
Abstract
We describe a framework and an algorithm for solving hybrid influence diagrams with discrete, continuous, and deterministic chance variables, and discrete and continuous decision variables. A continuous chance variable in an influence diagram is said to be deterministic if its conditional distributions have zero variances. The solution algorithm is an extension of Shenoy’s fusion algorithm for discrete influence diagrams. We describe an extended Shenoy-Shafer architecture for propagation of discrete, continuous, and utility potentials in hybrid influence diagrams that include deterministic chance variables. The algorithm and framework are illustrated by solving two small examples. 1
Solving CLQG Influence Diagrams Using Arc-Reversal Operations in a Strong Junction Tree
Abstract
This paper presents an architecture for solving conditional linear-quadratic Gaussian (CLQG) influence diagrams (IDs) by Lazy Propagation (LP). A strong junction tree (SJT) is used to guide the elimination order whereas the marginalization operation is based on arc-reversal (AR). The use of AR for marginalization simplifies the implementation and gives the architecture a number of advantages. The key benefits of using LP in combination with AR to solve CLQG IDs are illustrated by examples and in experiments. The results of a preliminary performance evaluation are promising. 1
An Algorithm for Inferences in a Polytree with Heterogeneous Conditional Distributions
Abstract
This paper describes a general scheme for accommodating different types of conditional distributions in a Bayesian network. The algorithm is based on the polytree algorithm for Bayesian network inference, in which “messages” (probability distributions and likelihood functions) are computed. The posterior for a given variable depends on the messages sent to it by its parents and children, if any. In this scheme, an exact result is computed if such a result is known for the incoming messages, otherwise an approximation is computed, which is a mixture of Gaussians. The approximation may then be propagated to other variables. Approximations for likelihood functions (λ-messages) are not computed; the approximation step is put off until the likelihood function is combined with a probability distribution — this avoids certain numerical difficulties. In contrast with standard polytree algorithms, which can only accommodate distributions of a few types at most, this heterogeneous polytree algorithm can, in principle, handle any kind of continuous or discrete conditional distribution. With standard algorithms, it is necessary to construct an approximate Bayesian network, in which one then computes exact results; the heterogeneous polytree algorithm, on the other hand, computes approximate results in the original Bayesian network. The most important advantage of the new algorithm is that the Bayesian network can be directly represented using the conditional distributions most appropriate for the problem domain.
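The message-passing step the abstract refers to can be shown on the smallest possible polytree, a two-node discrete network A → B with hard evidence on B: the λ-message from B to A is the likelihood P(b | a), and the posterior of A is the normalized product of the prior (π) and that λ-message. The distributions below are illustrative assumptions; the paper's heterogeneous case, where messages may be approximated by Gaussian mixtures, is not shown.

```python
# Minimal pi/lambda message computation on a two-node polytree A -> B with
# hard evidence on B; all numbers are made up for illustration.
P_A = {"t": 0.3, "f": 0.7}                       # prior pi(A)
P_B_given_A = {("t", "t"): 0.9, ("t", "f"): 0.1,
               ("f", "t"): 0.2, ("f", "f"): 0.8}  # P(B | A)

def posterior_A(evidence_b):
    # lambda-message from B to A is the likelihood P(b | a); the posterior
    # is the normalized product of the prior and the lambda-message.
    unnorm = {a: P_A[a] * P_B_given_A[(a, evidence_b)] for a in P_A}
    z = sum(unnorm.values())
    return {a: p / z for a, p in unnorm.items()}

post = posterior_A("t")
print(post["t"])  # 0.27 / (0.27 + 0.14)
```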
Unified Prediction and Diagnosis in Engineering Systems by means of Distributed Belief
, 1999
Abstract
Dissertation directed by Professor Jan F. Kreider This dissertation describes the theory, implementation, and application of a class of graphical probability models, called distributed belief networks, for the purposes of prediction, diagnosis, and calculation of the value of information in engineering systems. Probability models have the very desirable property that several useful operations can be stated as the computation of probability distributions; prediction and diagnosis correspond to the calculation of certain posterior distributions, and the value of information can be interpreted as the calculation of an average decrease of entropy of a posterior distribution. These operations, and others, are different ways of looking at a single model — there is no need for separate models for different operations. A belief network, for the purposes of this dissertation, is a directed graph associated with a set of conditional probability distributions. Each node in the graph corresponds to a variable in a probability model. A distributed belief network is a belief network implemented on multiple processors. Posterior distributions for variables in the belief network are computed by a message-passing algorithm called the polytree algorithm. For
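The "value of information as average decrease of entropy" reading of the abstract can be computed directly for a two-variable model: it is the prior entropy of the target minus the expected posterior entropy after observing the other variable (i.e., their mutual information). The joint distribution over a target X and an observable Y below is a made-up example, not taken from the dissertation.

```python
import math

def entropy(dist):
    # Shannon entropy in bits.
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Illustrative joint over a target X and an observable Y (numbers made up).
P_XY = {("x0", "y0"): 0.4, ("x0", "y1"): 0.1,
        ("x1", "y0"): 0.1, ("x1", "y1"): 0.4}
P_X = {"x0": 0.5, "x1": 0.5}  # marginals of P_XY
P_Y = {"y0": 0.5, "y1": 0.5}

prior_H = entropy(P_X)
posterior_H = 0.0
for y, py in P_Y.items():
    cond = {x: P_XY[(x, y)] / py for x in P_X}  # P(X | Y = y)
    posterior_H += py * entropy(cond)

info_value = prior_H - posterior_H  # expected entropy decrease, I(X; Y) in bits
print(round(info_value, 4))  # ~0.2781
```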