Results 11 – 20 of 115
Stable Local Computation with Conditional Gaussian Distributions
 Statistics and Computing
, 1999
Abstract

Cited by 60 (0 self)
This article describes a propagation scheme for Bayesian networks with conditional Gaussian distributions that does not have the numerical weaknesses of the scheme derived in Lauritzen (1992). The propagation architecture is that of Lauritzen and Spiegelhalter (1988). In addition to the means and variances provided by the previous algorithm, the new propagation scheme yields full local marginal distributions. The new scheme also handles linear deterministic relationships between continuous variables in the network specification. The new propagation scheme is in many ways faster and simpler than previous schemes and the method has been implemented in the most recent version of the HUGIN software.
Key words: Artificial intelligence, Bayesian networks, CG distributions, Gaussian mixtures, probabilistic expert systems, propagation of evidence.
1 Introduction
Bayesian networks have developed into an important tool for building systems for decision support in environments characterized by...
Expectation propagation for approximate inference in dynamic Bayesian networks
 In Proceedings UAI
, 2002
Abstract

Cited by 52 (10 self)
We describe expectation propagation for approximate inference in dynamic Bayesian networks as a natural extension of Pearl's exact belief propagation. Expectation propagation is a greedy algorithm that converges in many practical cases, but not always. We derive a double-loop algorithm, guaranteed to converge to a local minimum of a Bethe free energy. Furthermore, we show that stable fixed points of (damped) expectation propagation correspond to local minima of this free energy, but that the converse need not be the case. We illustrate the algorithms by applying them to switching linear dynamical systems and discuss implications for approximate inference in general Bayesian networks.
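The projection step at the heart of expectation propagation can be illustrated with a minimal sketch: EP repeatedly replaces an intractable factor by a member of a tractable family whose moments match. The toy function below (an assumed illustration, not the paper's algorithm) collapses a two-component one-dimensional Gaussian mixture to the single Gaussian matching its first two moments.

```python
# Illustrative sketch of the moment-matching projection used by EP:
# minimizing KL(p || q) over Gaussians q amounts to matching the mean
# and variance of p. Here p is a two-component 1-D Gaussian mixture.
def match_moments(weights, means, variances):
    """Return (mean, variance) of the single Gaussian whose first two
    moments equal those of the given mixture."""
    m = sum(w * mu for w, mu in zip(weights, means))
    # E[x^2] of the mixture, then subtract the squared mean
    second = sum(w * (v + mu * mu)
                 for w, mu, v in zip(weights, means, variances))
    return m, second - m * m
```

For a symmetric mixture of N(-1, 1) and N(1, 1) with equal weights, the matched Gaussian has mean 0 and variance 2: the between-component spread inflates the variance, which is exactly the behavior the damping discussed in the abstract has to control.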
Hybrid Bayesian Networks for Reasoning about Complex Systems
, 2002
Abstract

Cited by 48 (0 self)
Many real-world systems are naturally modeled as hybrid stochastic processes, i.e., stochastic processes that contain both discrete and continuous variables. Examples include speech recognition, target tracking, and monitoring of physical systems. The task is usually to perform probabilistic inference, i.e., infer the hidden state of the system given some noisy observations. For example, we can ask what the probability is that a certain word was pronounced given the readings of our microphone, that a submarine is trying to surface given our sonar data, or that a valve is open given our pressure and flow readings. Bayesian networks are
Blocking Gibbs Sampling in Very Large Probabilistic Expert Systems
 Internat. J. Human–Computer Studies
, 1995
Abstract

Cited by 46 (0 self)
We introduce a methodology for performing approximate computations in very complex probabilistic systems (e.g. huge pedigrees). Our approach, called blocking Gibbs, combines exact local computations with Gibbs sampling in a way that complements the strengths of both. The methodology is illustrated on a real-world problem involving a heavily inbred pedigree containing 20,000 individuals. We present results showing that blocking Gibbs sampling converges much faster than plain Gibbs sampling for very complex problems.
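The contrast between single-site and blocked updates can be sketched on a toy model (illustrative only; the paper's application is pedigree analysis, and every name below is made up). For a highly correlated bivariate Gaussian, plain Gibbs updates one coordinate at a time from its conditional and mixes slowly, while sampling the pair as one block yields an exact joint draw at every step.

```python
import math
import random

rho = 0.9                          # correlation; large rho slows plain Gibbs
sd = math.sqrt(1.0 - rho * rho)    # conditional std dev of x | y (and y | x)

def plain_gibbs(n_iter, seed=0):
    """Single-site Gibbs for a zero-mean bivariate Gaussian."""
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    samples = []
    for _ in range(n_iter):
        x = rng.gauss(rho * y, sd)   # x | y ~ N(rho*y, 1 - rho^2)
        y = rng.gauss(rho * x, sd)   # y | x ~ N(rho*x, 1 - rho^2)
        samples.append((x, y))
    return samples

def block_gibbs(n_iter, seed=0):
    """'Blocked' update: draw (x, y) jointly -- an exact sample per step."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_iter):
        x = rng.gauss(0.0, 1.0)      # marginal of x is N(0, 1)
        y = rng.gauss(rho * x, sd)   # then y | x, giving an exact joint draw
        samples.append((x, y))
    return samples
```

With rho near 1, successive plain-Gibbs states are highly autocorrelated, whereas the blocked chain's draws are independent; this is the faster convergence the abstract reports, transplanted to the simplest possible setting.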
A variational approximation for Bayesian networks with discrete and continuous latent variables
 In UAI
, 1999
Abstract

Cited by 43 (6 self)
We show how to use a variational approximation to the logistic function to perform approximate inference in Bayesian networks containing discrete nodes with continuous parents. Essentially, we convert the logistic function to a Gaussian, which facilitates exact inference, and then iteratively adjust the variational parameters to improve the quality of the approximation. We demonstrate experimentally that this approximation is much faster than sampling, but comparable in accuracy. We also introduce a simple new technique for handling evidence, which allows us to handle arbitrary distributions on observed nodes, as well as achieving a significant speedup in networks with discrete variables of large cardinality.
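A standard way to convert the logistic function to Gaussian form is the Jaakkola–Jordan variational bound, which lower-bounds sigma(x) by a function that is Gaussian in x and tight at x = ±xi. The sketch below is an assumed illustration of that bound, not code from the paper.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lam(xi):
    """lambda(xi) = tanh(xi/2) / (4*xi); the limit as xi -> 0 is 1/8."""
    if abs(xi) < 1e-8:
        return 0.125
    return math.tanh(xi / 2.0) / (4.0 * xi)

def jj_lower_bound(x, xi):
    """Jaakkola-Jordan bound: sigma(x) >= sigma(xi) *
    exp((x - xi)/2 - lambda(xi) * (x^2 - xi^2)), tight at x = +/- xi.
    The exponent is quadratic in x, i.e. Gaussian in form."""
    return sigmoid(xi) * math.exp((x - xi) / 2.0
                                  - lam(xi) * (x * x - xi * xi))
```

Because the exponent is quadratic in x, multiplying this bound into a Gaussian prior keeps the result Gaussian; iteratively re-tuning xi is the variational-parameter adjustment the abstract describes.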
Local Computation with Valuations from a Commutative Semigroup
 Annals of Mathematics and Artificial Intelligence
, 1996
Abstract

Cited by 29 (8 self)
This paper studies a variant of axioms originally developed by Shafer and Shenoy (1988). It is investigated which extra assumptions are needed to perform the local computations in a HUGIN-like architecture (Jensen et al. 1990) or in the architecture of Lauritzen and Spiegelhalter (1988). In particular, it is shown that propagation of belief functions can be performed in these architectures.
Keywords: artificial intelligence, belief function, constraint propagation, expert system, probability propagation, valuation-based system.
1 Introduction
An important development in artificial intelligence is associated with an abstract theory of local computation known as the Shafer–Shenoy axioms (Shafer and Shenoy 1988; Shenoy and Shafer 1990). These describe in a very general setting how computations can be performed efficiently and locally in a variety of problems, provided that a few simple conditions are satisfied. Even though the axioms were developed to formalize computation with belief functions (Shaf...
A Probabilistic Expert System for Automatic Musical Accompaniment
 Journal of Computational and Graphical Statistics
, 1999
Abstract

Cited by 29 (5 self)
A methodology is presented that allows a computer to play the role of musical accompanist in a non-improvised musical composition for soloist and accompaniment. The modeling of the accompaniment incorporates a number of distinct knowledge sources including timing information extracted in real time from the soloist's acoustic signal, an understanding of the soloist's interpretation learned from rehearsals, and prior knowledge that guides the accompaniment toward musically plausible renditions. The solo and accompaniment parts are represented collectively as a large number of Gaussian random variables with a specified conditional independence structure: a Bayesian Belief Network. Within this framework a principled and computationally feasible method for generating real-time accompaniment is presented that incorporates the relevant knowledge sources. The EM algorithm is used to adapt the accompaniment to the soloist's interpretation through a series of rehearsals. A demonstration is provided from J.S. Bach's Cantata 12.
Learning mixtures of DAG models
, 1997
Abstract

Cited by 25 (2 self)
We describe computationally efficient methods for learning mixtures in which each component is a directed acyclic graphical model (mixtures of DAGs, or MDAGs). We argue that simple search-and-score algorithms are infeasible for a variety of problems, and introduce a feasible approach in which parameter and structure search is interleaved and expected data is treated as real data. Our approach can be viewed as a combination of (1) the Cheeseman–Stutz asymptotic approximation for model posterior probability and (2) the Expectation–Maximization algorithm. We evaluate our procedure for selecting among MDAGs on synthetic and real examples.
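The "expected data treated as real data" step rests on the Expectation–Maximization algorithm. As a minimal stand-in (not the MDAG procedure itself; data, initial values, and component count are made up), one EM iteration for a two-component one-dimensional Gaussian mixture looks like this:

```python
import math

def em_step(data, w, mu, sigma):
    """One EM update for a 2-component 1-D Gaussian mixture with
    weights w, means mu, and standard deviations sigma."""
    def normal_pdf(x, m, s):
        return math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))
    # E-step: posterior responsibility of each component for each point --
    # the "expected data" that the M-step then treats as real counts.
    resp = []
    for x in data:
        p = [w[k] * normal_pdf(x, mu[k], sigma[k]) for k in range(2)]
        z = sum(p)
        resp.append([pk / z for pk in p])
    # M-step: re-estimate parameters from the soft assignments.
    n = [sum(r[k] for r in resp) for k in range(2)]
    w_new = [n[k] / len(data) for k in range(2)]
    mu_new = [sum(r[k] * x for r, x in zip(resp, data)) / n[k] for k in range(2)]
    var = [sum(r[k] * (x - mu_new[k]) ** 2 for r, x in zip(resp, data)) / n[k]
           for k in range(2)]
    return w_new, mu_new, [math.sqrt(max(v, 1e-12)) for v in var]
```

Iterating `em_step` on well-separated data drives the means toward the two cluster centers; the Cheeseman–Stutz approximation mentioned in the abstract then scores the fitted model for the structure-selection loop.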
Inference and Learning in Hybrid Bayesian Networks
, 1998
Abstract

Cited by 25 (2 self)
We survey the literature on methods for inference and learning in Bayesian networks composed of discrete and continuous nodes, in which the continuous nodes have a multivariate Gaussian distribution whose mean and variance depend on the values of the discrete nodes. We also briefly consider hybrid dynamic Bayesian networks, an extension of switching Kalman filters. This report is meant to summarize what is known at a sufficient level of detail to enable someone to implement the algorithms, but without dwelling on formalities.
Automated Rhythm Transcription
 In Proc. Int. Symposium on Music Information Retrieval (ISMIR)
, 2001
Abstract

Cited by 24 (0 self)
We present a technique that, given a sequence of musical note onset times, performs simultaneous identification of the notated rhythm and the variable tempo associated with the times. Our formulation is probabilistic: we develop a stochastic model for the interconnected evolution of a rhythm process, a tempo process, and an observable process. This model allows the globally optimal identification of the most likely rhythm and tempo sequence, given the observed onset times. We demonstrate applications to a sequence of times derived from a sampled audio file and to MIDI data.