Results 1–10 of 1,140
Repository CRAN, 2012
"... Description: The R package BDgraph is a statistical tool for Bayesian model determination in undirected Gaussian graphical models. The Bayesian methodology is based on a birth-death MCMC algorithm. The main function is 'bdmcmc', a birth-death MCMC algorithm for Bayesian model determination in ..."
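The birth-death move structure this entry describes can be sketched generically: a continuous-time chain that adds (birth) or deletes (death) one edge at a time, with jump rates proportional to posterior ratios, and waiting times weighting each visited graph. This is a toy illustration, not BDgraph's implementation: the `log_score` function is a stand-in for a real marginal likelihood, and all names are illustrative.

```python
import itertools
import math
import random

random.seed(0)

# Toy unnormalized log-posterior over edge sets of a 4-node graph:
# prefer a fixed "true" edge set, penalize extra or missing edges.
NODES = range(4)
ALL_EDGES = list(itertools.combinations(NODES, 2))
TRUE = {(0, 1), (1, 2), (2, 3)}

def log_score(edges):
    # symmetric-difference penalty: a stand-in for a real marginal likelihood
    return -2.0 * len(edges.symmetric_difference(TRUE))

def bd_mcmc(n_jumps=200):
    """Continuous-time birth-death MCMC over graphs (toy sketch).

    Each absent edge has a birth rate and each present edge a death rate,
    proportional to the posterior ratio of the neighboring graph to the
    current one. The chain holds for an exponential waiting time, then
    jumps to a move drawn with probability proportional to its rate; the
    holding times weight each visited graph.
    """
    edges = set()
    weighted_time = {}
    for _ in range(n_jumps):
        moves, rates = [], []
        for e in ALL_EDGES:
            neighbor = edges - {e} if e in edges else edges | {e}
            moves.append(e)
            rates.append(math.exp(log_score(neighbor) - log_score(edges)))
        total = sum(rates)
        hold = random.expovariate(total)  # waiting time in the current graph
        key = frozenset(edges)
        weighted_time[key] = weighted_time.get(key, 0.0) + hold
        e = random.choices(moves, weights=rates)[0]
        edges = edges - {e} if e in edges else edges | {e}
    return weighted_time

times = bd_mcmc()
best = max(times, key=times.get)
print(sorted(best))
```

The graph accumulating the most waiting time serves as a crude posterior-mode estimate; the real algorithm scores graphs with the Gaussian graphical model's marginal likelihood instead of this toy penalty.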
Operations for Learning with Graphical Models
Journal of Artificial Intelligence Research, 1994
Cited by 276 (13 self)
"... This paper is a multidisciplinary review of empirical, statistical learning from a graphical model perspective. Well-known examples of graphical models include Bayesian networks, directed graphs representing a Markov chain, and undirected networks representing a Markov field. ..."
Extended Message Passing Algorithm for Inference in Loopy Gaussian Graphical Models
2002
Cited by 25 (0 self)
"... We consider message passing for probabilistic inference in undirected Gaussian graphical models. We show that for singly connected graphs, message passing yields an algorithm that is equivalent to the application of Gaussian elimination to the solution of a particular system of equations. ..."
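The equivalence described above can be checked numerically on a singly connected graph: for a Gaussian model p(x) ∝ exp(−½xᵀAx + bᵀx) on a chain, information-form message passing recovers exactly the means that solving Ax = b gives. A minimal sketch, assuming the standard information-form (precision, potential) message updates; the variable names are illustrative.

```python
import numpy as np

# Singly connected (chain) Gaussian model: p(x) ∝ exp(-x'Ax/2 + b'x),
# so the vector of marginal means solves A x = b.
n = 5
A = (np.diag(np.full(n, 2.0))
     + np.diag(np.full(n - 1, -0.8), 1)
     + np.diag(np.full(n - 1, -0.8), -1))
b = np.arange(1.0, n + 1)

edges = [(i, i + 1) for i in range(n - 1)]
neighbors = {i: [] for i in range(n)}
for i, j in edges:
    neighbors[i].append(j)
    neighbors[j].append(i)

# Messages in information form: alpha = precision part, beta = potential part.
alpha, beta = {}, {}
for i, j in edges:
    alpha[i, j] = alpha[j, i] = 0.0
    beta[i, j] = beta[j, i] = 0.0

for _ in range(n):  # n sweeps exceed the chain's diameter, so messages converge
    for i, j in list(alpha):
        # Collect node i's local evidence plus all incoming messages except j's,
        # then marginalize x_i out of the pairwise factor exp(-A_ij x_i x_j).
        h = A[i, i] + sum(alpha[k, i] for k in neighbors[i] if k != j)
        g = b[i] + sum(beta[k, i] for k in neighbors[i] if k != j)
        alpha[i, j] = -A[i, j] ** 2 / h
        beta[i, j] = -A[i, j] * g / h

P = np.array([A[i, i] + sum(alpha[k, i] for k in neighbors[i]) for i in range(n)])
mu = np.array([(b[i] + sum(beta[k, i] for k in neighbors[i])) / P[i] for i in range(n)])

print(np.allclose(mu, np.linalg.solve(A, b)))  # exact on trees
```

Eliminating x_i from the joint is exactly one step of Gaussian elimination on A, which is why the converged messages reproduce the direct solve on tree-structured graphs.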
Variational algorithms for approximate Bayesian inference
2003
Cited by 440 (9 self)
"... The Bayesian framework for machine learning allows for the incorporation of prior knowledge in a coherent way, avoids overfitting problems, and provides a principled basis for selecting between alternative models. Unfortunately the computations required are usually intractable. This thesis presents a unified variational Bayesian (VB) framework which approximates these computations in models with latent variables using a lower bound on the marginal likelihood. Chapter 1 presents background material on Bayesian inference, graphical models, and propagation algorithms. ..."
Model Selection Through Sparse Maximum Likelihood Estimation for Multivariate Gaussian or Binary Data
Journal of Machine Learning Research, 2008
Cited by 334 (2 self)
"... We consider the problem of estimating the parameters of a Gaussian or binary distribution in such a way that the resulting undirected graphical model is sparse. Our approach is to solve a maximum likelihood problem with an added ℓ1-norm penalty term. The problem as formulated is convex ..."
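The ℓ1-penalized Gaussian maximum likelihood problem this entry describes is widely implemented; one readily available version is scikit-learn's `GraphicalLasso` (not the authors' own code). A minimal sketch, assuming a chain-structured ground-truth precision matrix as synthetic data; the threshold and `alpha` value are illustrative choices.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# Sample from a sparse 4-variable Gaussian whose precision matrix is a chain:
# nonzero off-diagonals only on (0,1), (1,2), (2,3).
prec = np.array([[ 2.0, -1.0,  0.0,  0.0],
                 [-1.0,  2.0, -1.0,  0.0],
                 [ 0.0, -1.0,  2.0, -1.0],
                 [ 0.0,  0.0, -1.0,  2.0]])
cov = np.linalg.inv(prec)
X = rng.multivariate_normal(np.zeros(4), cov, size=2000)

# l1-penalized maximum likelihood for the precision matrix (graphical lasso);
# alpha controls the sparsity of the estimated undirected graph.
model = GraphicalLasso(alpha=0.05).fit(X)
est = model.precision_
edges = [(i, j) for i in range(4) for j in range(i + 1, 4)
         if abs(est[i, j]) > 1e-3]
print(edges)
```

Zeros in the estimated precision matrix correspond to absent edges in the undirected graph, which is what makes the penalized estimate a model-selection procedure rather than just a regularized fit.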
Correctness of belief propagation in Gaussian graphical models of arbitrary topology
Neural Computation, 1999
Cited by 296 (7 self)
"... Local "belief propagation" rules of the sort proposed by Pearl [12] are guaranteed to converge to the correct posterior probabilities in singly connected graphical models. Recently, a number of researchers have empirically demonstrated good performance of "loopy belief propagation" ..."
Model selection and estimation in the Gaussian graphical model
Biometrika (2007), pp. 1–17
A Variational Bayesian Framework for Graphical Models
In Advances in Neural Information Processing Systems 12, 2000
Cited by 267 (7 self)
"... This paper presents a novel practical framework for Bayesian model averaging and model selection in probabilistic graphical models. Our approach approximates full posterior distributions over model parameters and structures, as well as latent variables, in an analytical manner. ..."
Model Selection in Undirected Graphical Models with the Elastic Net
2010
"... Structure learning in random fields has attracted considerable attention due to its difficulty and importance in areas such as remote sensing, computational biology, natural language processing, protein networks, and social network analysis. We consider the problem of estimating the probabilistic graph structure associated with a Gaussian Markov Random Field (GMRF), the Ising model, and the Potts model, by extending previous work on ℓ1-regularized neighborhood estimation to include the elastic net ℓ1 + ℓ2 penalty. Additionally, we show numerical evidence that the edge density plays a role ..."
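The neighborhood-estimation idea this entry extends can be sketched with off-the-shelf tools: regress each variable on all the others under an ℓ1 + ℓ2 penalty and keep an edge wherever a regression coefficient is nonzero. This uses scikit-learn's `ElasticNet`, not the paper's code, and the penalty settings and threshold are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(1)

# Data from a 4-variable Gaussian whose conditional-independence graph is a chain.
prec = np.array([[ 2.0, -1.0,  0.0,  0.0],
                 [-1.0,  2.0, -1.0,  0.0],
                 [ 0.0, -1.0,  2.0, -1.0],
                 [ 0.0,  0.0, -1.0,  2.0]])
X = rng.multivariate_normal(np.zeros(4), np.linalg.inv(prec), size=3000)

# Meinshausen-Buhlmann-style neighborhood estimation with an elastic net
# (l1 + l2) penalty: regress each variable on the rest and keep an edge
# wherever either of the two regressions gives a nonzero coefficient.
edges = set()
for j in range(4):
    others = [k for k in range(4) if k != j]
    fit = ElasticNet(alpha=0.05, l1_ratio=0.9).fit(X[:, others], X[:, j])
    for k, coef in zip(others, fit.coef_):
        if abs(coef) > 1e-3:
            edges.add((min(j, k), max(j, k)))
print(sorted(edges))
```

Combining the two per-node estimates with a union ("or" rule) is one common convention; an intersection ("and" rule) gives a sparser, more conservative graph.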
Modeling changing dependency structure in multivariate time series
In International Conference on Machine Learning, 2007
Cited by 47 (0 self)
"... We show how to apply the efficient Bayesian changepoint detection techniques of Fearnhead in the multivariate setting. We model the joint density of vector-valued observations using undirected Gaussian graphical models, whose structure we estimate. We show how we can exactly compute the MAP segmentation ..."