Results 1–10 of 11
Sparse graphical models for exploring gene expression data
Journal of Multivariate Analysis, 2004
Cited by 133 (22 self)
Abstract: "... DMS-0112069. Any opinions, findings, and conclusions or recommendations expressed in this material are ..."

Decomposable Graphical Gaussian Model Determination
1999
Cited by 67 (11 self)
Abstract: We propose a methodology for Bayesian model determination in decomposable graphical Gaussian models. To achieve this aim we consider a hyper inverse Wishart prior distribution on the concentration matrix for each given graph. To ensure compatibility across models, such prior distributions are obtained by marginalisation from the prior conditional on the complete graph. We explore alternative structures for the hyperparameters of the latter, and their consequences for the model. Model determination is carried out by implementing a reversible jump MCMC sampler. In particular, the dimension-changing move we propose involves adding or dropping an edge from the graph. We characterise the set of moves which preserve the decomposability of the graph, giving a fast algorithm for maintaining the junction tree representation of the graph at each sweep. As state variable, we propose to use the incomplete variance-covariance matrix, containing only the elements for which the correspondi...

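The edge add/drop move described in this abstract must keep the graph decomposable, which for graphical Gaussian models means chordal. A minimal sketch of the standard chordality test via maximum cardinality search (a plain O(n^3) illustration, not the paper's fast junction-tree update algorithm):

```python
from itertools import combinations

def is_decomposable(nodes, edges):
    """Return True iff the undirected graph is chordal (= decomposable).

    Uses maximum cardinality search (MCS) plus a perfect elimination
    ordering check (Tarjan & Yannakakis, 1984)."""
    adj = {v: set() for v in nodes}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    # MCS: repeatedly number the vertex with the most already-numbered
    # neighbours (ties broken by input order).
    order, numbered = [], set()
    while len(order) < len(nodes):
        v = max((u for u in nodes if u not in numbered),
                key=lambda u: len(adj[u] & numbered))
        order.append(v)
        numbered.add(v)
    # The reverse of the MCS order is a perfect elimination ordering iff
    # the graph is chordal: each vertex's later neighbours must form a clique.
    pos = {v: i for i, v in enumerate(reversed(order))}
    for v in nodes:
        later = [u for u in adj[v] if pos[u] > pos[v]]
        if any(y not in adj[x] for x, y in combinations(later, 2)):
            return False
    return True
```

A proposed move can then be vetted by toggling the edge and re-testing decomposability before computing the acceptance ratio; a 4-cycle fails the test, while the same cycle with one chord passes.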
Objective Bayesian model selection in Gaussian graphical models
2007
Cited by 14 (3 self)
Abstract: This paper presents a default model-selection procedure for Gaussian graphical models that involves two new developments. First, we develop a default version of the hyper-inverse Wishart prior for restricted covariance matrices, called the hyper-inverse Wishart g-prior, and show how it corresponds to the implied fractional prior for covariance selection using fractional Bayes factors. Second, we apply a class of priors that automatically handles the problem of multiple hypothesis testing implied by covariance selection. We demonstrate our methods on a variety of simulated examples, concluding with a real example analysing covariation in mutual-fund returns. These studies reveal that the combined use of a multiplicity-correction prior on graphs and fractional Bayes factors for computing marginal likelihoods yields better performance than existing Bayesian methods. Some key words: covariance selection; hyper-inverse Wishart distribution; fractional Bayes factors; Bayesian model selection; multiple hypothesis testing.

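Multiplicity-correcting priors over graphs are often built from an exchangeable beta-binomial distribution on edge inclusion; the sketch below illustrates that construction under that assumption (it is not necessarily the exact prior used in this paper):

```python
from math import lgamma, exp, comb

def log_beta(a, b):
    # log B(a, b) via log-gamma, numerically stable
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def log_graph_prior(k, m, a=1.0, b=1.0):
    """Log prior mass of ONE particular graph that has k of the m possible
    edges, under an exchangeable beta-binomial edge-inclusion prior.
    With a = b = 1 every edge COUNT is equally likely, which gives an
    automatic multiplicity correction: a graph is penalised by how many
    rival graphs share its size."""
    return log_beta(a + k, b + m - k) - log_beta(a, b)

# Sanity check: the prior sums to one over all 2^m graphs.
m = 5
total = sum(comb(m, k) * exp(log_graph_prior(k, m)) for k in range(m + 1))
```

With a = b = 1 a particular graph with k edges receives mass 1 / ((m + 1) · C(m, k)), so dense and sparse graphs are automatically down-weighted relative to how many competitors they have.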
Bayesian inference for nondecomposable graphical Gaussian models
Sankhya, Ser. A, 2003
Cited by 14 (0 self)
Abstract: In this paper we propose a method to calculate the posterior probability of a nondecomposable graphical Gaussian model. Our proposal is based on a new device to sample from Wishart distributions, conditional on the graphical constraints. As a result, our methodology allows Bayesian model selection within the whole class of graphical Gaussian models, including nondecomposable ones.

1 INTRODUCTION
Let G be a conditional independence graph, describing the association structure of a vector of random variables, say X. A graphical model is a family of probability distributions P_G which is Markov over G. In particular, when all the random variables in X are continuous, a graphical Gaussian model is obtained by assuming P_G = N(μ, Σ_G), with Σ_G positive definite and such that P_G is Markov over G. For an introduction to graphical models, see for instance Lauritzen (1996) or Whittaker (1990). Typically the association structure of X is uncertain and, thus, has to be inferred from...

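The graphical constraints in question are zero constraints on the concentration (inverse covariance) matrix. A small self-contained illustration of why the graph lives in Σ_G⁻¹ rather than in Σ_G itself, assuming the chain graph X1 – X2 – X3:

```python
def inv3(M):
    """Inverse of a 3x3 matrix via the cofactor/adjugate formula."""
    (a, b, c), (d, e, f), (g, h, i) = M
    det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
    adj = [[e*i - f*h, c*h - b*i, b*f - c*e],
           [f*g - d*i, a*i - c*g, c*d - a*f],
           [d*h - e*g, b*g - a*h, a*e - b*d]]
    return [[x / det for x in row] for row in adj]

# Chain graph X1 - X2 - X3: the missing edge (X1, X3) appears as the
# zero at K[0][2] in the concentration matrix K = Sigma^{-1} ...
K = [[2, -1, 0],
     [-1, 2, -1],
     [0, -1, 2]]
Sigma = inv3(K)
# ... while the covariance Sigma itself is dense: X1 and X3 are
# marginally correlated, just conditionally independent given X2.
```

Sampling Σ subject to the graph therefore means sampling from a Wishart-type distribution conditioned on these off-diagonal zeros of the concentration matrix, which is the device the abstract refers to.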
Bayesian analysis of matrix normal graphical models
Biometrika, 2009
Cited by 6 (3 self)
Abstract: We develop Bayesian analysis of matrix-variate normal data with conditional independence graphical structuring of the characterising variance matrix parameters. This leads to fully Bayesian analysis of matrix normal graphical models, including discussion of novel prior specifications, the resulting problems of posterior computation addressed using Markov chain Monte Carlo methods, and graphical model assessment that involves approximate evaluation of marginal likelihood functions under specified graphical models. Modelling and inference for spatial/image data via a novel class of Markov random fields that arise as natural examples of matrix normal graphical models is discussed. This is complemented by the development of a broad class of dynamic models for matrix-variate time series within which stochastic elements defining time series errors and structural changes over time are subject to graphical model structuring. Three examples illustrate these developments and highlight questions of graphical model uncertainty and comparison in matrix data contexts.

Bayesian covariance matrix estimation using a mixture of decomposable graphical models
Unpublished manuscript, 2005
Cited by 6 (2 self)
Abstract: Estimating a covariance matrix efficiently and discovering its structure are important statistical problems with applications in many fields. This article takes a Bayesian approach to estimate the covariance matrix of Gaussian data. We use ideas from Gaussian graphical models and model selection to construct a prior for the covariance matrix that is a mixture over all decomposable graphs, where a graph means the configuration of nonzero off-diagonal elements in the inverse of the covariance matrix. Our prior for the covariance matrix is such that the probability of each graph size is specified by the user and graphs of equal size are assigned equal probability. Most previous approaches assume that all graphs are equally probable. We give empirical results that show the prior that assigns equal probability over graph sizes outperforms the prior that assigns equal probability over all graphs, both in identifying the correct decomposable graph and in more efficiently estimating the covariance matrix. The advantage is greatest when the number of observations is small relative to the dimension of the covariance matrix. Our method requires the number of decomposable graphs for each graph size. We show how to estimate these numbers using simulation and that the simulation results agree with analytic results when such results are known. We also show how...

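The prior above requires the number of decomposable graphs of each size, which the paper estimates by simulation. For tiny dimensions these counts can be brute-forced exactly; a sketch for 4 labelled vertices, where the only possible chordless cycle is a 4-cycle with both diagonals absent:

```python
from itertools import combinations

PAIRS = list(combinations(range(4), 2))              # the 6 possible edges
CYCLES = [(0, 1, 2, 3), (0, 1, 3, 2), (0, 2, 1, 3)]  # the 3 distinct 4-cycles

def chordal4(edges):
    """Chordality specialised to 4 vertices: fail only on a 4-cycle
    whose two diagonals are both missing."""
    e = {tuple(sorted(p)) for p in edges}
    def has(a, b):
        return tuple(sorted((a, b))) in e
    for a, b, c, d in CYCLES:
        if (has(a, b) and has(b, c) and has(c, d) and has(d, a)
                and not has(a, c) and not has(b, d)):
            return False
    return True

# Count decomposable (chordal) graphs on 4 labelled vertices by edge count.
counts = {k: 0 for k in range(7)}
for k in range(7):
    for edges in combinations(PAIRS, k):
        if chordal4(edges):
            counts[k] += 1
```

Of the 64 labelled graphs on 4 vertices, only the 3 pure 4-cycles are non-decomposable, so the size-4 count drops from 15 to 12 and the total is 61; for larger dimensions exhaustive enumeration is infeasible, which is why the paper resorts to simulation.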
Bayesian covariance selection
ISDS Discussion Paper, 2004
Cited by 4 (1 self)
Abstract: We present a novel structural learning method called HdBCS that performs covariance selection in a Bayesian framework for datasets with tens of thousands of variables. HdBCS is based on the intrinsic connection between graphical models on undirected graphs and graphical models on directed acyclic graphs (Bayesian networks). We show how to produce and explore the corresponding association networks by Bayesian model averaging across the models identified. We illustrate the use of HdBCS with an example from a large-scale gene expression study of breast cancer.

Handling Manipulated Evidence
2004
Cited by 1 (1 self)
Abstract: Bayesian networks have been advocated as a useful tool to describe the relations of dependence/independence among random variables and relevant hypotheses in a crime case. Moreover, they have been applied to help the investigator structure the problem and weigh the observed evidence, typically with respect to the hypothesis of guilt of a suspect. In this paper we describe a model to handle the possibility that one or more pieces of evidence have been manipulated in order to mislead the investigation. This method is based on causal inference models, although it is developed in a different, specific framework.

Experiments in Stochastic Computation for High-Dimensional Graphical Models
© Institute of Mathematical Statistics, 2005
Abstract: We discuss the implementation, development and performance of methods of stochastic computation in Gaussian graphical models. We view these methods from the perspective of high-dimensional model search, with a particular interest in the scalability with dimension of Markov chain Monte Carlo (MCMC) and other stochastic search methods. After reviewing the structure and context of undirected Gaussian graphical models and model uncertainty (covariance selection), we discuss prior specifications, including new priors over models, and then explore a number of examples using various methods of stochastic computation. Traditional MCMC methods are the point of departure for this experimentation; we then develop alternative stochastic search ideas and contrast this new approach with MCMC. Our examples range from low (12–20) to moderate (150) dimension, and combine simple synthetic examples with data analysis from gene expression studies. We conclude with comments about the need and potential for new computational methods in far higher dimensions, including constructive approaches...
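As a caricature of the MCMC-over-models machinery these entries compare, the following Metropolis-Hastings sketch searches over graphs encoded as sets of edge indices; `log_score` is a placeholder for log marginal likelihood plus log prior (which in the real methods is the expensive part), not any paper's actual scoring function:

```python
import math
import random

def mcmc_graph_search(m, log_score, steps=500, seed=1):
    """Toy Metropolis-Hastings search over graphs encoded as frozensets
    of edge indices 0..m-1.  The proposal toggles one uniformly chosen
    edge; since it is symmetric, the acceptance ratio is just the
    exponentiated score difference."""
    rng = random.Random(seed)
    g = frozenset()
    s = log_score(g)
    best, best_s = g, s
    for _ in range(steps):
        g_new = g ^ {rng.randrange(m)}   # toggle one random edge
        s_new = log_score(g_new)
        if math.log(rng.random()) < s_new - s:
            g, s = g_new, s_new
            if s > best_s:
                best, best_s = g, s
    return best, best_s
```

For illustration, scoring a graph by the negative distance to a fixed target edge set makes the chain walk toward that target; the scalability questions the abstract raises arise because real scores require evaluating marginal likelihoods under hyper-inverse Wishart priors at every step.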