Results 1-10 of 30
Objective Bayesian model selection in Gaussian graphical models, 2007
Abstract

Cited by 36 (4 self)
This paper presents a default model-selection procedure for Gaussian graphical models that involves two new developments. First, we develop a default version of the hyper-inverse Wishart prior for restricted covariance matrices, called the hyper-inverse Wishart g-prior, and show how it corresponds to the implied fractional prior for covariance selection using fractional Bayes factors. Second, we apply a class of priors that automatically handles the problem of multiple hypothesis testing implied by covariance selection. We demonstrate our methods on a variety of simulated examples, concluding with a real example analysing covariation in mutual-fund returns. These studies reveal that the combined use of a multiplicity-correction prior on graphs and fractional Bayes factors for computing marginal likelihoods yields better performance than existing Bayesian methods. Some key words: covariance selection; hyper-inverse Wishart distribution; fractional Bayes factors; Bayesian model selection; multiple hypothesis testing.
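The fractional Bayes factor machinery referenced in this abstract can be illustrated in a much simpler setting than covariance selection. The sketch below is a toy example of O'Hagan's generic construction, not the paper's procedure: it tests a normal mean under an improper flat prior, with an assumed training fraction b = 1/n and quadrature for the integrals.

```python
import numpy as np
from scipy.integrate import quad

rng = np.random.default_rng(0)
x = rng.normal(0.5, 1.0, size=20)      # simulated data, illustrative only
n = len(x)
b = 1.0 / n                            # minimal training fraction
xbar = x.mean()

def lik(theta, power=1.0):
    # product of N(x_i | theta, 1) densities, raised to `power`
    return np.exp(power * (-0.5 * np.sum((x - theta) ** 2)
                           - 0.5 * n * np.log(2.0 * np.pi)))

# M1: x_i ~ N(theta, 1), improper flat prior on theta.  The fractional
# marginal likelihood is the full-data integral divided by the integral
# of the b-th power of the likelihood.
m_full, _ = quad(lik, xbar - 2, xbar + 2, epsabs=0)
m_frac, _ = quad(lambda t: lik(t, b), xbar - 8, xbar + 8, epsabs=0)
q1 = m_full / m_frac

# M0: x_i ~ N(0, 1) has no free parameter, so no integration is needed
q0 = lik(0.0) / lik(0.0, b)

fbf_10 = q1 / q0                       # fractional Bayes factor, M1 vs M0
```

Because the flat prior's arbitrary constant cancels in the ratio defining q1, the fractional Bayes factor is well defined even under an improper prior; that is the property the paper exploits, with the hyper-inverse Wishart g-prior in the role of the implied fractional prior.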
Dynamic matrix-variate graphical models. Bayesian Anal., 2007
Abstract

Cited by 29 (3 self)
This paper introduces a novel class of Bayesian models for multivariate time series analysis based on a synthesis of dynamic linear models and graphical models. The models are then applied in the context of financial time series for predictive portfolio analysis, providing a significant improvement in the performance of optimal investment decisions.
Bayesian analysis of matrix normal graphical models. Biometrika, 2009
Abstract

Cited by 12 (4 self)
We develop Bayesian analysis of matrix-variate normal data with conditional independence graphical structuring of the characterising variance matrix parameters. This leads to fully Bayesian analysis of matrix normal graphical models, including discussion of novel prior specifications, the resulting problems of posterior computation addressed using Markov chain Monte Carlo methods, and graphical model assessment that involves approximate evaluation of marginal likelihood functions under specified graphical models. Modelling and inference for spatial/image data via a novel class of Markov random fields that arise as natural examples of matrix normal graphical models are discussed. This is complemented by the development of a broad class of dynamic models for matrix-variate time series within which stochastic elements defining time series errors and structural changes over time are subject to graphical model structuring. Three examples illustrate these developments and highlight questions of graphical model uncertainty and comparison in matrix data contexts.
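The matrix-variate normal distribution underlying these models has a simple sampling identity: X ~ MN(M, U, V) can be drawn as X = M + A Z B', where U = A A' is the row covariance, V = B B' the column covariance, and Z has i.i.d. standard normal entries (equivalently, vec(X) has covariance V ⊗ U). The sketch below illustrates that generic identity with made-up covariance matrices; it does not reproduce the paper's graphically structured analysis.

```python
import numpy as np

rng = np.random.default_rng(1)
p, q, N = 4, 3, 20_000

# illustrative row and column covariance matrices (not from the paper)
U = 0.5 * np.eye(p) + 0.5              # row covariance: equicorrelated
V = np.diag([1.0, 2.0, 0.5])           # column covariance: diagonal

A = np.linalg.cholesky(U)              # U = A A'
B = np.linalg.cholesky(V)              # V = B B'

# draw X ~ MN(0, U, V) via X = A Z B' with Z having iid N(0, 1) entries
Z = rng.standard_normal((N, p, q))
draws = A @ Z @ B.T                    # matmul broadcasts over the first axis

# one implied marginal to sanity-check: Var(X[0, 1]) = U[0, 0] * V[1, 1]
est = draws[:, 0, 1].var()
```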
Sparse covariance estimation in heterogeneous samples, 2011
Abstract

Cited by 9 (2 self)
Standard Gaussian graphical models implicitly assume that the conditional independence among variables is common to all observations in the sample. However, in practice, observations are usually collected from heterogeneous populations where such an assumption is not satisfied, leading in turn to nonlinear relationships among variables. To address such situations we explore mixtures of Gaussian graphical models; in particular, we consider both infinite mixtures and infinite hidden Markov models where the emission distributions correspond to Gaussian graphical models. Such models allow us to divide a heterogeneous population into homogeneous groups, with each cluster having its own conditional independence structure. As an illustration, we study the trends in foreign exchange rate fluctuations in the pre-Euro era.
Geometric Representations of Hypergraphs for Prior Specification and Posterior Sampling, 2009
Abstract

Cited by 7 (5 self)
A parametrization of hypergraphs based on the geometry of points in R^d is developed. Informative prior distributions on hypergraphs are induced through this parametrization by priors on point configurations via spatial processes. This prior specification is used to infer conditional independence models or Markov structure of multivariate distributions. Specifically, we can recover both the junction tree factorization as well as the hyper Markov law. This approach offers greater control on the distribution of graph features than Erdős-Rényi random graphs, supports inference of factorizations that cannot be retrieved by a graph alone, and leads to new Metropolis-Hastings Markov chain Monte Carlo algorithms with both local and global moves in graph space. We illustrate the utility of this parametrization and prior specification using simulations.
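The flavour of a geometric parametrization, reduced from hypergraphs to ordinary graphs for brevity, can be seen in a random geometric graph: a point configuration in R^d induces an edge between every pair of points within a threshold distance. The paper's construction is richer than this; the threshold rule, dimensions, and constants below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, r = 30, 2, 0.3                   # illustrative count, dimension, radius

pts = rng.random((n, d))               # a point configuration in [0, 1]^d
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
adj = (dist < r) & ~np.eye(n, dtype=bool)   # edge iff points lie within r

# a "local move" in this parametrization relocates one point, which can
# only change edges incident to that vertex -- the basis for MCMC moves
# in point space rather than edge space
prop = pts.copy()
prop[0] = rng.random(d)
dist0 = np.linalg.norm(prop - prop[0], axis=-1)
adj_prop = adj.copy()
adj_prop[0, :] = adj_prop[:, 0] = dist0 < r
adj_prop[0, 0] = False
```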
Bayesian structural learning and estimation in Gaussian graphical models
Abstract

Cited by 6 (2 self)
We propose a new stochastic search algorithm for Gaussian graphical models called the mode oriented stochastic search. Our algorithm relies on the existence of a method to accurately and efficiently approximate the marginal likelihood associated with a graphical model when it cannot be computed in closed form. To this end, we develop a new Laplace approximation method for the normalizing constant of a G-Wishart distribution. We show that combining the mode oriented stochastic search with our marginal likelihood estimation method leads to excellent results compared with other techniques discussed in the literature. We also describe how to perform inference through Bayesian model averaging based on the reduced set of graphical models identified. Finally, we give a novel stochastic search technique for multivariate regression models.
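A Laplace approximation of the kind mentioned replaces an intractable normalizing integral with the Gaussian integral obtained from a second-order expansion of the log-integrand about its mode. The one-dimensional toy below illustrates the generic technique (not the authors' G-Wishart approximation) and compares it with quadrature; the integrand is an arbitrary illustrative choice whose mode and curvature are known analytically.

```python
import numpy as np
from scipy.integrate import quad

n = 100
g = lambda t: t ** 2 / 2 + t ** 4 / 4      # toy negative log-density, mode at 0

# "exact" normalizing constant Z = ∫ exp(-n g(t)) dt by quadrature
Z, _ = quad(lambda t: np.exp(-n * g(t)), -1, 1)

# Laplace approximation: exp(-n g(t_hat)) * sqrt(2*pi / (n * g''(t_hat)))
t_hat, g2 = 0.0, 1.0                       # mode and g''(mode), known here
Z_lap = np.exp(-n * g(t_hat)) * np.sqrt(2 * np.pi / (n * g2))

rel_err = abs(Z - Z_lap) / Z               # O(1/n) for smooth g
```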
POSTERIOR CONVERGENCE RATES FOR ESTIMATING LARGE PRECISION MATRICES USING GRAPHICAL MODELS. Submitted to the Annals of Statistics
Abstract

Cited by 3 (2 self)
We consider Bayesian estimation of a p×p precision matrix, where p can be much larger than the available sample size n. It is well known that consistent estimation in such an ultra-high-dimensional situation requires regularization such as banding, tapering or thresholding. We consider a banding structure in the model and induce a prior distribution on a banded precision matrix through a Gaussian graphical model, where an edge is present only when two vertices are within a given distance. We show that under a very mild growth condition and a proper choice of the order of the graph, the posterior distribution based on the graphical model is consistent in the L∞-operator norm uniformly over a class of precision matrices, even if the true precision matrix may not have a banded structure. Along the way to the proof, we also establish that the maximum likelihood estimator (MLE) is consistent under the same set of conditions, which is of independent interest. We also conduct a simulation study to compare the finite-sample performance of the Bayes estimator and the MLE based on the graphical model with that obtained by using a banding operation on the sample covariance matrix. We observe that the graphical-model-based estimators perform significantly better, especially when the banded sample covariance matrix is not positive definite; in contrast, the graphical-model-based estimators are always positive definite. Finally, we discuss a practical method of choosing the order of the graphical model using the marginal likelihood function.
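The banding operator compared against here is easy to state: B_k(S) zeroes every entry more than k positions off the diagonal. The sketch below (illustrative dimensions and an assumed AR(1)-type truth, not the paper's simulation design) applies it to a rank-deficient sample covariance with n < p; as the abstract notes, the banded matrix need not be positive definite, which can be checked from its smallest eigenvalue.

```python
import numpy as np

rng = np.random.default_rng(3)
p, n, k = 30, 10, 3                    # p >> n; band width k (illustrative)

# assumed truth: AR(1)-type covariance with entries rho^|i-j|
rho = 0.6
idx = np.arange(p)
Sigma = rho ** np.abs(np.subtract.outer(idx, idx))
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
S = np.cov(X, rowvar=False)            # rank at most n - 1, so singular

# banding operator B_k: keep only entries within k of the diagonal
mask = np.abs(np.subtract.outer(idx, idx)) <= k
S_band = np.where(mask, S, 0.0)

# B_k(S) is symmetric but may be indefinite -- inspect the spectrum
min_eig = np.linalg.eigvalsh(S_band).min()
```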
EXACT FORMULAS FOR THE NORMALIZING CONSTANTS OF WISHART DISTRIBUTIONS FOR GRAPHICAL MODELS
Abstract

Cited by 1 (0 self)
Gaussian graphical models have received considerable attention during the past four decades from the statistical and machine learning communities. In Bayesian treatments of this model, the G-Wishart distribution serves as the conjugate prior for inverse covariance matrices satisfying graphical constraints. While it is straightforward to posit the unnormalized densities, the normalizing constants of these distributions have been known only for graphs that are chordal, or decomposable. Up until now, it was unknown whether the normalizing constant for a general graph could be represented explicitly, and a considerable body of computational literature emerged that attempted to avoid this apparent intractability. We close this question by providing an explicit representation of the G-Wishart normalizing constant for general graphs.
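Positing the unnormalized density is indeed straightforward; the historically missing piece was its integral over the cone of graph-constrained positive definite matrices. A minimal sketch of the unnormalized log density of W_G(δ, D), proportional to |K|^((δ−2)/2) exp(−tr(DK)/2), evaluated on the 4-cycle (the smallest non-chordal graph) with illustrative parameter values:

```python
import numpy as np

def log_gwishart_unnorm(K, delta, D, edges, tol=1e-10):
    """Unnormalized log density of W_G(delta, D) at K:
    (delta - 2)/2 * log|K| - tr(D K)/2, restricted to positive definite K
    with K[i, j] = 0 whenever {i, j} is not an edge of G."""
    p = K.shape[0]
    for i in range(p):
        for j in range(i + 1, p):
            if (i, j) not in edges and abs(K[i, j]) > tol:
                return -np.inf         # violates the graphical constraint
    sign, logdet = np.linalg.slogdet(K)
    if sign <= 0:
        return -np.inf                 # not positive definite
    return 0.5 * (delta - 2.0) * logdet - 0.5 * np.trace(D @ K)

# 4-cycle: the smallest non-chordal graph, for which no closed-form
# normalizing constant was previously available
edges = {(0, 1), (1, 2), (2, 3), (0, 3)}
K = np.eye(4)
for i, j in edges:
    K[i, j] = K[j, i] = 0.3            # illustrative precision entries
val = log_gwishart_unnorm(K, delta=3.0, D=np.eye(4), edges=edges)
```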