Results 1–10 of 19
Multiple testing and error control in Gaussian graphical model selection
 Statistical Science
Cited by 30 (4 self)
Abstract. Graphical models provide a framework for exploration of multivariate dependence patterns. The connection between graph and statistical model is made by identifying the vertices of the graph with the observed variables and translating the pattern of edges in the graph into a pattern of conditional independences that is imposed on the variables' joint distribution. Focusing on Gaussian models, we review classical graphical models. For these models the defining conditional independences are equivalent to vanishing of certain (partial) correlation coefficients associated with individual edges that are absent from the graph. Hence, Gaussian graphical model selection can be performed by multiple testing of hypotheses about vanishing (partial) correlation coefficients. We show and exemplify how this approach allows one to perform model selection while controlling error rates for incorrect edge inclusion. Key words and phrases: Acyclic directed graph, Bayesian network, bidirected graph, chain graph, concentration graph, covariance graph, DAG, graphical model, multiple testing, undirected graph.
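The multiple-testing approach described in this abstract can be sketched numerically: estimate partial correlations from the inverse sample covariance, map each through Fisher's z-transform to get a per-edge p-value, and keep the edges that survive a multiplicity correction. This is a minimal illustration under assumed choices (a three-variable chain model, a Bonferroni correction, and a 0.05 level), not the paper's own procedure.

```python
# A minimal sketch (not the paper's implementation): select edges of a
# Gaussian graphical model by multiple testing of partial correlations.
# The chain model, sample size, and Bonferroni level are assumptions.
import numpy as np
from math import atanh, erf, sqrt

def partial_correlations(X):
    """Partial correlations from the inverse sample covariance matrix."""
    K = np.linalg.inv(np.cov(X, rowvar=False))
    d = np.sqrt(np.diag(K))
    P = -K / np.outer(d, d)
    np.fill_diagonal(P, 1.0)
    return P

def edge_pvalues(X):
    """Two-sided p-values for H0: rho_ij|rest = 0 via Fisher's z-transform;
    under H0, atanh(r) is roughly N(0, 1/(n - (p - 2) - 3))."""
    n, p = X.shape
    P = partial_correlations(X)
    se = 1.0 / sqrt(n - (p - 2) - 3)
    pv = {}
    for i in range(p):
        for j in range(i + 1, p):
            z = abs(atanh(P[i, j])) / se
            pv[(i, j)] = 2.0 * (1.0 - 0.5 * (1.0 + erf(z / sqrt(2.0))))
    return pv

rng = np.random.default_rng(0)
# Chain graph 0 - 1 - 2: tridiagonal precision matrix, so edge (0, 2) is absent.
K_true = np.array([[2.0, 1.0, 0.0], [1.0, 2.0, 1.0], [0.0, 1.0, 2.0]])
X = rng.multivariate_normal(np.zeros(3), np.linalg.inv(K_true), size=500)
pv = edge_pvalues(X)
edges = [e for e, p_ in sorted(pv.items()) if p_ < 0.05 / len(pv)]  # Bonferroni
print(edges)
```

With 500 observations the two true chain edges are detected comfortably; the Bonferroni cut controls the rate of incorrect edge inclusion, in the spirit of the abstract.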
Bayesian covariance matrix estimation using a mixture of decomposable graphical models. Unpublished manuscript, 2005
Cited by 12 (2 self)
Summary. Estimating a covariance matrix efficiently and discovering its structure are important statistical problems with applications in many fields. This article takes a Bayesian approach to estimate the covariance matrix of Gaussian data. We use ideas from Gaussian graphical models and model selection to construct a prior for the covariance matrix that is a mixture over all decomposable graphs, where a graph means the configuration of nonzero off-diagonal elements in the inverse of the covariance matrix. Our prior for the covariance matrix is such that the probability of each graph size is specified by the user and graphs of equal size are assigned equal probability. Most previous approaches assume that all graphs are equally probable. We give empirical results that show the prior that assigns equal probability over graph sizes outperforms the prior that assigns equal probability over all graphs, both in identifying the correct decomposable graph and in more efficiently estimating the covariance matrix. The advantage is greatest when the number of observations is small relative to the dimension of the covariance matrix. Our method requires the number of decomposable graphs for each graph size. We show how to estimate these numbers using simulation and that the simulation results agree with analytic results when such results are known. We also show how …
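The size-based prior described above is easy to state concretely: the user assigns a probability to each graph size (edge count), and that mass is split equally among the decomposable graphs of that size. A minimal sketch follows, using an illustrative three-vertex example where the per-size counts are known exactly (all 8 graphs on 3 vertices are decomposable, with 1, 3, 3, 1 graphs of sizes 0–3); in general the counts would come from simulation, as the paper describes.

```python
# A minimal sketch of the size-based graph prior: the user specifies a
# probability for each graph size, and graphs of equal size share that
# mass uniformly. The three-vertex counts below are exact; larger counts
# would be estimated by simulation, as the paper describes.
def graph_prior(size_probs, graphs_per_size):
    """Per-graph prior probability for each size k: P(size = k) / #graphs."""
    return {k: size_probs[k] / graphs_per_size[k] for k in size_probs}

# Uniform over sizes 0..3 on 3 vertices: 1, 3, 3, 1 decomposable graphs.
uniform_over_sizes = {k: 0.25 for k in range(4)}
per_graph = graph_prior(uniform_over_sizes, {0: 1, 1: 3, 2: 3, 3: 1})
print(per_graph)
```

Note how an individual one-edge graph receives less prior mass than the empty or complete graph, whereas a uniform-over-graphs prior would give every graph 1/8; this is exactly the contrast the abstract draws.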
Modeling uncertainty in macroeconomic growth determinants using Gaussian graphical models
2009
Cited by 7 (2 self)
Model uncertainty has become a central focus of policy discussion surrounding the determinants of economic growth. Over 140 regressors have been employed in growth empirics due to the proliferation of several new growth theories in the past two decades. Recently, Bayesian model averaging (BMA) has been employed to address model uncertainty and to provide clear policy implications by identifying robust growth determinants. The BMA approaches were, however, limited to linear regression models that abstract from possible dependencies embedded in the covariance structures of growth determinants. The recent empirical growth literature has developed jointness measures to highlight such dependencies. We address model uncertainty and covariate dependencies in a comprehensive Bayesian framework that allows for structural learning in linear regressions and Gaussian graphical models. A common prior specification across the entire framework provides consistency. Gaussian graphical models allow for a principled analysis of dependency structures, enabling a much more parsimonious set of fundamental growth determinants. Our empirics are based on a prominent growth dataset with 41 potential economic factors that has been utilized in numerous previous analyses to account for model uncertainty as well as jointness.
Bayesian structural learning and estimation in Gaussian graphical models
Cited by 6 (2 self)
We propose a new stochastic search algorithm for Gaussian graphical models called the mode oriented stochastic search. Our algorithm relies on the existence of a method to accurately and efficiently approximate the marginal likelihood associated with a graphical model when it cannot be computed in closed form. To this end, we develop a new Laplace approximation method to the normalizing constant of a G-Wishart distribution. We show that combining the mode oriented stochastic search with our marginal likelihood estimation method leads to excellent results compared with other techniques discussed in the literature. We also describe how to perform inference through Bayesian model averaging based on the reduced set of graphical models identified. Finally, we give a novel stochastic search technique for multivariate regression models.
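The paper's Laplace approximation is specific to the G-Wishart normalizing constant; the generic device it instantiates can be sketched in one dimension: expand the log-integrand h around its mode and integrate the resulting Gaussian, giving log Z ≈ h(theta_hat) + (1/2)log(2π) − (1/2)log(−h''(theta_hat)). The Newton solver and finite-difference derivatives below are illustrative choices, not the paper's construction, and the quadratic test integrand is a case where the approximation is exact.

```python
# A one-dimensional sketch of the generic Laplace approximation to a
# normalizing constant Z = integral of exp(h(theta)) d theta (the paper
# derives the analogue for the G-Wishart constant). Newton's method and
# finite differences here are illustrative choices.
import numpy as np

def laplace_log_norm_const(h, theta0, eps=1e-5, newton_steps=50):
    """log Z ~= h(mode) + 0.5*log(2*pi) - 0.5*log(-h''(mode))."""
    theta = theta0
    for _ in range(newton_steps):  # Newton iterations on h'(theta) = 0
        grad = (h(theta + eps) - h(theta - eps)) / (2 * eps)
        hess = (h(theta + eps) - 2 * h(theta) + h(theta - eps)) / eps**2
        theta -= grad / hess
    hess = (h(theta + eps) - 2 * h(theta) + h(theta - eps)) / eps**2
    return h(theta) + 0.5 * np.log(2 * np.pi) - 0.5 * np.log(-hess)

# Gaussian sanity check, where the approximation is exact: Z = sqrt(2*pi).
logZ = laplace_log_norm_const(lambda t: -0.5 * t**2, theta0=1.0)
print(logZ)
```

Because a fast, accurate marginal-likelihood surrogate like this can be evaluated for every visited graph, it is the kind of building block a stochastic search over graphs needs.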
A SINful Approach to Model Selection for Gaussian Concentration Graphs
2003
Cited by 5 (1 self)
A multivariate Gaussian graphical Markov model for an undirected graph G, also called a covariance selection model or concentration graph model, is defined in terms of the Markov properties, i.e., conditional independences associated with G, which in turn are equivalent to specified zeroes among the set of pairwise partial correlation coefficients. By means of Fisher's z-transformation and Šidák's correlation inequality, conservative simultaneous confidence intervals for the entire set of partial correlations can be obtained, leading to a simple method for model selection that controls the overall error rate for incorrect edge inclusion. The simultaneous p-values corresponding to the partial correlations are partitioned into three disjoint sets: a significant set S, an indeterminate set I, and a non-significant set N. Our SIN model selection method selects two graphs: a graph G_{S∪I} whose edges correspond to the set S ∪ I, and a more conservative graph G_S whose edges correspond to S only. Prior information about the presence and/or absence of particular edges can be incorporated readily. Similar considerations apply to covariance graph models, which are defined in terms of marginal independence rather than conditional independence.
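The S/I/N partition described above can be sketched directly: convert per-edge p-values into conservative simultaneous p-values via the Šidák adjustment (p̃ = 1 − (1 − p)^m over m edges), then split edges at two cutoffs. The cutoff values and the toy p-values below are illustrative assumptions, not the paper's choices.

```python
# A minimal sketch of the SIN partition: Sidak-adjust per-edge p-values to
# simultaneous p-values, then split edges into significant (S),
# indeterminate (I) and non-significant (N) sets. The cutoffs and toy
# p-values are illustrative assumptions.
def sin_partition(pvalues, alpha_sig=0.05, alpha_indet=0.25):
    m = len(pvalues)
    sim = {e: 1.0 - (1.0 - p) ** m for e, p in pvalues.items()}  # Sidak
    S = {e for e, p in sim.items() if p <= alpha_sig}
    N = {e for e, p in sim.items() if p > alpha_indet}
    I = set(sim) - S - N
    return S, I, N

S, I, N = sin_partition({(0, 1): 1e-6, (0, 2): 0.03, (1, 2): 0.4})
print(S, I, N)
```

Edges in S are then included in the conservative graph, while the less conservative graph adds the indeterminate edges as well, matching the two selected graphs described in the abstract.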
Accelerating Bayesian structural inference for non-decomposable Gaussian Graphical Models
Cited by 5 (0 self)
We make several contributions in accelerating approximate Bayesian structural inference for non-decomposable GGMs. Our first contribution is to show how to efficiently compute a BIC or Laplace approximation to the marginal likelihood of non-decomposable graphs using convex methods for precision matrix estimation. This optimization technique can be used as a fast scoring function inside standard Stochastic Local Search (SLS) for generating posterior samples. Our second contribution is a novel framework for efficiently generating large sets of high-quality graph topologies without performing local search. This graph proposal method, which we call “Neighborhood Fusion” (NF), samples candidate Markov blankets at each node using sparse regression techniques. Our third contribution is a hybrid method combining the complementary strengths of NF and SLS. Experimental results in structural recovery and prediction tasks demonstrate that NF and hybrid NF/SLS outperform state-of-the-art local search methods, on both synthetic and real-world datasets, when realistic computational limits are imposed.
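The "sparse regression" step behind candidate Markov blankets can be sketched in the spirit of neighborhood selection: regress each node on all the others with a Lasso penalty and take the variables with nonzero coefficients as that node's candidate blanket. The coordinate-descent solver, penalty level, and three-variable chain example below are illustrative assumptions, not the NF method itself.

```python
# A minimal neighborhood-selection sketch (illustrative, not the NF
# algorithm): Lasso-regress each node on the others; nonzero coefficients
# define its candidate Markov blanket.
import numpy as np

def lasso_cd(X, y, lam, iters=200):
    """Plain coordinate-descent Lasso (no intercept; data assumed centered)."""
    n, p = X.shape
    b = np.zeros(p)
    for _ in range(iters):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]   # residual excluding feature j
            rho = X[:, j] @ r / n
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / (X[:, j] @ X[:, j] / n)
    return b

def candidate_blankets(X, lam=0.1):
    """Candidate Markov blanket of each node via sparse regression."""
    p = X.shape[1]
    blankets = {}
    for i in range(p):
        others = [j for j in range(p) if j != i]
        b = lasso_cd(X[:, others], X[:, i], lam)
        blankets[i] = {others[k] for k in range(p - 1) if abs(b[k]) > 1e-8}
    return blankets

rng = np.random.default_rng(1)
# Chain graph 0 - 1 - 2: tridiagonal precision matrix.
K_true = np.array([[2.0, 1.0, 0.0], [1.0, 2.0, 1.0], [0.0, 1.0, 2.0]])
X = rng.multivariate_normal(np.zeros(3), np.linalg.inv(K_true), size=1000)
blankets = candidate_blankets(X)
print(blankets)
```

Sampling several penalty levels (or bootstrap resamples) per node would turn this single estimate into a set of candidate blankets, which is the role such proposals play inside a graph search.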
Bayesian covariance selection
ISDS Discussion Paper, 2004
Cited by 5 (1 self)
We present a novel structural learning method called HdBCS that performs covariance selection in a Bayesian framework for datasets with tens of thousands of variables. HdBCS is based on the intrinsic connection between graphical models on undirected graphs and graphical models on directed acyclic graphs (Bayesian networks). We show how to produce and explore the corresponding association networks by Bayesian model averaging across the models identified. We illustrate the use of HdBCS with an example from a large-scale gene expression study of breast cancer.
Graph selection with GGMselect
Cited by 4 (2 self)
Abstract: Applications to inference of biological networks have raised strong interest in the problem of graph estimation in high-dimensional Gaussian graphical models. To handle this problem, we propose a two-stage procedure which first builds a family of candidate graphs from the data and then selects one graph from this family according to a dedicated criterion. This estimation procedure is shown to be consistent in a high-dimensional setting, and its risk is controlled by a non-asymptotic oracle-like inequality. Good behavior in numerical experiments corroborates these theoretical results. The procedure is implemented in the R package GGMselect, available online.