Results 1 – 8 of 8
A SINful approach to Gaussian graphical model selection
 Journal of Statistical Planning and Inference
Abstract

Cited by 25 (5 self)
Abstract. Multivariate Gaussian graphical models are defined in terms of Markov properties, i.e., conditional independences associated with the underlying graph. Thus, model selection can be performed by testing these conditional independences, which are equivalent to specified zeroes among certain (partial) correlation coefficients. For concentration graphs, covariance graphs, acyclic directed graphs, and chain graphs (both LWF and AMP), we apply Fisher’s z-transformation, Šidák’s correlation inequality, and Holm’s step-down procedure to simultaneously test the multiple hypotheses obtained from the Markov properties. This leads to a simple method for model selection that controls the overall error rate for incorrect edge inclusion. In practice, we advocate partitioning the simultaneous p-values into three disjoint sets: a significant set S, an indeterminate set I, and a non-significant set N. Our SIN model selection method then selects two graphs: a graph whose edges correspond to the union of S and I, and a more conservative graph whose edges correspond to S only. Prior information about the presence and/or absence of particular edges can be incorporated readily.
Binary models for marginal independence
 Journal of the Royal Statistical Society, Series B
, 2005
Abstract

Cited by 16 (2 self)
A number of authors have considered multivariate Gaussian models for marginal independence. In this paper we develop models for binary data with the same independence structure. The models can be parameterized based on Möbius inversion and maximum likelihood estimation can be performed using a version of the Iterated Conditional Fitting algorithm. The approach is illustrated on a simple example. Relations to multivariate logistic and dependence ratio models are discussed.
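As background on the Möbius-inversion parameterization mentioned in the abstract, inversion over the subset lattice can be sketched generically. This is an illustrative Python sketch of Möbius inversion itself, not the paper's actual parameterization of binary marginal-independence models.

```python
from itertools import chain, combinations

def subsets(s):
    """All subsets of s, as tuples, in order of increasing size."""
    s = list(s)
    return chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))

def mobius_invert(f, ground):
    """Given f(A) = sum over B subseteq A of g(B), recover g via
    g(A) = sum over B subseteq A of (-1)^(|A| - |B|) f(B)."""
    return {A: sum((-1) ** (len(A) - len(B)) * f[B] for B in subsets(A))
            for A in subsets(ground)}
```

The round trip (summing g over subsets to get f, then inverting f to recover g) is exact, which is what makes such parameterizations well defined.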
Multiple testing and error control in Gaussian graphical model selection
 Statistical Science
Abstract

Cited by 12 (2 self)
Abstract. Graphical models provide a framework for exploration of multivariate dependence patterns. The connection between graph and statistical model is made by identifying the vertices of the graph with the observed variables and translating the pattern of edges in the graph into a pattern of conditional independences that is imposed on the variables’ joint distribution. Focusing on Gaussian models, we review classical graphical models. For these models the defining conditional independences are equivalent to the vanishing of certain (partial) correlation coefficients associated with individual edges that are absent from the graph. Hence, Gaussian graphical model selection can be performed by multiple testing of hypotheses about vanishing (partial) correlation coefficients. We show and exemplify how this approach allows one to perform model selection while controlling error rates for incorrect edge inclusion. Key words and phrases: Acyclic directed graph, Bayesian network, bidirected graph, chain graph, concentration graph, covariance graph, DAG, graphical model, multiple testing, undirected graph.
Counting and locating the solutions of polynomial systems of maximum likelihood equations, II: The Behrens-Fisher problem
, 2006
Abstract

Cited by 7 (0 self)
Let µ be a p-dimensional vector, and let Σ1 and Σ2 be p × p positive definite covariance matrices. On being given random samples of sizes N1 and N2 from independent multivariate normal populations Np(µ, Σ1) and Np(µ, Σ2), respectively, the Behrens-Fisher problem is to solve the likelihood equations for estimating the unknown parameters µ, Σ1, and Σ2. It is well-known that the likelihood equations cannot be solved explicitly, and this has led to many different approaches to the Behrens-Fisher problem, with a commensurately large number of publications on the topic. We prove that for N1, N2 > p, there are, almost surely, exactly 3^p real or complex solutions of the likelihood equations. We propose a new iterative algorithm for solving the system of likelihood equations. For the case in which p = 2, we utilize Monte Carlo simulation to estimate how frequently a typical Behrens-Fisher problem is likely to have multiple real solutions; we find that multiple real solutions occur infrequently.
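For orientation, the likelihood equations in question are the score equations of the two-sample multivariate normal model with a common mean. In standard notation (with the sample means and centered sums-of-squares matrices written as \(\bar{x}_i\) and \(S_i\); these labels are ours, not necessarily the paper's), they form the coupled system:

```latex
\hat{\mu} = \bigl(N_1\hat{\Sigma}_1^{-1} + N_2\hat{\Sigma}_2^{-1}\bigr)^{-1}
            \bigl(N_1\hat{\Sigma}_1^{-1}\bar{x}_1 + N_2\hat{\Sigma}_2^{-1}\bar{x}_2\bigr),
\qquad
\hat{\Sigma}_i = \frac{1}{N_i}\,S_i + (\bar{x}_i - \hat{\mu})(\bar{x}_i - \hat{\mu})^{\top},
\quad i = 1, 2.
```

Because \(\hat{\mu}\) depends on the \(\hat{\Sigma}_i\) and vice versa, the system admits no closed-form solution, which is why iterative algorithms and counts of solutions are of interest.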
Characterizing Markov equivalence classes for AMP chain graph models
 The Annals of Statistics
, 2005
Abstract

Cited by 4 (0 self)
Chain graphs (CGs) ( = adicyclic graphs) use undirected and directed edges to represent simultaneously both structural and associative dependences. Like acyclic directed graphs (ADGs), the CG associated with a given statistical model may not be unique, so CGs fall into Markov equivalence classes, which may be superexponentially large, leading to unidentifiability and computational inefficiency in model search and selection. It is shown here that under the Andersson-Madigan-Perlman (AMP) Markov interpretation of a CG, each Markov equivalence class can be uniquely represented by a single distinguished CG, the AMP essential graph, that is itself simultaneously Markov equivalent to all CGs in the AMP Markov equivalence class. A complete characterization of AMP essential graphs is obtained. Like the essential graph previously introduced for ADGs, the AMP essential graph will play a fundamental role for inference and model search and selection for AMP CG models.
Identification and likelihood inference for recursive linear models with correlated errors
, 2007
Abstract

Cited by 2 (0 self)
In recursive linear models, the multivariate normal joint distribution of all variables exhibits a dependence structure induced by recursive systems of linear structural equations. Such models appear in particular in seemingly unrelated regressions, structural equation modelling, simultaneous equation systems, and in Gaussian graphical modelling. We show that recursive linear models that are ‘bow-free’ are well-behaved statistical models, namely, they are everywhere identifiable and form curved exponential families. Here, ‘bow-free’ refers to models satisfying the condition that if a variable x occurs in the structural equation for y, then the errors for x and y are uncorrelated. For the computation of maximum likelihood estimates in ‘bow-free’ recursive linear models we introduce the Residual Iterative Conditional Fitting (RICF) algorithm. Compared to existing algorithms, RICF is easily implemented, requiring only least squares computations; it has clear convergence properties and finds parameter estimates in closed form whenever possible.
Faithfulness in Chain Graphs: The Gaussian Case
Abstract
This paper deals with chain graphs under the classic Lauritzen-Wermuth-Frydenberg interpretation. We prove that almost all the regular Gaussian distributions that factorize with respect to a chain graph are faithful to it. This result has three important consequences. First, chain graphs are more powerful than undirected graphs and acyclic directed graphs for representing regular Gaussian distributions, as some of these distributions can be represented exactly by the former but not by the latter. Second, the moralization and c-separation criteria for reading independencies from a chain graph are complete, in the sense that they identify all the independencies that can be identified from the chain graph alone. Third, some definitions of equivalence in chain graphs coincide and, thus, they have the same graphical characterization.
Learning AMP Chain Graphs under Faithfulness
 Sixth European Workshop on Probabilistic Graphical Models, Granada, Spain
, 2012
Abstract
This paper deals with chain graphs under the alternative Andersson-Madigan-Perlman (AMP) interpretation. In particular, we present a constraint-based algorithm for learning an AMP chain graph that a given probability distribution is faithful to. We also show that the extension of Meek’s conjecture to AMP chain graphs does not hold, which compromises the development of efficient and correct score+search learning algorithms under assumptions weaker than faithfulness.