Results 1–10 of 45
ANCESTRAL GRAPH MARKOV MODELS
2002
Cited by 76 (18 self)
Abstract: This paper introduces a class of graphical independence models that is closed under marginalization and conditioning but that contains all DAG independence models. This class of graphs, called maximal ancestral graphs, has two attractive features: there is at most one edge between each pair of vertices; every missing edge corresponds to an independence relation. These features lead to a simple parameterization of the corresponding set of distributions in the Gaussian case.
Stratified exponential families: Graphical models and model selection
Annals of Statistics
2001
Cited by 54 (6 self)
Chain Graph Models and their Causal Interpretations
B
2001
Cited by 48 (4 self)
Abstract: Chain graphs are a natural generalization of directed acyclic graphs (DAGs) and undirected graphs. However, the apparent simplicity of chain graphs belies the subtlety of the conditional independence hypotheses that they represent. There are a number of simple and apparently plausible, but ultimately fallacious, interpretations of chain graphs that are often invoked, implicitly or explicitly. These interpretations also lead to flawed methods for applying background knowledge to model selection. We present a valid interpretation by showing how the distribution corresponding to a chain graph may be generated as the equilibrium distribution of dynamic models with feedback. These dynamic interpretations lead to a simple theory of intervention, extending the theory developed for DAGs. Finally, we contrast chain graph models under this interpretation with simultaneous equation models, which have traditionally been used to model feedback in econometrics. Keywords: Causal model; cha...
Multimodality of the Likelihood in the Bivariate Seemingly Unrelated Regression Model
2002
Cited by 25 (15 self)
Abstract: Seemingly unrelated regression (SUR) models traditionally appear in econometrics but have recently also emerged in likelihood factorizations of Gaussian graphical models. The literature on maximum likelihood estimation in SUR seems not to mention the possibility of a multimodal likelihood.
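The bivariate SUR model behind this abstract has two regressions with jointly Gaussian, cross-correlated errors. A minimal sketch of its log-likelihood, the function whose possible multimodality the paper studies, is given below; the function name and interface are ours, not the paper's.

```python
import numpy as np

def sur_loglik(beta1, beta2, sigma, y1, y2, x1, x2):
    """Gaussian log-likelihood of a bivariate SUR model:
    y1 = x1 @ beta1 + e1,  y2 = x2 @ beta2 + e2,  (e1, e2) ~ N(0, sigma)."""
    n = len(y1)
    # (n, 2) matrix of residuals, one row per observation
    e = np.column_stack([y1 - x1 @ beta1, y2 - x2 @ beta2])
    sigma_inv = np.linalg.inv(sigma)
    _, logdet = np.linalg.slogdet(sigma)
    # sum over observations of e_t' sigma^{-1} e_t
    quad = np.einsum('ti,ij,tj->', e, sigma_inv, e)
    return -0.5 * n * (2 * np.log(2 * np.pi) + logdet) - 0.5 * quad
```

Maximizing this jointly over (beta1, beta2, sigma) is the estimation problem in which more than one local maximum can arise.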
A SINful approach to Gaussian graphical model selection
Journal of Statistical Planning and Inference
Cited by 25 (5 self)
Abstract: Multivariate Gaussian graphical models are defined in terms of Markov properties, i.e., conditional independences associated with the underlying graph. Thus, model selection can be performed by testing these conditional independences, which are equivalent to specified zeroes among certain (partial) correlation coefficients. For concentration graphs, covariance graphs, acyclic directed graphs, and chain graphs (both LWF and AMP), we apply Fisher's z-transformation, Šidák's correlation inequality, and Holm's step-down procedure to simultaneously test the multiple hypotheses obtained from the Markov properties. This leads to a simple method for model selection that controls the overall error rate for incorrect edge inclusion. In practice, we advocate partitioning the simultaneous p-values into three disjoint sets: a significant set S, an indeterminate set I, and a nonsignificant set N. Then our SIN model selection method selects two graphs: a graph whose edges correspond to the union of S and I, and a more conservative graph whose edges correspond to S only. Prior information about the presence and/or absence of particular edges can be incorporated readily.
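The recipe described in the abstract can be sketched for the concentration-graph case: test each partial correlation with Fisher's z, form Šidák-type simultaneous p-values, and partition edges into S/I/N. The Holm step-down refinement is omitted here, and the cutoffs `alpha` and `alpha_indet` are illustrative choices, not the paper's; the function name is ours.

```python
import numpy as np
from scipy.stats import norm

def sin_select(S, n, alpha=0.05, alpha_indet=0.25):
    """Illustrative SIN-style selection for a Gaussian concentration graph.
    S: sample covariance (p x p), n: sample size.
    Returns simultaneous p-values and the S/I/N partition of edges."""
    p = S.shape[0]
    K = np.linalg.inv(S)  # sample concentration matrix
    edges, pvals = [], []
    for i in range(p):
        for j in range(i + 1, p):
            # partial correlation of (i, j) given all remaining variables
            r = -K[i, j] / np.sqrt(K[i, i] * K[j, j])
            z = np.sqrt(n - (p - 2) - 3) * np.arctanh(r)  # Fisher's z statistic
            pvals.append(2 * norm.sf(abs(z)))
            edges.append((i, j))
    m = len(pvals)
    p_sim = 1 - (1 - np.asarray(pvals)) ** m  # Sidak-style simultaneous p-values
    sig = [e for e, q in zip(edges, p_sim) if q <= alpha]
    nonsig = [e for e, q in zip(edges, p_sim) if q > alpha_indet]
    indet = [e for e, q in zip(edges, p_sim) if alpha < q <= alpha_indet]
    return p_sim, sig, indet, nonsig
```

The "SIN" output corresponds to reading off two graphs: edges in S only (conservative) and edges in S ∪ I (liberal).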
On Chain Graph Models For Description Of Conditional Independence Structures
Ann. Statist.
1998
Cited by 19 (3 self)
Abstract: This paper deals with chain graphs (CGs), which allow both directed and undirected edges. This class of graphs, introduced by Lauritzen and Wermuth [15], generalizes both UGs and DAGs. To establish the semantics of CGs one should associate an independency model to every CG. Some steps were already made. Lauritzen and Wermuth [16] intended to use CGs to describe independency models for strictly positive probability distributions and introduced the concept of the chain Markov property, which is analogous to the concept of a causal input list for DAGs. Lauritzen and Frydenberg [17, 9] generalized the concept of the moral graph and introduced a moralization criterion for reading independency statements from a CG. Frydenberg [9] characterized CGs with the same Markov property (that is, producing the same CG-model), and Andersson, Madigan and Perlman [3] used special CGs to represent uniquely classes of Markov equivalent DAGs. Whittaker [31] in his book gave several examples of the use of CGs, and other recent works also deal with them [6, 20, 23, 30]; the most comprehensive account is provided by the book [19]. Several results proved here were already presented (without proof) in our previous conference contribution [5]. An alternative approach to the generalization of UGs and DAGs was started by Cox and Wermuth [7], who introduced a wider class of joint-response chain graphs which also allow 'dashed' directed and undirected edges in addition to the classic 'solid' directed and undirected edges treated in this paper. Andersson, Madigan and Perlman [1] introduced an alternative Markov property to give an interpretation to those joint-response CGs which combine dashed directed edges with solid undirected edges (of course, another independency model is associated...
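For a DAG, the moralization criterion mentioned above reduces to forming the moral graph: "marry" every pair of parents that share a child, then drop all edge directions. A minimal sketch of this DAG special case follows (the full chain-graph criterion moralizes relevant induced subgraphs and is not shown; the function name is ours).

```python
import itertools

def moralize(nodes, directed_edges):
    """Moral graph of a DAG: connect all co-parents of each node,
    then drop edge directions. Edges are returned as frozensets."""
    undirected = {frozenset(e) for e in directed_edges}
    parents = {v: set() for v in nodes}
    for a, b in directed_edges:       # edge a -> b
        parents[b].add(a)
    for v in nodes:
        for a, b in itertools.combinations(parents[v], 2):
            undirected.add(frozenset((a, b)))  # marry co-parents
    return undirected
```

For the collider a → c ← b, moralization adds the edge a–b, reflecting that a and b become dependent when conditioning on c.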
Graphical models and exponential families
In Proceedings of the 14th Annual Conference on Uncertainty in Artificial Intelligence (UAI-98)
1998
Cited by 19 (1 self)
Abstract: We provide a classification of graphical models according to their representation as subfamilies of exponential families. Undirected graphical models with no hidden variables are linear exponential families (LEFs); directed acyclic graphical models and chain graphs with no hidden variables, including Bayesian networks with several families of local distributions, are curved exponential families (CEFs); and graphical models with hidden variables are stratified exponential families (SEFs). An SEF is a finite union of CEFs satisfying a frontier condition. In addition, we illustrate how one can automatically generate independence and non-independence constraints on the distributions over the observable variables implied by a Bayesian network with hidden variables. The relevance of these results for model selection is examined.
Iterative conditional fitting for Gaussian ancestral graph models
In M. Chickering and J. Halpern (Eds.), Proceedings of the 20th Conference on Uncertainty in Artificial Intelligence
2004
Cited by 18 (6 self)
Abstract: Ancestral graph models, introduced by Richardson and Spirtes (2002), generalize both Markov random fields and Bayesian networks to a class of graphs with a global Markov property that is closed under conditioning and marginalization. By design, ancestral graphs encode precisely the conditional independence structures that can arise from Bayesian networks with selection and unobserved (hidden/latent) variables. Thus, ancestral graph models provide a potentially very useful framework for exploratory model selection when unobserved variables might be involved in the data-generating process but no particular hidden structure can be specified. In this paper, we present the Iterative Conditional Fitting (ICF) algorithm for maximum likelihood estimation in Gaussian ancestral graph models. The name reflects that in each step of the procedure a conditional distribution is estimated, subject to constraints, while a marginal distribution is held fixed. This approach is dual to the well-known Iterative Proportional Fitting algorithm, in which marginal distributions are fitted while conditional distributions are held fixed.
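The Iterative Proportional Fitting scheme that ICF dualizes can be sketched for the Gaussian concentration-graph case: each update matches the model's marginal covariance over one clique to the sample covariance while leaving the conditional of the remaining variables given that clique unchanged. This is a sketch of the classical iterative proportional scaling step, not of ICF itself; the function name is ours.

```python
import numpy as np

def ips_gaussian(S, cliques, iters=200):
    """Iterative proportional scaling for a Gaussian concentration-graph model.
    S: sample covariance (p x p); cliques: list of index lists.
    Returns the fitted concentration matrix K."""
    p = S.shape[0]
    K = np.eye(p)  # start from the independence model
    for _ in range(iters):
        for C in cliques:
            idx = np.ix_(C, C)
            sigma = np.linalg.inv(K)  # current model covariance
            # match the clique marginal to S while fixing the conditional
            K[idx] += np.linalg.inv(S[idx]) - np.linalg.inv(sigma[idx])
    # at convergence, inv(K)[C, C] == S[C, C] for every clique,
    # and K is zero outside the clique blocks
    return K
```

Each sweep only touches clique blocks of K, so missing edges of the graph stay at exactly zero concentration throughout.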
Partial inversion for linear systems and partial closure of independence graphs
BIT Numer. Math.
Cited by 14 (11 self)
Abstract: We introduce and study a calculus for real-valued square matrices, called partial inversion, and an associated calculus for binary square matrices. The first, applied to systems of recursive linear equations, generates new sets of parameters for different types of statistical joint-response models. The corresponding generating graphs are directed and acyclic. The second calculus, applied to matrix representations of independence graphs, gives chain graphs induced by such a generating graph. Chain graphs are more complex independence graphs associated with recursive joint-response models. Missing edges in independence graphs coincide with structurally zero parameters in linear systems. A wide range of consequences of an assumed independence structure can be derived by partial closure, but computationally efficient algorithms still need to be developed for applications to very large graphs.
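One common way to define partial inversion on the leading block of a partitioned matrix is sketched below. Sign and block conventions vary across the literature, so this should be read as one illustrative convention rather than the paper's exact operator; the function name is ours. A useful sanity check is that the operation is an involution: applying it twice on the same block recovers the original matrix.

```python
import numpy as np

def partial_invert(M, k):
    """Partial inversion of a square matrix on its leading k x k block.
    For M = [[A, B], [C, D]] with A of size k x k, returns
    [[A^{-1}, A^{-1} B], [-C A^{-1}, D - C A^{-1} B]]
    (one common convention). Applying twice on the same block gives back M."""
    A, B = M[:k, :k], M[:k, k:]
    C, D = M[k:, :k], M[k:, k:]
    Ainv = np.linalg.inv(A)
    top = np.hstack([Ainv, Ainv @ B])
    bottom = np.hstack([-C @ Ainv, D - C @ Ainv @ B])
    return np.vstack([top, bottom])
```

Applied to the coefficient matrix of a recursive linear system, this reparameterizes the equations so that the first k variables appear as responses of the rest, which is how the calculus generates new joint-response parameterizations.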
Maximum Likelihood Estimation in Gaussian AMP Chain Graph Models and Gaussian Ancestral Graph Models
2004
Cited by 12 (8 self)
Abstract: The AMP Markov property is a recently proposed alternative Markov property for chain graphs. In the case of continuous variables with a joint multivariate Gaussian distribution, it is the AMP rather than the earlier introduced LWF Markov property that is coherent with data generation by natural block-recursive regressions. In this paper, we show that maximum likelihood estimates in Gaussian AMP chain graph models can be obtained by combining generalized least squares and iterative proportional fitting into an iterative algorithm. In an appendix, we give useful convergence results for iterative partial maximization algorithms that apply in particular to the described algorithm.
Keywords: AMP chain graph, graphical model, iterative partial maximization, multivariate normal distribution, maximum likelihood estimation