Results 1–10 of 27
A characterization of Markov equivalence classes for acyclic digraphs
, 1995
Abstract

Cited by 92 (7 self)
Undirected graphs and acyclic digraphs (ADGs), as well as their mutual extension to chain graphs, are widely used to describe dependencies among variables in multivariate distributions. In particular, the likelihood functions of ADG models admit convenient recursive factorizations that often allow explicit maximum likelihood estimates and that are well suited to building Bayesian networks for expert systems. Whereas the undirected graph associated with a dependence model is uniquely determined, there may, however, be many ADGs that determine the same dependence ( = Markov) model. Thus, the family of all ADGs with a given set of vertices is naturally partitioned into Markov-equivalence classes, each class being associated with a unique statistical model. Statistical procedures, such as model selection or model averaging, that fail to take these equivalence classes into account may incur substantial computational or other inefficiencies. Here it is shown that each Markov-equivalence class is uniquely determined by a single chain graph, the essential graph, that is itself simultaneously Markov equivalent to all ADGs in the equivalence class. Essential graphs are characterized, a polynomial-time algorithm for their construction is given, and their applications to model selection and other statistical ...
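The partition into Markov-equivalence classes rests on the Verma–Pearl criterion: two ADGs are Markov equivalent exactly when they share the same skeleton and the same v-structures. This is easy to check by brute force on small vertex sets; the sketch below (helper names are illustrative, not from the paper) enumerates all DAGs on three labelled vertices and groups them by that criterion.

```python
from collections import defaultdict

def is_acyclic(edges, n):
    """Kahn's algorithm: True iff the directed graph on n vertices has no cycle."""
    indeg = [0] * n
    for _, j in edges:
        indeg[j] += 1
    stack = [v for v in range(n) if indeg[v] == 0]
    seen = 0
    while stack:
        v = stack.pop()
        seen += 1
        for i, j in edges:
            if i == v:
                indeg[j] -= 1
                if indeg[j] == 0:
                    stack.append(j)
    return seen == n

def all_dags(n):
    """All DAGs on n labelled vertices, each as a frozenset of arcs (i, j)."""
    arcs = [(i, j) for i in range(n) for j in range(n) if i != j]
    dags = []
    for mask in range(1 << len(arcs)):
        edges = frozenset(a for k, a in enumerate(arcs) if mask >> k & 1)
        if is_acyclic(edges, n):
            dags.append(edges)
    return dags

def equivalence_key(edges):
    """Skeleton plus v-structures; equal keys <=> Markov equivalent (Verma-Pearl)."""
    skeleton = frozenset(frozenset(e) for e in edges)
    vstructs = frozenset(
        (min(i, j), k, max(i, j))
        for i, k in edges for j, k2 in edges
        if k == k2 and i != j and frozenset({i, j}) not in skeleton
    )
    return skeleton, vstructs

dags = all_dags(3)
classes = defaultdict(list)
for d in dags:
    classes[equivalence_key(d)].append(d)
```

On three vertices the 25 DAGs collapse into 11 equivalence classes, so search procedures that ignore equivalence revisit the same statistical model more than twice on average.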
An Alternative Markov Property for Chain Graphs
 Scand. J. Statist
, 1996
Abstract

Cited by 49 (4 self)
Graphical Markov models use graphs, either undirected, directed, or mixed, to represent possible dependences among statistical variables. Applications of undirected graphs (UDGs) include models for spatial dependence and image analysis, while acyclic directed graphs (ADGs), which are especially convenient for statistical analysis, arise in such fields as genetics and psychometrics and as models for expert systems and Bayesian belief networks. Lauritzen, Wermuth, and Frydenberg (LWF) introduced a Markov property for chain graphs, which are mixed graphs that can be used to represent simultaneously both causal and associative dependencies and which include both UDGs and ADGs as special cases. In this paper an alternative Markov property (AMP) for chain graphs is introduced, which in some ways is a more direct extension of the ADG Markov property than is the LWF property for chain graphs.
Chain Graph Models and their Causal Interpretations
 B
, 2001
Abstract

Cited by 48 (4 self)
Chain graphs are a natural generalization of directed acyclic graphs (DAGs) and undirected graphs. However, the apparent simplicity of chain graphs belies the subtlety of the conditional independence hypotheses that they represent. There are a number of simple and apparently plausible, but ultimately fallacious, interpretations of chain graphs that are often invoked, implicitly or explicitly. These interpretations also lead to flawed methods for applying background knowledge to model selection. We present a valid interpretation by showing how the distribution corresponding to a chain graph may be generated as the equilibrium distribution of dynamic models with feedback. These dynamic interpretations lead to a simple theory of intervention, extending the theory developed for DAGs. Finally, we contrast chain graph models under this interpretation with simultaneous equation models, which have traditionally been used to model feedback in econometrics.
Binary models for marginal independence
 Journal of the Royal Statistical Society, Series B
, 2005
Abstract

Cited by 16 (2 self)
A number of authors have considered multivariate Gaussian models for marginal independence. In this paper we develop models for binary data with the same independence structure. The models can be parameterized based on Möbius inversion, and maximum likelihood estimation can be performed using a version of the Iterated Conditional Fitting algorithm. The approach is illustrated on a simple example. Relations to multivariate logistic and dependence ratio models are discussed.
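In the marginal-independence literature the Möbius parameters are typically taken to be q_A = P(X_i = 0 for all i in A); recovering the joint distribution from them is then plain inclusion–exclusion. A minimal sketch, assuming that form of the parameters (the function names are illustrative, not from the paper):

```python
from itertools import chain, combinations

def subsets(s):
    """All subsets of an iterable, as tuples."""
    s = list(s)
    return chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))

def joint_from_moebius(q, n):
    """Recover the joint p(x) from Moebius parameters q[A] = P(X_i = 0 for all i in A).
    Inclusion-exclusion: with Z = {i : x_i = 0},
        p(x) = sum over C subset of {i : x_i = 1} of (-1)^|C| * q[Z union C].
    States x are encoded as bit masks over n binary variables."""
    p = {}
    for mask in range(1 << n):
        zeros = frozenset(i for i in range(n) if not mask >> i & 1)
        ones = [i for i in range(n) if mask >> i & 1]
        p[mask] = sum((-1) ** len(C) * q[zeros | frozenset(C)] for C in subsets(ones))
    return p

# Example: two independent binary variables with P(X_0 = 0) = 0.6, P(X_1 = 0) = 0.7,
# so marginal independence makes q factor over the subsets.
q = {frozenset(): 1.0,
     frozenset({0}): 0.6,
     frozenset({1}): 0.7,
     frozenset({0, 1}): 0.6 * 0.7}
p = joint_from_moebius(q, 2)
```

For this independent example the recovered cell probabilities factor as expected, e.g. p[(1,1)] = 0.4 * 0.3.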
Partial inversion for linear systems and partial closure of independence graphs
 BIT, Numer. Math
Abstract

Cited by 14 (11 self)
We introduce and study a calculus for real-valued square matrices, called partial inversion, and an associated calculus for binary square matrices. The first, applied to systems of recursive linear equations, generates new sets of parameters for different types of statistical joint response models. The corresponding generating graphs are directed and acyclic. The second calculus, applied to matrix representations of independence graphs, gives chain graphs induced by such a generating graph. Chain graphs are more complex independence graphs associated with recursive joint response models. Missing edges in independence graphs coincide with structurally zero parameters in linear systems. A wide range of consequences of an assumed independence structure can be derived by partial closure, but computationally efficient algorithms still need to be developed for applications to very large graphs.
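Sign conventions for partial inversion differ across the literature; the sketch below uses one common block convention, chosen so that the operator is an involution, and is an illustration rather than the paper's exact definition.

```python
import numpy as np

def partial_invert(M, a):
    """Partially invert the square matrix M with respect to the index set a.
    With b the complement of a, the block form used here is
        [ Maa  Mab ]      [ Maa^-1         Maa^-1 Mab           ]
        [ Mba  Mbb ]  ->  [ -Mba Maa^-1    Mbb - Mba Maa^-1 Mab ]
    which requires the principal submatrix M[a, a] to be invertible.
    Under this sign choice the operator is an involution, and partial
    inversion on the full index set reduces to ordinary matrix inversion."""
    n = M.shape[0]
    a = sorted(a)
    b = [i for i in range(n) if i not in a]
    Maa_inv = np.linalg.inv(M[np.ix_(a, a)])
    N = np.empty_like(M, dtype=float)
    N[np.ix_(a, a)] = Maa_inv
    N[np.ix_(a, b)] = Maa_inv @ M[np.ix_(a, b)]
    N[np.ix_(b, a)] = -M[np.ix_(b, a)] @ Maa_inv
    N[np.ix_(b, b)] = M[np.ix_(b, b)] - M[np.ix_(b, a)] @ Maa_inv @ M[np.ix_(a, b)]
    return N

# Illustrative matrix with an invertible leading principal submatrix.
M = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])
N = partial_invert(M, [0, 1])
```

Applying the operator twice on the same index set recovers M, and applying it on all indices yields the ordinary inverse; both properties follow directly from the block formula.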
Normal Linear Regression Models with Recursive Graphical Markov Structure
 J. Multivariate Anal.
, 1998
Abstract

Cited by 13 (5 self)
A multivariate normal statistical model defined by the Markov property determined by an acyclic digraph admits a recursive factorization of its likelihood function (LF) into the product of conditional LFs, each factor having the form of a classical multivariate linear regression model ( = MANOVA model). Here these models are extended in a natural way to normal linear regression models whose LFs continue to admit such recursive factorizations, from which maximum likelihood estimators and likelihood ratio (LR) test statistics can be derived by classical linear methods. The central distribution of the LR test statistic for testing one such multivariate normal linear regression model against another is derived, and the relation of these regression models to block-recursive normal linear systems is established. It is shown how a collection of nonnested dependent normal linear regression models ( = seemingly unrelated regressions) can be combined into a single multivariate normal linear regression model by imposing a parsimonious set of graphical Markov ( = conditional independence) restrictions.
Covariance Chains
 Bernoulli
, 2006
Abstract

Cited by 12 (8 self)
Covariance matrices which can be arranged in tridiagonal form are called covariance chains. They are used to clarify some issues of parameter equivalence and of independence equivalence for linear models in which a set of latent variables influences a set of observed variables. For this purpose, orthogonal decompositions for covariance chains are derived first in explicit form. Covariance chains are also contrasted to concentration chains, for which estimation is explicit and simple. For this purpose, maximum-likelihood equations are derived first for exponential families when some parameters satisfy zero value constraints. From these equations explicit estimates are obtained, which are asymptotically efficient, and they are applied to covariance chains. Simulation results confirm the satisfactory behaviour of the explicit covariance chain estimates also in moderate-size samples.
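The contrast between covariance chains and concentration chains can be seen numerically: a tridiagonal covariance matrix encodes marginal independences, a tridiagonal concentration (inverse covariance) matrix encodes conditional independences, and neither zero pattern transfers to the other parameterization. A small sketch with illustrative numbers:

```python
import numpy as np

# A covariance chain: tridiagonal covariance matrix.
# The zero entry encodes *marginal* independence: cov(X0, X2) = 0.
Sigma = np.array([[1.0, 0.5, 0.0],
                  [0.5, 1.0, 0.4],
                  [0.0, 0.4, 1.0]])

# Its inverse (the concentration matrix) is in general not tridiagonal:
# K[0, 2] != 0, so X0 and X2 are dependent *given* X1.
K = np.linalg.inv(Sigma)

# A concentration chain: tridiagonal concentration matrix.
# The zero entry encodes *conditional* independence: X0 _||_ X2 | X1.
K2 = np.array([[ 2.0, -0.8,  0.0],
               [-0.8,  2.0, -0.6],
               [ 0.0, -0.6,  2.0]])

# Its inverse has Sigma2[0, 2] != 0: X0 and X2 are *marginally* dependent.
Sigma2 = np.linalg.inv(K2)
```

This asymmetry is what makes concentration-chain estimation explicit while covariance chains need the dedicated estimates derived in the paper.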
Graphical methods for efficient likelihood inference in Gaussian covariance models
 Journal of Machine Learning
, 2008
Abstract

Cited by 8 (3 self)
In graphical modelling, a bidirected graph encodes marginal independences among random variables that are identified with the vertices of the graph. We show how to transform a bidirected graph into a maximal ancestral graph that (i) represents the same independence structure as the original bidirected graph, and (ii) minimizes the number of arrowheads among all ancestral graphs satisfying (i). Here the number of arrowheads of an ancestral graph is the number of directed edges plus twice the number of bidirected edges. In Gaussian models, this construction can be used for more efficient iterative maximization of the likelihood function and to determine when maximum likelihood estimates are equal to empirical counterparts.
Probability distributions with summary graph structure
, 2008
Abstract

Cited by 4 (2 self)
A joint density of many variables may satisfy a possibly large set of independence statements, called its independence structure. Often the structure of interest is representable by a graph that consists of nodes representing variables and of edges that couple node pairs. We consider joint densities of this type, generated by a stepwise process in which all variables and dependences of interest are included. Otherwise, there are no constraints on the type of variables or on the form of the generating conditional densities. For the joint density that then results after marginalising and conditioning, we derive what we name the summary graph. It is seen to capture precisely the independence structure implied by the generating process, it identifies dependences which remain undistorted due to direct or indirect confounding, and it alerts to such, possibly severe, distortions in other parametrizations. Summary graphs preserve their form after marginalising and conditioning and they include multivariate regression chain graphs as special cases. We use operators for matrix representations of graphs to derive matrix results and translate these into special types of path.