Results 1 - 6 of 6
Maximum Likelihood Estimation in Gaussian AMP Chain Graph Models and Gaussian Ancestral Graph Models, 2004
Cited by 12 (8 self)
The AMP Markov property is a recently proposed alternative Markov property for chain graphs. In the case of continuous variables with a joint multivariate Gaussian distribution, it is the AMP rather than the earlier introduced LWF Markov property that is coherent with data generation by natural block-recursive regressions. In this paper, we show that maximum likelihood estimates in Gaussian AMP chain graph models can be obtained by combining generalized least squares and iterative proportional fitting into an iterative algorithm. In an appendix, we give useful convergence results for iterative partial maximization algorithms that apply in particular to the described algorithm. Key words: AMP chain graph, graphical model, iterative partial maximization, multivariate normal distribution, maximum likelihood estimation.
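The iterative-proportional-fitting half of such an algorithm can be illustrated on the simpler case of an undirected Gaussian (concentration) graph model. The following is a minimal sketch of iterative proportional scaling only, not the paper's combined GLS/IPF procedure for AMP chain graphs; the sample covariance `S`, the clique list, and the function name are illustrative assumptions.

```python
import numpy as np

def ips_gaussian(S, cliques, tol=1e-8, max_iter=200):
    """Iterative proportional scaling for a Gaussian concentration-graph model.

    S       : sample covariance matrix (p x p)
    cliques : list of index lists covering the cliques of the graph
    Returns the fitted concentration matrix K = Sigma^{-1}.
    """
    p = S.shape[0]
    K = np.eye(p)                       # start from the independence model
    for _ in range(max_iter):
        K_old = K.copy()
        for c in cliques:
            c = np.asarray(c)
            d = np.setdiff1d(np.arange(p), c)
            if d.size:
                # marginal-matching update: after this step, the fitted
                # covariance satisfies Sigma[c, c] == S[c, c]
                K[np.ix_(c, c)] = (np.linalg.inv(S[np.ix_(c, c)])
                                   + K[np.ix_(c, d)]
                                   @ np.linalg.solve(K[np.ix_(d, d)], K[np.ix_(d, c)]))
            else:
                K[np.ix_(c, c)] = np.linalg.inv(S[np.ix_(c, c)])
        if np.max(np.abs(K - K_old)) < tol:
            break
    return K
```

Each clique update matches one marginal of the fitted covariance to the data while leaving absent-edge entries of K at zero, which is the same iterative-partial-maximization pattern the appendix's convergence results cover.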
Multiple testing and error control in Gaussian graphical model selection, Statistical Science
Cited by 12 (2 self)
Graphical models provide a framework for exploration of multivariate dependence patterns. The connection between graph and statistical model is made by identifying the vertices of the graph with the observed variables and translating the pattern of edges in the graph into a pattern of conditional independences that is imposed on the variables' joint distribution. Focusing on Gaussian models, we review classical graphical models. For these models the defining conditional independences are equivalent to the vanishing of certain (partial) correlation coefficients associated with individual edges that are absent from the graph. Hence, Gaussian graphical model selection can be performed by multiple testing of hypotheses about vanishing (partial) correlation coefficients. We show and exemplify how this approach allows one to perform model selection while controlling error rates for incorrect edge inclusion. Key words and phrases: Acyclic directed graph, Bayesian network, bidirected graph, chain graph, concentration graph, covariance graph, DAG, graphical model, multiple testing, undirected graph.
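As a toy illustration of this approach (not the paper's own procedure), one can test every candidate edge of a concentration graph via Fisher's z-transform of the sample partial correlation and then apply Holm's step-down correction to control the familywise error rate for incorrect edge inclusion. The function names and the default level are illustrative assumptions.

```python
import numpy as np
from math import sqrt, log, erf

def edge_pvalues(X):
    """p-value of H0: rho_{ij.rest} = 0 for every pair (i, j), via Fisher's z."""
    n, p = X.shape
    K = np.linalg.inv(np.cov(X, rowvar=False))
    pvals = {}
    for i in range(p):
        for j in range(i + 1, p):
            r = -K[i, j] / sqrt(K[i, i] * K[j, j])   # sample partial correlation
            z = 0.5 * log((1 + r) / (1 - r))         # Fisher z-transform
            stat = sqrt(n - p - 1) * abs(z)          # approx. N(0,1) under H0
            pvals[(i, j)] = 2 * (1 - 0.5 * (1 + erf(stat / sqrt(2))))
    return pvals

def holm_select(pvals, alpha=0.05):
    """Holm step-down: include edge (i, j) when its null hypothesis is rejected."""
    items = sorted(pvals.items(), key=lambda kv: kv[1])
    m = len(items)
    edges = []
    for rank, (edge, pv) in enumerate(items):
        if pv > alpha / (m - rank):
            break
        edges.append(edge)
    return edges
```

Holm's procedure controls the familywise error rate without the independence assumptions Bonferroni would also not need, while being uniformly more powerful than Bonferroni.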
Graphical modelling of multivariate time series, 2001
Cited by 10 (5 self)
We introduce graphical time series models for the analysis of dynamic relationships among variables in multivariate time series. The modelling approach is based on the notion of strong Granger causality and can be applied to time series with nonlinear dependencies. The models are derived from ordinary time series models by imposing constraints that are encoded by mixed graphs. In these graphs each component series is represented by a single vertex; directed edges indicate possible Granger-causal relationships between variables, while undirected edges are used to map the contemporaneous dependence structure. We introduce various notions of Granger-causal Markov properties and discuss the relationships among them and their relation to other Markov properties that can be applied in this context.
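A minimal sketch of the linear, one-lag special case of Granger causality (the paper treats the general, possibly nonlinear setting): compare the residual sum of squares of an autoregression of `y` with and without the lagged `x` term. The function names and the use of a raw variance ratio rather than a calibrated F-test are illustrative simplifications.

```python
import numpy as np

def rss(Y, Z):
    """Residual sum of squares of the least-squares regression of Y on Z."""
    beta, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    r = Y - Z @ beta
    return float(r @ r)

def granger_ratio(x, y):
    """One-lag sketch: does adding x[t-1] improve the prediction of y[t]?"""
    Y = y[1:]
    ones = np.ones_like(Y)
    Z_restricted = np.column_stack([ones, y[:-1]])         # y's own past only
    Z_full = np.column_stack([ones, y[:-1], x[:-1]])       # plus lagged x
    return rss(Y, Z_restricted) / rss(Y, Z_full)           # >> 1 suggests x -> y
```

In a mixed graph built from such comparisons, a directed edge from x to y would be drawn when the lagged terms of x significantly improve the prediction of y.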
Characterizing Markov equivalence classes for AMP chain graph models, The Annals of Statistics, 2005
Cited by 4 (0 self)
Chain graphs (CGs), also called adicyclic graphs, use undirected and directed edges to represent simultaneously both structural and associative dependences. Like acyclic directed graphs (ADGs), the CG associated with a given statistical model may not be unique, so CGs fall into Markov equivalence classes, which may be superexponentially large, leading to unidentifiability and computational inefficiency in model search and selection. It is shown here that under the Andersson-Madigan-Perlman (AMP) Markov interpretation of a CG, each Markov equivalence class can be uniquely represented by a single distinguished CG, the AMP essential graph, that is itself simultaneously Markov equivalent to all CGs in the AMP Markov equivalence class. A complete characterization of AMP essential graphs is obtained. Like the essential graph previously introduced for ADGs, the AMP essential graph will play a fundamental role in inference and model search and selection for AMP CG models.
Sequences of regressions and their independences, 2012
Cited by 4 (1 self)
Ordered sequences of univariate or multivariate regressions provide statistical models for analysing data from randomized, possibly sequential interventions, from cohort or multi-wave panel studies, but also from cross-sectional or retrospective studies. Conditional independences are captured by what we name regression graphs, provided the generated distribution shares some properties with a joint Gaussian distribution. Regression graphs extend purely directed, acyclic graphs by two types of undirected graph, one type for components of joint responses and the other for components of the context vector variable. We review the special features and the history of regression graphs, prove criteria for Markov equivalence and discuss the notion of simpler statistical covering models. Knowledge of Markov equivalence provides alternative interpretations of a given sequence of regressions, is essential for machine learning strategies, and permits using the simple graphical criteria of regression graphs on graphs for which the corresponding criteria are in general more complex. Under the known conditions that a Markov-equivalent directed acyclic graph exists for a given regression graph, we give a polynomial-time algorithm to find one such graph.
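For ordinary directed acyclic graphs, Markov equivalence reduces to the classical Verma-Pearl criterion: same skeleton and same unshielded colliders (immoralities). A minimal sketch of that check, with DAGs represented as hypothetical `{node: parent-set}` dictionaries; the criteria for general regression graphs discussed in the paper are more involved.

```python
def skeleton(dag):
    """Undirected adjacencies of a DAG given as {node: set of parents}."""
    edges = set()
    for child, parents in dag.items():
        for p in parents:
            edges.add(frozenset((p, child)))
    return edges

def immoralities(dag):
    """Unshielded colliders a -> c <- b with a and b non-adjacent."""
    skel = skeleton(dag)
    out = set()
    for child, parents in dag.items():
        ps = sorted(parents)
        for i in range(len(ps)):
            for j in range(i + 1, len(ps)):
                if frozenset((ps[i], ps[j])) not in skel:
                    out.add((ps[i], child, ps[j]))
    return out

def markov_equivalent(d1, d2):
    """Verma-Pearl criterion: equal skeletons and equal immoralities."""
    return skeleton(d1) == skeleton(d2) and immoralities(d1) == immoralities(d2)
```

For example, the chains a -> b -> c and c -> b -> a are Markov equivalent, whereas the collider a -> b <- c is not equivalent to either, despite sharing their skeleton.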
Minimal Sufficient Causation and Directed Acyclic Graphs
Notions of minimal sufficient causation are incorporated within the directed acyclic graph causal framework. Doing so allows for the graphical representation of sufficient causes and minimal sufficient causes on causal directed acyclic graphs while maintaining all of the properties of causal directed acyclic graphs. This in turn provides a clear theoretical link between two major conceptualizations of causality: one counterfactual-based and the other based on a more mechanistic understanding of causation. The theory developed can be used to draw conclusions about the sign of the conditional covariances among variables.
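In the sufficient-component-cause tradition, an outcome such as D = (A AND B) OR C has minimal sufficient causes {C} and {A, B}. The brute-force enumeration below is a toy illustration of that boolean notion only, not of the paper's causal DAG machinery; the function and its inputs are hypothetical.

```python
from itertools import combinations, product

def minimal_sufficient_causes(outcome, causes):
    """Enumerate minimal sets of binary causes that, when all set to 1, force
    outcome = 1 for every setting of the remaining causes.

    outcome : function taking a dict of {cause: 0 or 1} to 0 or 1
    causes  : list of cause names
    """
    sufficient = []
    for k in range(1, len(causes) + 1):
        for subset in combinations(causes, k):
            # skip supersets of an already-found sufficient set (not minimal)
            if any(set(s) <= set(subset) for s in sufficient):
                continue
            rest = [c for c in causes if c not in subset]
            if all(outcome({**dict(zip(rest, vals)), **{c: 1 for c in subset}})
                   for vals in product([0, 1], repeat=len(rest))):
                sufficient.append(subset)
    return sufficient
```

Because subsets are examined in order of increasing size, every set the function returns is both sufficient and minimal.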