Results 1–10 of 12
Matrix representations and independencies in directed acyclic graphs
Ann. Statist., 2008
Cited by 10 (9 self)

Abstract
For a directed acyclic graph, there are two known criteria to decide whether any specific conditional independence statement is implied for all distributions factorized according to the given graph. Both criteria are based on special types of path in graphs. They are called separation criteria because independence holds whenever the conditioning set is a separating set in a graph-theoretical sense. We introduce and discuss an alternative approach using binary matrix representations of graphs in which zeros indicate independence statements. A matrix condition is shown to give a new path criterion for separation and to be equivalent to each of the previous two path criteria.
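The idea of reading structural statements off zeros of a binary matrix can be illustrated with a small sketch (our construction for illustration, not the paper's separation criterion): a DAG is stored as a binary edge matrix, and the zeros of its transitive closure certify that one node is not an ancestor of another.

```python
# Minimal sketch: a DAG on nodes 0..3 as a binary edge matrix,
# with A[i][j] = 1 when there is an edge i -> j. Zeros in the
# transitive closure then certify non-ancestorship, a simple
# instance of reading structure off a binary matrix representation.

def transitive_closure(A):
    """Boolean reachability matrix via Warshall's algorithm."""
    n = len(A)
    R = [row[:] for row in A]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                R[i][j] = R[i][j] or (R[i][k] and R[k][j])
    return R

# Chain 0 -> 1 -> 2 and a separate edge 0 -> 3.
A = [[0, 1, 0, 1],
     [0, 0, 1, 0],
     [0, 0, 0, 0],
     [0, 0, 0, 0]]

R = transitive_closure(A)
print(R[0][2])  # 1: node 0 is an ancestor of node 2 (via node 1)
print(R[3][2])  # 0: node 3 is not an ancestor of node 2
```

The actual criteria in the paper operate on richer edge-matrix calculi; this only shows the basic mechanism of a matrix whose zeros carry graphical information.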
Probability distributions with summary graph structure, 2008
Cited by 4 (2 self)

Abstract
A joint density of many variables may satisfy a possibly large set of independence statements, called its independence structure. Often the structure of interest is representable by a graph that consists of nodes representing variables and of edges that couple node pairs. We consider joint densities of this type, generated by a stepwise process in which all variables and dependences of interest are included. Otherwise, there are no constraints on the type of variables or on the form of the generating conditional densities. For the joint density that then results after marginalising and conditioning, we derive what we name the summary graph. It is seen to capture precisely the independence structure implied by the generating process, it identifies dependences that remain undistorted by direct or indirect confounding, and it alerts to possibly severe distortions of these two types in other parametrizations. Summary graphs preserve their form after marginalising and conditioning, and they include multivariate regression chain graphs as special cases. We use operators for matrix representations of graphs to derive matrix results and translate these into special types of path.
Sequences of regressions and their independences, 2012
Cited by 4 (1 self)

Abstract
Ordered sequences of univariate or multivariate regressions provide statistical models for analysing data from randomized, possibly sequential interventions, from cohort or multiwave panel studies, but also from cross-sectional or retrospective studies. Conditional independences are captured by what we name regression graphs, provided the generated distribution shares some properties with a joint Gaussian distribution. Regression graphs extend purely directed, acyclic graphs by two types of undirected graph, one type for components of joint responses and the other for components of the context vector variable. We review the special features and the history of regression graphs, prove criteria for Markov equivalence and discuss the notion of simpler statistical covering models. Knowledge of Markov equivalence provides alternative interpretations of a given sequence of regressions, is essential for machine learning strategies, and permits use of the simple graphical criteria of regression graphs on graphs for which the corresponding criteria are in general more complex. Under the known conditions that a Markov equivalent directed acyclic graph exists for any given regression graph, we give a polynomial-time algorithm to find one such graph.
CHANGING PARAMETERS BY PARTIAL MAPPINGS, 2008
Cited by 1 (1 self)

Abstract
Changes between different sets of parameters are often needed in multivariate statistical modeling, such as transformations within linear regression or in exponential models. There may, for instance, be specific inference questions based on subject-matter interpretations, alternative well-fitting constrained models, compatibility judgements of seemingly distinct constrained models, or different reference priors under alternative parameterizations. We introduce and discuss a partial mapping, called partial replication, and relate it to a more complex mapping, called partial inversion. Both operations are used to decompose matrix operations, to explain recursion relations among sets of linear parameters, to change between different types of linear models, to approximate maximum-likelihood estimates in exponential family models under independence constraints, and to switch partially between sets of canonical and moment parameters in exponential family distributions or between sets of corresponding maximum-likelihood estimates.
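Partial inversion of a block matrix can be sketched numerically. The sketch below uses one common sign convention and is not necessarily the paper's exact definition; the function name and test matrix are ours. It checks the characteristic involution property: applying the operator twice on the same block returns the original matrix.

```python
import numpy as np

def partial_inversion(M, k):
    """Partially invert M on its leading k-by-k block.

    One common convention (a sketch, not necessarily the paper's):
    [[Aaa, Aab], [Aba, Abb]] maps to
    [[Aaa^-1, Aaa^-1 Aab], [-Aba Aaa^-1, Abb - Aba Aaa^-1 Aab]],
    so the inverted block appears together with regression-like
    off-diagonal blocks and a Schur complement.
    """
    Aaa, Aab = M[:k, :k], M[:k, k:]
    Aba, Abb = M[k:, :k], M[k:, k:]
    Aaa_inv = np.linalg.inv(Aaa)
    top = np.hstack([Aaa_inv, Aaa_inv @ Aab])
    bottom = np.hstack([-Aba @ Aaa_inv, Abb - Aba @ Aaa_inv @ Aab])
    return np.vstack([top, bottom])

M = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

N = partial_inversion(M, 2)
# Applying the operator twice on the same block recovers M.
back = partial_inversion(N, 2)
print(np.allclose(back, M))  # True
```

With this convention, partially inverting on all indices at once reduces to the ordinary matrix inverse, which is one way such operators decompose matrix operations.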
BY NANNY WERMUTH, 2008
Abstract
Undetected confounding may severely distort the effect of an explanatory variable on a response variable, as defined by a stepwise data-generating process. The best-known type of distortion, which we call direct confounding, arises from an unobserved explanatory variable common to a response and its main explanatory variable of interest. It is relevant mainly for observational studies, since it is avoided by successful randomization. By contrast, indirect confounding, which we identify in this paper, is an issue also for intervention studies. For general stepwise-generating processes, we provide matrix and graphical criteria to decide which types of distortion may be present, when they are absent, and how they are avoided. We then turn to linear systems without other types of distortion, but with indirect confounding. For such systems, the magnitude of distortion in a least-squares regression coefficient is derived and shown to be estimable, so that it becomes possible to recover the effect of the generating process from the distorted coefficient.
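Direct confounding is easy to demonstrate numerically. The simulation below is our own illustration (not the paper's model): an unobserved variable U drives both X and Y, so the least-squares slope of Y on X alone is distorted away from the generating coefficient.

```python
import random

random.seed(7)

# Stepwise generating process with an unobserved common explanatory
# variable U: X = U + noise, Y = 2*X + 3*U + noise. The generating
# coefficient of X on Y is 2.
n = 20000
U = [random.gauss(0, 1) for _ in range(n)]
X = [u + random.gauss(0, 1) for u in U]
Y = [2 * x + 3 * u + random.gauss(0, 1) for x, u in zip(X, U)]

def ols_slope(x, y):
    """Least-squares slope of y on x (both centred)."""
    mx = sum(x) / len(x)
    my = sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

# Regressing Y on X alone mixes in the effect of U: the theoretical
# slope is 2 + 3*cov(X, U)/var(X) = 2 + 3*(1/2) = 3.5, not 2.
print(ols_slope(X, Y))  # close to 3.5
```

Here the distortion 3*cov(X, U)/var(X) is known because we built the process; the paper's point is that for indirect confounding in linear systems such a distortion can be estimated and removed even though U is unobserved.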
PROBABILITY DISTRIBUTIONS WITH SUMMARY GRAPH
Submitted to the Annals of Statistics
Abstract
A joint density of several variables may satisfy a possibly large set of independence statements, called its independence structure. Often this structure is fully representable by a graph that consists of nodes representing variables and of edges that couple node pairs. We consider joint densities of this type, generated by a stepwise process in which all variables and dependences of interest are included. Otherwise, there are no constraints on the type of variables or on the form of the distribution generated. For densities that then result after marginalising and conditioning, we derive what we name the summary graph. It is seen to capture precisely the independence structure implied by the generating process, it identifies dependences that remain undistorted by direct or indirect confounding, and it alerts to possibly severe distortions of these two types in other parametrizations. Summary graphs preserve their form after marginalising and conditioning, and they include multivariate regression chain graphs as special cases. We use operators for matrix representations of graphs to derive matrix results and translate these into special types of path.
Contents
Abstract
We introduce and study distributions of sets of binary variables that are symmetric, that is, each has equally probable levels. The joint distribution of these special types of binary variables, if generated by a recursive process of linear main effects, is essentially parametrized in terms of marginal correlations. This contrasts with the log-linear formulation of joint probabilities, in which parameters measure conditional associations given all remaining variables. The new formulation permits useful comparisons of different types of graphical Markov models and leads to a close approximation of Gaussian orthant probabilities.
Sequences of regressions and their dependences
Abstract
In this paper, we study sequences of regressions in joint or single responses given a set of context variables, where a dependence structure of interest is captured by a regression graph. These graphs have nodes representing random variables and three types of edge. Their set of missing edges defines the independence structure of the graph, provided two properties hold that are not common to all probability distributions, named the intersection and the composition property. We derive the additionally needed properties for tracing the effects of single active paths and for excluding any canceling of effects due to several paths connecting the same pair of nodes. For this, we use the notion of a generating process for the joint distribution and derive new properties of an edge matrix calculus for transforming graphs. One key is the M-matrix property of each regularized square edge matrix; others are the proposed notions of traceable regressions and of singleton transitivity.