Probability distributions with summary graph structure, 2008
Abstract

Cited by 13 (2 self)
A joint density of many variables may satisfy a possibly large set of independence statements, called its independence structure. Often the structure of interest is representable by a graph that consists of nodes representing variables and of edges that couple node pairs. We consider joint densities of this type, generated by a stepwise process in which all variables and dependences of interest are included. Otherwise, there are no constraints on the type of variables or on the form of the generating conditional densities. For the joint density that then results after marginalising and conditioning, we derive what we name the summary graph. It is seen to capture precisely the independence structure implied by the generating process, it identifies dependences which remain undistorted due to direct or indirect confounding and it alerts to such, possibly severe distortions in other parametrizations. Summary graphs preserve their form after marginalising and conditioning and they include multivariate regression chain graphs as special cases. We use operators for matrix representations of graphs to derive matrix results and translate these into special types of path. 1. Introduction. Graphical Markov
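The effect of marginalising that the summary graph records can be illustrated with a small linear example (a hypothetical toy sketch of my own, not taken from the paper): a stepwise generating process in which Y3 is a common source of Y1 and Y2. In the full distribution, Y1 and Y2 are conditionally independent given Y3, but after marginalising over Y3 they become dependent, which a summary graph would mark with an additional edge.

```python
import numpy as np

# Stepwise linear generating process (hypothetical illustration):
# Y3 = e3;  Y1 = a*Y3 + e1;  Y2 = b*Y3 + e2, with independent unit-variance errors.
a, b = 0.8, 0.5
# Triangular equation system B @ Y = e, with components ordered (Y1, Y2, Y3):
B = np.array([[1.0, 0.0, -a],
              [0.0, 1.0, -b],
              [0.0, 0.0, 1.0]])
A = np.linalg.inv(B)           # Y = A @ e
Sigma = A @ A.T                # covariance matrix of (Y1, Y2, Y3)

# In the full distribution, Y1 and Y2 are independent given Y3: the (1,2)
# element of the concentration matrix (inverse covariance) vanishes.
conc = np.linalg.inv(Sigma)

# After marginalising over Y3, Y1 and Y2 are dependent: cov(Y1, Y2) = a*b != 0.
cov_12_marginal = Sigma[0, 1]
```

The induced dependence after marginalising over the common source is exactly the kind of feature that distinguishes the summary graph from the subgraph on the remaining nodes.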
Bernoulli 0(00), 2010
Abstract
A set of independence statements may define the independence structure of interest in a family of joint probability distributions. This structure is often captured by a graph that consists of nodes representing the random variables and of edges that couple node pairs. One important class contains regression graphs. Regression graphs are a type of so-called chain graph and describe stepwise processes, in which at each step single or joint responses are generated given the relevant explanatory variables in their past. For joint densities that result after possible marginalising or conditioning, we introduce summary graphs. These graphs reflect the independence structure implied by the generating process for the reduced set of variables, and they preserve the implied independences after additional marginalising and conditioning. They can identify generating dependences that remain unchanged and alert to possibly severe distortions due to direct and indirect confounding. Operators for matrix representations of graphs are used to derive these properties of summary graphs and to translate them into special types of paths in graphs. Keywords: concentration graph; directed acyclic graph; endogenous variables; graphical Markov model; independence graph; multivariate regression chain; partial closure; partial inversion; triangular system

1. Motivation, some previous and some of the new results

Motivation. Graphical Markov models are probability distributions defined for a d_V × 1 random vector variable Y_V whose component variables may be discrete or continuous and whose joint density f_V satisfies the independence statements specified directly by an associated graph, as well as those implied by the graph. The set of all such statements is the independence structure captured by the graph.
One such type of graph was introduced for sequences of regressions. It is an outstanding feature of regression graph models that their implications can be derived after marginalising over some variables, say in set M, or after conditioning on others, say in set C. In particular, graphs can be obtained for node set N = V \ (C ∪ M) that capture precisely the independence structure implied by a generating graph in node set V for the distribution of Y_N given Y_C. Such graphs are called independence-preserving when they can be used to derive the independence structure that would have resulted from the generating graph by conditioning on a larger node set {C, c} or by marginalising over a larger node set {M, m}. Two types of such classes are known; one is a subclass of the much larger class of MC graphs of Koster. A third class of this type is the summary graph of Wermuth. The warning signals for distortions provided by summary graphs are essential for understanding the consequences of a given data-generating process with respect to dependences in addition to independences. For this, some special properties of the types of generating graph will be introduced, as well as specific requirements on the types of generating process. These lead to families of distributions that are said to be generated over parent graphs.
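One of the matrix operators listed among the keywords, partial inversion, can be sketched directly from its linear-algebra derivation (the block layout below is my own derivation from the defining property, and the helper name `partial_inversion` is hypothetical; sign conventions differ across papers): for a linear system M x = b, partially inverting M on an index set a exchanges the roles of x_a and b_a.

```python
import numpy as np

def partial_inversion(M, a):
    """Partially invert square matrix M on index set a.

    For M @ x = b, the returned matrix K satisfies
    K @ [b_a; x_b] = [x_a; b_b] (entries kept in original positions),
    i.e. it swaps the roles of x_a and b_a.
    """
    n = M.shape[0]
    b = [i for i in range(n) if i not in a]
    Maa, Mab = M[np.ix_(a, a)], M[np.ix_(a, b)]
    Mba, Mbb = M[np.ix_(b, a)], M[np.ix_(b, b)]
    Maa_inv = np.linalg.inv(Maa)
    K = np.zeros_like(M, dtype=float)
    K[np.ix_(a, a)] = Maa_inv
    K[np.ix_(a, b)] = -Maa_inv @ Mab
    K[np.ix_(b, a)] = Mba @ Maa_inv
    K[np.ix_(b, b)] = Mbb - Mba @ Maa_inv @ Mab
    return K

# Check the defining property on a random, well-conditioned system:
rng = np.random.default_rng(0)
M = rng.normal(size=(4, 4)) + 4 * np.eye(4)
x = rng.normal(size=4)
bvec = M @ x                    # full system: M @ x = bvec
a = [0, 2]                      # indices to invert on
K = partial_inversion(M, a)
mixed_in = bvec.copy()
mixed_in[[1, 3]] = x[[1, 3]]    # input [b_a; x_b]
mixed_out = K @ mixed_in        # should equal [x_a; b_b]
```

Operators of this kind let graph transformations (marginalising, conditioning) be carried out as matrix computations on edge-matrix representations.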
Causal transmission in reduced-form models
Abstract
We propose a method to explore the causal transmission of a catalyst variable through two endogenous variables of interest. The method is based on the reduced-form system formed from the conditional distribution of the two endogenous variables given the catalyst, and it combines elements from instrumental-variable analysis and the Cholesky decomposition of structural vector autoregressions. We give conditions for uniqueness of the causal transmission.
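The Cholesky ingredient can be sketched in isolation (a generic illustration under my own assumed numbers, not the authors' full method): given the covariance matrix of reduced-form residuals, the lower-triangular Cholesky factor maps uncorrelated unit-variance structural shocks to the correlated reduced-form errors, so the chosen variable ordering fixes a recursive contemporaneous ordering.

```python
import numpy as np

# Hypothetical reduced-form residual covariance for three variables,
# ordered (catalyst, endogenous 1, endogenous 2):
Omega = np.array([[1.0, 0.5, 0.3],
                  [0.5, 2.0, 0.8],
                  [0.3, 0.8, 1.5]])

# Lower-triangular Cholesky factor L with Omega = L @ L.T.
L = np.linalg.cholesky(Omega)

# Reading u = L @ eps with eps ~ uncorrelated unit shocks: the triangularity
# of L means each variable responds contemporaneously only to the shocks of
# variables ordered before it, which is the recursive causal ordering.
impact = L   # column j = contemporaneous impact of structural shock j
```

Placing the catalyst first in the ordering reflects the idea that it is not contemporaneously affected by the two endogenous variables.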
Case-control studies for rare diseases: improved estimation of
Abstract
To capture the dependences of a disease on several risk factors, a challenge is to combine model-based estimation with evidence-based arguments. Standard case-control methods allow estimation of the dependences of a rare disease on several regressors via logistic regressions. For case-control studies, the sampling design leads to samples from two different populations, and for the set of regressors in every logistic regression, these samples are then mixed and taken as given observations. But it is the differences in the independence structures of the regressors for cases and for controls that can improve logistic-regression estimates and guide us to the important feature dependences that are specific to the diseased. A case-control study on laryngeal cancer is used as illustration.
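The reason logistic-regression slopes remain estimable under case-control sampling is the symmetry of the odds ratio: the exposure odds ratio comparing cases with controls equals the disease odds ratio comparing exposed with unexposed, so sampling by disease status shifts only the intercept. A minimal numeric check (the counts are hypothetical):

```python
# Hypothetical 2x2 population counts: rows = exposed/unexposed,
# columns = diseased/healthy.
a, b = 30, 970      # exposed:   30 diseased, 970 healthy
c, d = 10, 990      # unexposed: 10 diseased, 990 healthy

# Disease odds ratio (prospective view): odds of disease, exposed vs unexposed.
or_disease = (a / b) / (c / d)

# Exposure odds ratio (retrospective, case-control view):
# odds of exposure, cases vs controls.
or_exposure = (a / c) / (b / d)

# Both reduce to the cross-product ratio a*d / (b*c), so the odds ratio,
# and hence the logistic slope for exposure, is unchanged by case-control
# sampling on disease status.
```

This cross-product symmetry is the standard justification for fitting logistic regressions to case-control data.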
Binary models of marginal independence: a comparison of different approaches
Sequences of regressions and their dependences
Abstract
In this paper, we study sequences of regressions of joint or single responses given a set of context variables, where a dependence structure of interest is captured by a regression graph. These graphs have nodes representing random variables and three types of edge. Their set of missing edges defines the independence structure of the graph, provided two properties hold that are not common to all probability distributions, named the intersection and the composition property. We derive the properties additionally needed for tracing the effects of single active paths and for excluding any cancelling of effects due to several paths connecting the same pair of nodes. For this, we use the notion of a generating process for the joint distribution and derive new properties of an edge-matrix calculus for transforming graphs. One key is the M-matrix property of each regularized square edge matrix; others are the proposed notions of traceable regressions and of singleton transitivity.
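The M-matrix property mentioned at the end can be checked numerically (a generic sketch of the standard definition, not the paper's edge-matrix calculus): a nonsingular M-matrix has nonpositive off-diagonal entries and an entrywise nonnegative inverse, which keeps path-tracing arguments sign-consistent.

```python
import numpy as np

def is_m_matrix(B, tol=1e-10):
    """Check two defining features of a nonsingular M-matrix:
    off-diagonal entries <= 0 and an entrywise nonnegative inverse."""
    off_diag = B - np.diag(np.diag(B))
    if (off_diag > tol).any():
        return False
    try:
        B_inv = np.linalg.inv(B)
    except np.linalg.LinAlgError:
        return False
    return bool((B_inv >= -tol).all())

# An edge-matrix-style example: B = I - A with A >= 0 and spectral radius
# below 1, so B^{-1} = I + A + A^2 + ... is entrywise nonnegative; each
# power of A accumulates contributions of longer paths in the graph.
A = np.array([[0.0, 0.4, 0.0],
              [0.0, 0.0, 0.3],
              [0.0, 0.0, 0.0]])
B = np.eye(3) - A
```

The Neumann-series form of the inverse is what connects the matrix calculus to path-tracing: entry (i, j) of B^{-1} sums the products of edge weights along all directed paths from j to i.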