Results 1-10 of 25
Binary models for marginal independence
Journal of the Royal Statistical Society, Series B, 2005
Abstract

Cited by 24 (2 self)
A number of authors have considered multivariate Gaussian models for marginal independence. In this paper we develop models for binary data with the same independence structure. The models can be parameterized based on Möbius inversion and maximum likelihood estimation can be performed using a version of the Iterated Conditional Fitting algorithm. The approach is illustrated on a simple example. Relations to multivariate logistic and dependence ratio models are discussed.
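The Möbius parameterization mentioned above can be made concrete. The sketch below (plain Python; function and variable names are illustrative, not from the paper) recovers a joint binary distribution from the parameters q_A = P(X_i = 0 for all i in A) by Möbius inversion over the subset lattice:

```python
from itertools import chain, combinations

def subsets(s):
    """All subsets of the iterable s, as frozensets."""
    s = list(s)
    return [frozenset(c) for c in chain.from_iterable(
        combinations(s, r) for r in range(len(s) + 1))]

def joint_from_moebius(q, variables):
    """Recover the joint distribution of binary variables from the
    Moebius parameters q[A] = P(X_i = 0 for all i in A).

    For an outcome x with zero-set A = {i : x_i = 0}, inclusion-exclusion
    over the complement gives
        p(x) = sum over B with A <= B <= V of (-1)^{|B| - |A|} q[B].
    """
    V = frozenset(variables)
    p = {}
    for A in subsets(V):
        total = 0.0
        for B in subsets(V):
            if A <= B:
                total += (-1) ** (len(B) - len(A)) * q[B]
        # Outcome: x_i = 0 for i in A, x_i = 1 otherwise.
        x = tuple(0 if v in A else 1 for v in sorted(V))
        p[x] = total
    return p

# Two independent Bernoulli(1/2) variables: q factorizes over subsets.
q = {frozenset(): 1.0,
     frozenset({1}): 0.5,
     frozenset({2}): 0.5,
     frozenset({1, 2}): 0.25}
p = joint_from_moebius(q, {1, 2})
# Each of the four outcomes receives probability 1/4.
```

Marginal independence constraints enter by requiring q to factorize over disconnected sets of the bidirected graph, which is what makes this parameterization convenient for the likelihood methods the abstract describes.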
Towards Characterizing Markov Equivalence Classes of Directed Acyclic Graphs with Latent Variables
Proceedings of the 21st Conference on Uncertainty in Artificial Intelligence (UAI), AUAI, 2005
Abstract

Cited by 16 (5 self)
It is well known that there may be many causal explanations consistent with a given set of data. Recent work has represented the aspects common to all of these explanations in a single structure. In this paper, we address what is less well known: how do the relationships common to every causal explanation among the observed variables of some DAG process change in the presence of latent variables? Ancestral graphs provide a class of graphs that can encode conditional independence relations that arise in DAG models with latent and selection variables. In this paper we present a set of orientation rules that construct the Markov equivalence class representative for ancestral graphs, given a member of the equivalence class. These rules are sound and complete. We also show that when the equivalence class includes a DAG, the equivalence class representative is the essential graph for that DAG.
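The closing remark about the DAG case rests on the classical Verma-Pearl characterization: two DAGs are Markov equivalent iff they share the same skeleton and the same unshielded colliders. A minimal sketch of that check (plain Python dictionaries; the paper's orientation rules for ancestral graphs are considerably more involved and are not reproduced here):

```python
def skeleton(dag):
    """Undirected edge set of a DAG given as {node: set_of_parents}."""
    return {frozenset({u, v}) for v, pars in dag.items() for u in pars}

def v_structures(dag):
    """Unshielded colliders a -> c <- b with a, b non-adjacent."""
    skel = skeleton(dag)
    out = set()
    for c, pars in dag.items():
        pars = sorted(pars)
        for i in range(len(pars)):
            for j in range(i + 1, len(pars)):
                a, b = pars[i], pars[j]
                if frozenset({a, b}) not in skel:
                    out.add((frozenset({a, b}), c))
    return out

def markov_equivalent(d1, d2):
    """Verma-Pearl criterion: same skeleton and same v-structures."""
    return (skeleton(d1) == skeleton(d2)
            and v_structures(d1) == v_structures(d2))

# x -> y -> z and x <- y <- z are equivalent; x -> y <- z is not.
chain1 = {'x': set(), 'y': {'x'}, 'z': {'y'}}
chain2 = {'z': set(), 'y': {'z'}, 'x': {'y'}}
collider = {'x': set(), 'z': set(), 'y': {'x', 'z'}}
```

With latent variables the analogue of the skeleton-plus-colliders invariant is subtler, which is exactly the gap the paper's rules for ancestral graphs address.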
Graphical methods for efficient likelihood inference in Gaussian covariance models
Journal of Machine Learning, 2008
Abstract

Cited by 14 (4 self)
In graphical modelling, a bidirected graph encodes marginal independences among random variables that are identified with the vertices of the graph. We show how to transform a bidirected graph into a maximal ancestral graph that (i) represents the same independence structure as the original bidirected graph, and (ii) minimizes the number of arrowheads among all ancestral graphs satisfying (i). Here the number of arrowheads of an ancestral graph is the number of directed edges plus twice the number of bidirected edges. In Gaussian models, this construction can be used for more efficient iterative maximization of the likelihood function and to determine when maximum likelihood estimates are equal to empirical counterparts.
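The arrowhead count used in criterion (ii) is simple arithmetic; a one-line sketch (the edge lists are illustrative, not from the paper):

```python
def arrowheads(directed, bidirected):
    """Arrowhead count of an ancestral graph: one arrowhead per
    directed edge, two per bidirected edge."""
    return len(directed) + 2 * len(bidirected)

# A purely bidirected 3-chain x <-> y <-> z has 2 * 2 = 4 arrowheads;
# a Markov-equivalent ancestral graph that trades a bidirected edge
# for a directed one can only lower this count.
n_chain = arrowheads([], [("x", "y"), ("y", "z")])
```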
The hidden life of latent variables: Bayesian learning with mixed graph models
2008
Abstract

Cited by 13 (4 self)
Directed acyclic graphs (DAGs) have been widely used as a representation of conditional independence in machine learning and statistics. Moreover, hidden or latent variables are often an important component of graphical models. However, DAG models suffer from an important limitation: the family of DAGs is not closed under marginalization of hidden variables. This means that in general we cannot use a DAG to represent the independencies over a subset of variables in a larger DAG. Directed mixed graphs (DMGs) are a representation that includes DAGs as a special case, and overcomes this limitation. This paper introduces algorithms for performing Bayesian inference in Gaussian and probit DMG models. An important requirement for inference is the characterization of the distribution over parameters of the models. We introduce a new distribution for covariance matrices of Gaussian DMGs. We discuss and illustrate how several Bayesian machine learning tasks can benefit from the principle presented here: the power to model dependencies that are generated from hidden variables, but without necessarily modelling such variables explicitly.
Causal Inference and Reasoning in Causally Insufficient Systems
2006
Abstract

Cited by 11 (2 self)
The big question that motivates this dissertation is the following: under what conditions and to what extent can passive observations inform us of the structure of causal connections among a set of variables and of the potential outcome of an active intervention on some of the variables? The particular concern here revolves around the common kind of situations where the variables of interest, though measurable themselves, may suffer from confounding due to unobserved common causes. Relying on a graphical representation of causally insufficient systems called maximal ancestral graphs, and two well-known principles widely discussed in the literature, the causal Markov and Faithfulness conditions, we show that the FCI algorithm, a sound inference procedure in the literature for inferring features of the unknown causal structure from facts of probabilistic independence and dependence, is, with some extra sound inference rules, also complete in the sense that any feature of the causal structure left undecided by the inference procedure is indeed underdetermined by facts of probabilistic independence and dependence. In addition, we consider the issue of quantitative reasoning about the effects of local interventions with the FCI-learnable features of the unknown causal structure. We improve and generalize two important pieces of work in the literature about identifying intervention effects. We also provide some preliminary study of the testability of the ...
High-dimensional sparse covariance estimation via directed acyclic graphs
2009
Abstract

Cited by 9 (1 self)
We present a graph-based technique for estimating sparse covariance matrices and their inverses from high-dimensional data. The method is based on learning a directed acyclic graph (DAG) and estimating parameters of a multivariate Gaussian distribution based on a DAG. For inferring the underlying DAG we use the PC-algorithm [27] and for estimating the DAG-based covariance matrix and its inverse, we use a Cholesky decomposition approach which provides a positive (semi)definite sparse estimate. We present a consistency result in the high-dimensional framework and we compare our method with the Glasso [12, 8, 2] for simulated and real data.
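A minimal sketch of the second stage, assuming the DAG and a topological order are already given (the paper obtains them with the PC-algorithm; the function name and NumPy implementation here are our own). Each variable is regressed on its parents, and the fitted coefficients B and residual variances Omega yield a covariance estimate that is positive semidefinite by construction, via a Cholesky-type factorization along the topological order:

```python
import numpy as np

def dag_covariance(X, parents, order):
    """Estimate a covariance matrix from data X (n x p) under a DAG.

    Each variable is regressed (OLS) on its parents in topological
    order; the coefficient matrix B and residual variances Omega give
        Sigma_hat = (I - B)^{-1} Omega (I - B)^{-T},
    which is positive semidefinite by construction.
    """
    n, p = X.shape
    B = np.zeros((p, p))   # B[j, pa] = coefficient of parent pa in eq. j
    omega = np.zeros(p)    # residual variances
    Xc = X - X.mean(axis=0)
    for j in order:
        pa = parents.get(j, [])
        if pa:
            coef, *_ = np.linalg.lstsq(Xc[:, pa], Xc[:, j], rcond=None)
            B[j, pa] = coef
            resid = Xc[:, j] - Xc[:, pa] @ coef
        else:
            resid = Xc[:, j]
        omega[j] = resid @ resid / n
    IminusB_inv = np.linalg.inv(np.eye(p) - B)
    return IminusB_inv @ np.diag(omega) @ IminusB_inv.T

# Chain DAG 0 -> 1 -> 2 with unit coefficients and unit noise:
# true variances are 1, 2, 3 along the diagonal.
rng = np.random.default_rng(0)
n = 5000
e = rng.standard_normal((n, 3))
X = np.empty((n, 3))
X[:, 0] = e[:, 0]
X[:, 1] = X[:, 0] + e[:, 1]
X[:, 2] = X[:, 1] + e[:, 2]
Sigma = dag_covariance(X, {1: [0], 2: [1]}, order=[0, 1, 2])
```

Sparsity in the estimate comes from the sparsity of B: entries of the inverse covariance vanish for non-adjacent pairs, which is what distinguishes this route from unstructured sample covariances.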
Graphical modelling of multivariate time series with latent variables
2005
Abstract

Cited by 7 (2 self)
In time series analysis, inference about cause-effect relationships among multiple time series is commonly based on the concept of Granger causality, which exploits temporal structure to achieve causal ordering of dependent variables. One major problem in the application of Granger causality for the identification of causal relationships is the possible presence of latent variables that affect the measured components and thus lead to so-called spurious causalities. In this paper, we describe a new graphical approach for modelling the dependence structure of multivariate stationary time series that are affected by latent variables. It is based on mixed graphs in which directed edges represent direct influences among the variables while dashed edges, directed or undirected, indicate associations that are induced by latent variables. For Gaussian processes, this approach leads to vector autoregressive processes with errors that are not independent but correlated according to the dashed edges in the graph. We show that these models can be viewed as graphical ARMA models that satisfy the Granger causality restrictions encoded by general mixed graphs. We discuss identifiability of the parameters and illustrate the approach by an example.
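The model class can be illustrated by simulation. The sketch below (NumPy; names are ours, not the paper's) generates a VAR(1) process whose error covariance carries the dashed-edge structure: the two components are contemporaneously correlated through a latent common cause, although neither Granger-causes the other.

```python
import numpy as np

def simulate_var1(A, Sigma_eps, n, rng):
    """Simulate a stationary VAR(1) process X_t = A X_{t-1} + eps_t
    with eps_t ~ N(0, Sigma_eps).

    Directed edges of the mixed graph appear as nonzero entries of A;
    a dashed edge between components i and j appears as a nonzero
    Sigma_eps[i, j] (error correlation induced by a latent variable).
    """
    p = A.shape[0]
    L = np.linalg.cholesky(Sigma_eps)
    X = np.zeros((n, p))
    x = np.zeros(p)
    for t in range(n):
        x = A @ x + L @ rng.standard_normal(p)
        X[t] = x
    return X

# Two components with no directed edge between them, but a dashed edge
# (error correlation 0.8) standing in for a latent common cause.
A = np.array([[0.5, 0.0],
              [0.0, 0.5]])
Sigma_eps = np.array([[1.0, 0.8],
                      [0.8, 1.0]])
X = simulate_var1(A, Sigma_eps, n=20000, rng=np.random.default_rng(1))
```

A bivariate Granger analysis of such data finds no lagged predictability in either direction, yet the contemporaneous correlation persists, which is exactly the signature the paper's dashed edges are designed to encode.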
Generalized measurement models
2004
Cited by 7 (4 self)
Bayesian inference for Gaussian mixed graph models
Proceedings of the 22nd Conference on Uncertainty in Artificial Intelligence, 2006
Abstract

Cited by 7 (5 self)
We introduce priors and algorithms to perform Bayesian inference in Gaussian models defined by acyclic directed mixed graphs. Such a class of graphs, composed of directed and bidirected edges, is a representation of conditional independencies that is closed under marginalization and arises naturally from causal models which allow for unmeasured confounding. Monte Carlo methods and a variational approximation for such models are presented. Our algorithms for Bayesian inference allow the evaluation of posterior distributions for several quantities of interest, including causal effects that are not identifiable from data alone but could otherwise be inferred where informative prior knowledge about confounding is available.