Results 1 – 5 of 5
Combining experiments to discover linear cyclic models with latent variables
In AISTATS 2010
Abstract

Cited by 5 (2 self)
We present an algorithm to infer causal relations between a set of measured variables on the basis of experiments on these variables. The algorithm assumes that the causal relations are linear, but is otherwise completely general: It provides consistent estimates when the true causal structure contains feedback loops and latent variables, while the experiments can involve surgical or ‘soft’ interventions on one or multiple variables at a time. The algorithm is ‘online’ in the sense that it combines the results from any set of available experiments, can incorporate background knowledge and resolves conflicts that arise from combining results from different experiments. In addition we provide a necessary and sufficient condition that (i) determines when the algorithm can uniquely return the true graph, and (ii) can be used to select the next best experiment until this condition is satisfied. We demonstrate the method by applying it to simulated data and the flow cytometry data of Sachs et al. (2005).
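The model class the abstract describes can be made concrete with a small sketch. In a linear cyclic model each variable is a linear function of its parents plus a disturbance, x = Bx + e, so the observed equilibrium is x = (I − B)⁻¹e; a ‘surgical’ intervention on a variable severs its incoming edges (zeroes that row of B) and sets its value exogenously. The coefficient matrix and variable indices below are invented for illustration, not taken from the paper:

```python
import numpy as np

# Invented 3-variable linear cyclic model with a feedback loop
# x0 -> x2 -> x1 -> x0; entry B[i, j] is the edge coefficient j -> i.
B = np.array([[0.0, 0.8, 0.0],
              [0.0, 0.0, 0.5],
              [0.3, 0.0, 0.0]])

def equilibrium_sample(B, e):
    """Observational equilibrium: solve x = B x + e for x."""
    return np.linalg.solve(np.eye(len(B)) - B, e)

def surgical_intervention(B, i, value, e):
    """Cut all edges into variable i and clamp it to `value`."""
    B_int = B.copy()
    B_int[i, :] = 0.0        # remove incoming edges to variable i
    e_int = e.copy()
    e_int[i] = value         # variable i is now set exogenously
    return equilibrium_sample(B_int, e_int)

rng = np.random.default_rng(0)
e = rng.normal(size=3)
x_obs = equilibrium_sample(B, e)            # observational data point
x_do = surgical_intervention(B, 0, 10.0, e) # datum under do(x0 = 10)
```

The cycle product 0.8 × 0.5 × 0.3 = 0.12 is below 1, so the equilibrium is well defined; comparing `x_obs` and `x_do` shows how the intervention propagates around the loop.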
Causal KL: Evaluating Causal Discovery
Abstract
The two most commonly used criteria for assessing causal model discovery with artificial data are edit-distance and Kullback-Leibler divergence, measured from the true model to the learned model. Both of these metrics maximally reward the true model. However, we argue that they are both insufficiently discriminating in judging the relative merits of false models. Edit distance, for example, fails to distinguish between strong and weak probabilistic dependencies. KL divergence, on the other hand, rewards equally all statistically equivalent models, regardless of their different causal claims. We propose an augmented KL divergence, which we call Causal KL (CKL), which takes into account causal relationships which distinguish between observationally equivalent models. Results are presented for three variants of CKL, showing that Causal KL works well in practice. Keywords: evaluating causal discovery, Kullback-Leibler divergence, edit distance, Causal KL (CKL)
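The limitation that motivates CKL can be seen in a minimal sketch: plain KL divergence is computed from the joint distributions alone, so two Markov-equivalent networks (e.g. A → B versus B → A) that induce the same joint receive identical scores despite making opposite causal claims. The toy distributions below are invented for illustration:

```python
import numpy as np

def kl(p, q):
    """KL divergence D(p || q) between discrete joint distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Invented true joint over binary (A, B), flattened as P(a, b).
p_true = np.array([0.4, 0.1, 0.2, 0.3])

# Two observationally equivalent models (A -> B and B -> A) fitted
# perfectly: both reproduce the same joint, so KL cannot separate them.
p_model_ab = p_true.copy()
p_model_ba = p_true.copy()
score_ab = kl(p_true, p_model_ab)
score_ba = kl(p_true, p_model_ba)
```

Both scores come out equal (here both zero), which is exactly the tie that an augmented, causally-aware divergence such as CKL is designed to break.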
How Occam’s Razor Provides a Neat Definition of Direct Causation
Abstract
In this paper we show that the application of Occam’s razor to the theory of causal Bayes nets gives us a neat definition of direct causation. In particular we show that Occam’s razor implies Woodward’s (2003) definition of direct causation, provided suitable intervention variables exist and the causal Markov condition (CMC) is satisfied. We also show how Occam’s razor can account for Woodward-style direct causal relationships when only stochastic intervention variables are available.