Results 1–10 of 11
Causal inference using the algorithmic Markov condition
, 2008
Abstract

Cited by 11 (11 self)
Inferring the causal structure that links n observables is usually based upon detecting statistical dependences and choosing simple graphs that make the joint measure Markovian. Here we argue why causal inference is also possible when only single observations are present. We develop a theory of how to generate causal graphs explaining similarities between single objects. To this end, we replace the notion of conditional stochastic independence in the causal Markov condition with the vanishing of conditional algorithmic mutual information and describe the corresponding causal inference rules. We explain why a consistent reformulation of causal inference in terms of algorithmic complexity implies a new inference principle that also takes into account the complexity of conditional probability densities, making it possible to select among Markov-equivalent causal graphs. This insight provides a theoretical foundation for a heuristic principle proposed in earlier work. We also discuss how to replace Kolmogorov complexity with decidable complexity criteria. This can be seen as an algorithmic analog of replacing the empirically undecidable question of statistical independence with practical independence tests that are based on implicit or explicit assumptions on the underlying distribution.
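The abstract's closing idea — replacing Kolmogorov complexity with decidable criteria — can be sketched with a standard compression-based proxy (an illustration assumed here, not the paper's construction): approximate the algorithmic mutual information I(x : y) by C(x) + C(y) − C(xy), where C is the compressed length of a string.

```python
import zlib

def C(s: bytes) -> int:
    """Compressed length as a computable stand-in for Kolmogorov complexity."""
    return len(zlib.compress(s, 9))

def alg_mutual_info(x: bytes, y: bytes) -> int:
    """Approximate algorithmic mutual information: C(x) + C(y) - C(xy)."""
    return C(x) + C(y) - C(x + y)

# Two strings sharing most of their structure vs. two that share little.
a = b"the quick brown fox jumps over the lazy dog " * 20
b_related = a.replace(b"fox", b"cat")   # nearly the same structure as a
b_unrelated = bytes(range(256)) * 4     # structurally unrelated to a

print(alg_mutual_info(a, b_related), alg_mutual_info(a, b_unrelated))
```

Structurally similar strings compress much better when concatenated, so their estimated mutual information is larger — the computable analog of the dependence tests the abstract alludes to.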
Causal Inference By Choosing Graphs With Most Plausible Markov Kernels
 Ninth International Symposium on Artificial Intelligence and Mathematics
, 2006
Abstract

Cited by 11 (10 self)
We propose a new inference rule for estimating the causal structure that underlies the observed statistical dependencies among n random variables. Our method is based on comparing the conditional distributions of variables given their direct causes (the so-called "Markov kernels") for all hypothetical causal directions and choosing the most plausible one. We consider those Markov kernels most plausible which maximize the (conditional) entropies constrained by their observed first moment (expectation) and second moments (variance and covariance with their direct causes) on their given domain.
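The entropy-maximization criterion can be illustrated in one dimension (a minimal sketch assuming a fixed mean and variance, not the paper's full multivariate treatment): among all densities with given first and second moments, the Gaussian attains the maximum differential entropy, so it plays the role of the "most plausible" kernel in this sense.

```python
import math

def gaussian_entropy(var: float) -> float:
    """Differential entropy of the maximum-entropy density (the Gaussian)
    with a fixed mean and variance: 0.5 * ln(2 * pi * e * var)."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

def uniform_entropy(width: float) -> float:
    """Differential entropy of Uniform(0, width): ln(width)."""
    return math.log(width)

# A uniform density with the same variance has strictly lower entropy.
width = 2.0
var = width ** 2 / 12  # variance of Uniform(0, width)
print(gaussian_entropy(var) > uniform_entropy(width))  # True
```

Any other distribution with the same moment constraints (here, the uniform) scores a strictly smaller entropy, which is what makes the Gaussian kernel "maximally plausible" under this criterion.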
Distinguishing between cause and effect
, 2008
Abstract

Cited by 8 (7 self)
We describe eight data sets that together formed the CauseEffectPairs task in the Causality Challenge #2: Pot-Luck competition. Each set consists of a sample of a pair of statistically dependent random variables. One variable is known to cause the other, but this information was hidden from the participants; the task was to identify which of the two variables was the cause and which the effect, based upon the observed sample. The data sets were chosen such that we expect common agreement on the ground truth. Even though part of the statistical dependences may also be due to hidden common causes, common sense tells us that there is a significant cause-effect relation between the two variables in each pair. We also present baseline results using three different causal inference methods.
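The three baseline methods are not named in the abstract; as a hedged stand-in, here is one common heuristic for such pairs, an additive-noise check (the scoring function, polynomial degree, and toy data are assumptions of this sketch, not the paper's baselines): regress each direction and prefer the one whose residual magnitudes look less dependent on the input.

```python
import numpy as np

rng = np.random.default_rng(0)

def direction_score(x, y, deg=3):
    """Regress y on x with a polynomial and return a crude dependence score
    between x and the squared residuals (lower = residuals look more like
    independent noise, i.e. a more plausible causal direction)."""
    coef = np.polyfit(x, y, deg)
    resid = y - np.polyval(coef, x)
    return abs(np.corrcoef(x, resid ** 2)[0, 1])

# Toy pair: x causes y through a nonlinear map plus independent noise.
x = rng.uniform(0.1, 1.0, 2000)
y = x ** 2 + 0.05 * rng.normal(size=2000)

forward = direction_score(x, y)   # fit y from x: residuals ~ independent noise
backward = direction_score(y, x)  # fit x from y: residuals heteroscedastic
print(forward, backward)
```

In the true direction the residuals are just the independent noise term, so the score stays near zero; in the reverse direction the residual spread varies with the input, which this crude check picks up.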
On causally asymmetric versions of Occam’s Razor and their relation to thermodynamics
, 2007
Automatic discovery of latent variable models
 Machine Learning Dept., CMU
, 2005
Abstract

Cited by 5 (4 self)
representing the official policies, either expressed or implied, of any sponsoring institution, the U.S. government or any other entity.
Causal reasoning by evaluating the complexity of conditional densities with kernel methods
 Neurocomputing
, 2008
Abstract

Cited by 3 (3 self)
We propose a method to quantify the complexity of conditional probability measures by a Hilbert space seminorm of the logarithm of their density. The concept of Reproducing Kernel Hilbert Spaces (RKHS) is a flexible tool for defining such a seminorm by choosing an appropriate kernel. We present several examples with artificial data sets where our kernel-based complexity measure is consistent with our intuitive understanding of the complexity of densities. The intention behind the complexity measure is to provide a new approach to inferring causal directions. The idea is that the factorization of the joint probability measure P(effect, cause) into P(effect | cause)P(cause) typically leads to "simpler" and "smoother" terms than the factorization into P(cause | effect)P(effect). Since the conventional constraint-based approach to causal discovery is not able to determine the causal direction between only two variables, our inference principle can in particular be helpful when combined with other existing methods. We provide several simple examples with real-world data where the true causal directions indeed lead to simpler (conditional) densities.
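The RKHS-seminorm idea can be illustrated with a minimal sketch (the Gaussian kernel, bandwidth, and test functions here are assumptions, not the paper's setup): the minimum-norm interpolant of values f observed on points X has squared RKHS norm fᵀK⁻¹f for Gram matrix K, and smoother functions should receive a smaller norm, i.e. count as "less complex".

```python
import numpy as np

def rbf_gram(x, bandwidth=1.0):
    """Gram matrix of a Gaussian (RBF) kernel on a 1-D point set."""
    d = x[:, None] - x[None, :]
    return np.exp(-d ** 2 / (2 * bandwidth ** 2))

def rkhs_norm_sq(x, f, reg=1e-8):
    """Squared RKHS norm f^T K^{-1} f of the minimum-norm interpolant of
    values f at points x (small regularizer for numerical stability)."""
    K = rbf_gram(x) + reg * np.eye(len(x))
    return float(f @ np.linalg.solve(K, f))

x = np.linspace(0, 2 * np.pi, 40)
smooth = np.sin(x)        # slowly varying function
wiggly = np.sin(8 * x)    # rapidly varying function on the same points

print(rkhs_norm_sq(x, smooth), rkhs_norm_sq(x, wiggly))
```

The rapidly varying function projects onto the small-eigenvalue directions of the Gram matrix and so gets a much larger norm — the sense in which this seminorm penalizes "non-smooth" (log-)densities.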