Causality: objectives and assessment
 In NIPS 2008 Workshop on Causality, volume 7. JMLR W&CP, in press, 2009.
Abstract

Cited by 3 (3 self)
The NIPS 2008 workshop on causality provided a forum for researchers from different horizons to share their views on causal modeling and to address the difficult question of assessing causal models. There was a lively debate on properly separating the notion of causality from particular models, such as the graphical models that have dominated the field in recent years. Part of the workshop was dedicated to discussing the results of a challenge, which offered a wide variety of applications of causal modeling. These proceedings gather the best papers presented. Most lectures were videotaped or recorded. All information regarding the challenge and the lectures can be found at
Information Fusion, Causal Probabilistic Network And Probanet II: Inference Algorithms and Probanet System
 Proc. 1st Intl. Workshop on Image Analysis and Information Fusion, 1997
Abstract

Cited by 2 (2 self)
As an extension of an overview paper [Pan and McMichael, 1997] on information fusion and Causal Probabilistic Networks (CPNs), this paper formalizes kernel algorithms for probabilistic inference on CPNs. Information fusion is realized by updating the joint probabilities of the variables upon the arrival of new evidence or new hypotheses. Kernel algorithms for the dominant inference methods are formalized from a scattered, mathematics-oriented literature, with gaps filled in regarding computability and completeness. In particular, possible optimizations of the causal tree algorithm, graph triangulation, and the junction tree algorithm are discussed. Probanet has been designed and developed as a generic shell (a "mother system") for CPN construction and application. The design aspects and current status of Probanet are described. Several directions for research and system development are pointed out, including hierarchical structuring of networks, structure decomposition, and adaptive inference algorithms. The paper thus integrates a literature review, algorithm formalization, and a future perspective.
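The core fusion step the abstract describes, updating a joint distribution when new evidence arrives, can be illustrated on a toy two-node network X -> Y. This is a minimal sketch with made-up numbers and function names, not Probanet's actual algorithms:

```python
# Evidence-driven belief updating in a tiny causal probabilistic
# network X -> Y. Illustrative sketch only; all probabilities and
# helper names (joint, update_on_evidence) are assumptions.

def joint(p_x, p_y_given_x):
    """Build the joint table P(X, Y) from the prior and the conditional."""
    return {(x, y): p_x[x] * p_y_given_x[x][y]
            for x in p_x for y in p_y_given_x[x]}

def update_on_evidence(joint_p, var_index, observed):
    """Condition the joint on evidence (e.g. Y = observed) and renormalize."""
    kept = {k: v for k, v in joint_p.items() if k[var_index] == observed}
    z = sum(kept.values())
    return {k: v / z for k, v in kept.items()}

p_x = {0: 0.7, 1: 0.3}                      # prior on the cause X
p_y_given_x = {0: {0: 0.9, 1: 0.1},         # sensor model P(Y | X)
               1: {0: 0.2, 1: 0.8}}

j = joint(p_x, p_y_given_x)
posterior = update_on_evidence(j, 1, 1)     # observe evidence Y = 1
p_x1 = sum(v for (x, _), v in posterior.items() if x == 1)
print(round(p_x1, 3))                       # P(X = 1 | Y = 1)
```

The junction tree and causal tree algorithms the paper formalizes do this same conditioning implicitly, but via local message passing so the full joint table is never materialized.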
Faithfulness in Chain Graphs: The Discrete Case
Abstract

Cited by 2 (2 self)
This paper deals with chain graphs under the classic Lauritzen-Wermuth-Frydenberg interpretation. We prove that the strictly positive discrete probability distributions with the prescribed sample space that factorize according to a chain graph G with dimension d have positive Lebesgue measure w.r.t. R^d, whereas those that factorize according to G but are not faithful to it have zero Lebesgue measure w.r.t. R^d. This means that, in the measure-theoretic sense described, almost all the strictly positive discrete probability distributions with the prescribed sample space that factorize according to G are faithful to it.
Robust Independence-Based Causal Structure Learning in Absence of Adjacency Faithfulness
Abstract

Cited by 2 (1 self)
This paper presents an extension to the Conservative PC algorithm that is able to detect violations of adjacency faithfulness under causal sufficiency and triangle faithfulness. Violations can be characterized by pseudo-independent relations and equivalent edges, both generating a pattern of conditional independencies that cannot be modeled faithfully. Both cases lead to uncertainty about specific parts of the skeleton of the causal graph, which is modeled by an f-pattern. We prove that our Very Conservative PC algorithm is able to correctly learn the f-pattern. We argue that the solution also applies in the finite sample case if we accept that only strong edges can be identified. Experiments based on simulations show that the rate of false edge removals is significantly reduced, at the expense of uncertainty about the skeleton and a higher sensitivity to accidental correlations.
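The skeleton phase that PC-style algorithms (including the conservative variants above) build on removes an edge whenever some conditioning set renders its endpoints independent. A minimal sketch with a hand-coded independence oracle for the chain X -> Z -> Y; this is generic PC skeleton search, not the paper's Very Conservative PC:

```python
# Skeleton phase of a PC-style algorithm with a perfect independence
# oracle for the chain X -> Z -> Y. Illustrative sketch only.
from itertools import combinations

def indep(a, b, cond):
    # Oracle: in X -> Z -> Y, only X and Y are independent, given Z.
    return {a, b} == {"X", "Y"} and "Z" in cond

def pc_skeleton(nodes, indep_test):
    # Start from the complete undirected graph.
    edges = {frozenset(p) for p in combinations(nodes, 2)}
    for a, b in [tuple(sorted(e)) for e in sorted(edges, key=sorted)]:
        others = [n for n in nodes if n not in (a, b)]
        # Test conditioning sets of growing size; drop the edge on success.
        for k in range(len(others) + 1):
            for cond in combinations(others, k):
                if indep_test(a, b, set(cond)):
                    edges.discard(frozenset((a, b)))
                    break
    return edges

skel = pc_skeleton(["X", "Z", "Y"], indep)
print(sorted(sorted(e) for e in skel))
```

With finite samples the oracle becomes a statistical test, which is where the false edge removals discussed in the abstract arise.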
Challenges in the analysis of mass-throughput data: A technical commentary from the statistical machine learning perspective
 Cancer Informatics
Abstract

Cited by 1 (0 self)
Sound data analysis is critical to the success of modern molecular medicine research that involves the collection and interpretation of mass-throughput data. The novel nature and high dimensionality of such datasets pose a series of non-trivial data analysis problems. This technical commentary discusses the problems of overfitting, error estimation, the curse of dimensionality, causal versus predictive modeling, integration of heterogeneous types of data, and the lack of standard protocols for data analysis. We attempt to shed light on the nature and causes of these problems and to outline viable methodological approaches to overcome them.
Reading Dependencies from Polytree-Like Bayesian Networks Revisited
Abstract

Cited by 1 (1 self)
We present a graphical criterion for reading dependencies from the minimal directed independence map G of a graphoid p, under the assumption that G is a polytree and p satisfies weak transitivity. We prove that the criterion is sound and complete. We argue that assuming weak transitivity is not too restrictive.
Conservative Independence-Based Causal Structure Learning in Absence of Adjacency Faithfulness
Abstract

Cited by 1 (1 self)
This paper presents an extension to the Conservative PC algorithm that is able to detect violations of adjacency faithfulness under causal sufficiency and triangle faithfulness. Violations can be characterized by pseudo-independent relations and equivalent edges, both generating a pattern of conditional independencies that cannot be modeled faithfully. Both cases lead to uncertainty about specific parts of the skeleton of the causal graph. These ambiguities are modeled by an f-pattern. We prove that our Adjacency Conservative PC algorithm is able to correctly learn the f-pattern. We argue that the solution also applies in the finite sample case if we accept that only strong edges can be identified. Experiments based on simulations and the ALARM benchmark model show that the rate of false edge removals is significantly reduced, at the expense of uncertainty about the skeleton and a higher sensitivity to accidental correlations.
Practically Perfect
"... We prove that perfect distributions exist when using a finite number of bits to represent the parameters of a Bayesian network. In addition, we provide an upper bound on the probability of sampling a nonperfect distribution when using a fixed number of bits for the parameters and that the upper bou ..."
Abstract
We prove that perfect distributions exist when using a finite number of bits to represent the parameters of a Bayesian network. In addition, we provide an upper bound on the probability of sampling a non-perfect distribution when using a fixed number of bits for the parameters, and we show that this upper bound approaches zero exponentially fast as the number of bits increases. We also provide an upper bound on the number of bits needed to guarantee that a distribution sampled from a uniform Dirichlet distribution is perfect with probability greater than 1/2.
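The intuition behind the result can be seen empirically: parameters drawn at random from a Dirichlet almost never encode an accidental independence. A sketch for the one-edge network X -> Y, using ordinary floating point rather than the paper's explicit bit-count analysis; the helper `sample_dirichlet` and the tolerance are assumptions for illustration:

```python
# Randomly sampled CPTs for X -> Y are (almost) always faithful to
# the edge: the two conditional rows P(Y | X=0) and P(Y | X=1)
# essentially never coincide. Illustrative sketch only.
import random

random.seed(0)

def sample_dirichlet(k):
    """Uniform Dirichlet(1,...,1) via normalized exponential draws."""
    ws = [random.expovariate(1.0) for _ in range(k)]
    z = sum(ws)
    return [w / z for w in ws]

trials = 1000
faithful = 0
for _ in range(trials):
    row0 = sample_dirichlet(2)   # P(Y | X = 0)
    row1 = sample_dirichlet(2)   # P(Y | X = 1)
    if abs(row0[0] - row1[0]) > 1e-9:   # rows differ => X and Y dependent
        faithful += 1
print(faithful, "/", trials)
```

With continuous parameters the non-faithful set has measure zero; the paper's contribution is showing that this survives, with quantified exponentially small failure probability, when each parameter is stored in finitely many bits.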