Results 21-30 of 46
Independence for Full Conditional Measures, Graphoids and Bayesian Networks
, 2007
Abstract

Cited by 3 (2 self)
This paper examines definitions of independence for events and variables in the context of full conditional measures; that is, when conditional probability is a primitive notion and conditioning is allowed on null events. Several independence concepts are evaluated with respect to graphoid properties; we show that properties of weak union, contraction and intersection may fail when null events are present. We propose a concept of “full” independence, characterize the form of a full conditional measure under full independence, and suggest how to build a theory of Bayesian networks that accommodates null events.
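The graphoid properties the abstract evaluates can be checked numerically on small discrete examples. The sketch below is an illustrative toy, not the paper's framework of full conditional measures: it tests ordinary conditional independence p(X,Y|Z) = p(X|Z)p(Y|Z) on a hand-built joint distribution, and simply skips null conditioning events, which is exactly the gap that full conditional measures are meant to close.

```python
import itertools

# Toy joint distribution over three binary variables (A, B, C), built so that
# A and B are independent given C. The numbers are illustrative assumptions.
joint = {}
for a, b, c in itertools.product([0, 1], repeat=3):
    p_c = 0.5
    p_a_given_c = 0.8 if a == c else 0.2
    p_b_given_c = 0.6 if b == c else 0.4
    joint[(a, b, c)] = p_c * p_a_given_c * p_b_given_c

def marginal(joint, idx):
    """Marginalize the joint onto the variable positions listed in idx."""
    out = {}
    for assign, p in joint.items():
        key = tuple(assign[i] for i in idx)
        out[key] = out.get(key, 0.0) + p
    return out

def cond_indep(joint, X, Y, Z, tol=1e-9):
    """Check p(X,Y|Z) = p(X|Z) p(Y|Z) wherever p(Z) > 0.

    Null events (p(Z) = 0) are skipped: the classical definition is silent
    there, which is why conditioning as a primitive notion is needed."""
    pz = marginal(joint, Z)
    pxz = marginal(joint, X + Z)
    pyz = marginal(joint, Y + Z)
    pxyz = marginal(joint, X + Y + Z)
    for key, p in pxyz.items():
        x = key[:len(X)]
        y = key[len(X):len(X) + len(Y)]
        z = key[len(X) + len(Y):]
        if pz.get(z, 0.0) <= tol:
            continue  # null conditioning event: classical definition skips it
        lhs = p / pz[z]
        rhs = (pxz[x + z] / pz[z]) * (pyz[y + z] / pz[z])
        if abs(lhs - rhs) > tol:
            return False
    return True

# A and B are independent given C by construction, but not marginally.
print(cond_indep(joint, (0,), (1,), (2,)))  # True
print(cond_indep(joint, (0,), (1,), ()))    # False
```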
Faithfulness in Chain Graphs: The Discrete Case
Abstract

Cited by 2 (2 self)
This paper deals with chain graphs under the classic Lauritzen-Wermuth-Frydenberg interpretation. We prove that the strictly positive discrete probability distributions with the prescribed sample space that factorize according to a chain graph G with dimension d have positive Lebesgue measure w.r.t. R^d, whereas those that factorize according to G but are not faithful to it have zero Lebesgue measure w.r.t. R^d. This means that, in the measure-theoretic sense described, almost all the strictly positive discrete probability distributions with the prescribed sample space that factorize according to G are faithful to it.
Robust Independence-Based Causal Structure Learning in Absence of Adjacency Faithfulness
Abstract

Cited by 2 (1 self)
This paper presents an extension to the Conservative PC algorithm which is able to detect violations of adjacency faithfulness under causal sufficiency and triangle faithfulness. Violations can be characterized by pseudo-independent relations and equivalent edges, both generating a pattern of conditional independencies that cannot be modeled faithfully. Both cases lead to uncertainty about specific parts of the skeleton of the causal graph. This is modeled by an f-pattern. We prove that our Very Conservative PC algorithm is able to correctly learn the f-pattern. We argue that the solution also applies in the finite sample case if we accept that only strong edges can be identified. Experiments based on simulations show that the rate of false edge removals is significantly reduced, at the expense of uncertainty on the skeleton and a higher sensitivity to accidental correlations.
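As background for where adjacency-faithfulness violations bite, the standard PC skeleton step that this line of work extends removes an edge as soon as some conditioning set separates its endpoints. Below is a minimal sketch of that step with a hypothetical conditional-independence oracle (names and the toy chain A -> B -> C are illustrative assumptions, not the paper's Very Conservative PC algorithm); when the oracle is replaced by a finite-sample test, the pseudo-independent relations discussed above cause exactly these edge removals to misfire.

```python
import itertools

def learn_skeleton(nodes, indep):
    """PC-style skeleton search: start from the complete graph and delete the
    edge X-Y whenever some set S of current neighbours of X (of growing size)
    renders X and Y conditionally independent. `indep(x, y, S)` is assumed to
    be a conditional-independence oracle."""
    adj = {n: set(nodes) - {n} for n in nodes}
    sepset = {}
    depth = 0
    # Stop once no node has enough neighbours to supply a set of size `depth`.
    while any(len(adj[x] - {y}) >= depth for x in nodes for y in adj[x]):
        for x in nodes:
            for y in list(adj[x]):
                for S in itertools.combinations(adj[x] - {y}, depth):
                    if indep(x, y, set(S)):
                        adj[x].discard(y)
                        adj[y].discard(x)
                        sepset[frozenset((x, y))] = set(S)
                        break
        depth += 1
    return adj, sepset

# Oracle for the chain A -> B -> C: A and C are independent given B,
# and that is the only (conditional) independence.
def oracle(x, y, S):
    return frozenset((x, y)) == frozenset(("A", "C")) and "B" in S

adj, sep = learn_skeleton(["A", "B", "C"], oracle)
print(sorted(adj["B"]))                 # B stays adjacent to both A and C
print(sep[frozenset(("A", "C"))])       # the separating set found for A-C
```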
Information Fusion, Causal Probabilistic Network And Probanet II: Inference Algorithms and Probanet System
 Proc. 1st Intl. Workshop on Image Analysis and Information Fusion
, 1997
Abstract

Cited by 2 (2 self)
As an extension of an overview paper [Pan and McMichael, 1997] on information fusion and Causal Probabilistic Networks (CPN), this paper formalizes kernel algorithms for probabilistic inference on CPNs. Information fusion is realized by updating the joint probabilities of the variables upon the arrival of new evidence or new hypotheses. Kernel algorithms for the dominant inference methods are formalized from a discontiguous, mathematics-oriented literature, with gaps filled in with regard to computability and completeness. In particular, possible optimizations of the causal tree algorithm, graph triangulation and the junction tree algorithm are discussed. Probanet has been designed and developed as a generic shell, or mother system, for CPN construction and application. The design aspects and current status of Probanet are described. A few directions for research and system development are pointed out, including hierarchical structuring of networks, structure decomposition and adaptive inference algorithms. The paper thus integrates a literature review, algorithm formalization and future perspectives.
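The fusion step described here, updating joint probabilities when new evidence arrives, reduces to conditioning the joint and marginalising out the remaining variables. A brute-force enumeration sketch on a toy three-node network follows; the variable names and CPT numbers are illustrative assumptions, and the causal-tree and junction-tree algorithms the paper formalizes compute the same posteriors without ever materialising the full joint.

```python
import itertools

# Toy causal network A -> B -> C over binary variables.
# All conditional probability values are illustrative assumptions.
p_a = {1: 0.3, 0: 0.7}                    # P(A)
p_b_a = {1: 0.9, 0: 0.2}                  # P(B=1 | A)
p_c_b = {1: 0.8, 0: 0.1}                  # P(C=1 | B)

def joint(a, b, c):
    """Factorized joint P(A,B,C) = P(A) P(B|A) P(C|B)."""
    pb = p_b_a[a] if b == 1 else 1 - p_b_a[a]
    pc = p_c_b[b] if c == 1 else 1 - p_c_b[b]
    return p_a[a] * pb * pc

def posterior(query_var, evidence):
    """P(query_var = 1 | evidence) by conditioning the joint on the evidence
    and renormalising -- the 'update on new evidence' step done naively."""
    num = den = 0.0
    for a, b, c in itertools.product([0, 1], repeat=3):
        assign = {"A": a, "B": b, "C": c}
        if any(assign[v] != val for v, val in evidence.items()):
            continue  # inconsistent with the observed evidence
        p = joint(a, b, c)
        den += p
        if assign[query_var] == 1:
            num += p
    return num / den

# Diagnostic query: belief in the root cause A after observing C = 1.
print(round(posterior("A", {"C": 1}), 4))  # ~ 0.566
```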
Conservative Independence-Based Causal Structure Learning in Absence of Adjacency Faithfulness
Abstract

Cited by 1 (1 self)
This paper presents an extension to the Conservative PC algorithm which is able to detect violations of adjacency faithfulness under causal sufficiency and triangle faithfulness. Violations can be characterized by pseudo-independent relations and equivalent edges, both generating a pattern of conditional independencies that cannot be modeled faithfully. Both cases lead to uncertainty about specific parts of the skeleton of the causal graph. These ambiguities are modeled by an f-pattern. We prove that our Adjacency Conservative PC algorithm is able to correctly learn the f-pattern. We argue that the solution also applies in the finite sample case if we accept that only strong edges can be identified. Experiments based on simulations and the ALARM benchmark model show that the rate of false edge removals is significantly reduced, at the expense of uncertainty on the skeleton and a higher sensitivity to accidental correlations.
Challenges in the analysis of mass-throughput data: A technical commentary from the statistical machine learning perspective
 CANCER INFORMATICS
, 2006
Abstract

Cited by 1 (0 self)
Sound data analysis is critical to the success of modern molecular medicine research that involves collection and interpretation of mass-throughput data. The novel nature and high dimensionality of such datasets pose a series of non-trivial data analysis problems. This technical commentary discusses the problems of overfitting, error estimation, the curse of dimensionality, causal versus predictive modeling, integration of heterogeneous types of data, and lack of standard protocols for data analysis. We attempt to shed light on the nature and causes of these problems and to outline viable methodological approaches to overcome them.
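The overfitting and error-estimation pitfalls this commentary discusses are easy to reproduce. The sketch below is purely synthetic and illustrative: a 1-nearest-neighbour classifier is fit to random labels in a high-dimensional, small-sample setting typical of mass-throughput data, where resubstitution (training) error looks perfect while leave-one-out estimation reveals chance-level performance.

```python
import random

random.seed(0)

# Synthetic "mass-throughput" setting: few samples, many features, and labels
# generated independently of the features (an assumption made for illustration,
# so any apparent signal is overfitting).
n, d = 40, 500
X = [[random.gauss(0, 1) for _ in range(d)] for _ in range(n)]
y = [random.randint(0, 1) for _ in range(n)]

def nn_predict(train_X, train_y, x):
    """1-nearest-neighbour prediction under squared Euclidean distance."""
    best = min(range(len(train_X)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(train_X[i], x)))
    return train_y[best]

# Resubstitution error: each point is its own nearest neighbour, so this is 0.
train_err = sum(nn_predict(X, y, X[i]) != y[i] for i in range(n)) / n

# Leave-one-out error: the honest estimate, near chance on random labels.
loo_err = sum(
    nn_predict(X[:i] + X[i + 1:], y[:i] + y[i + 1:], X[i]) != y[i]
    for i in range(n)) / n

print(f"training error: {train_err:.2f}")      # 0.00 -- looks perfect
print(f"leave-one-out error: {loo_err:.2f}")   # much higher: no real signal
```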
Reading Dependencies from Polytree-Like Bayesian Networks Revisited
Abstract

Cited by 1 (1 self)
We present a graphical criterion for reading dependencies from the minimal directed independence map G of a graphoid p, under the assumption that G is a polytree and p satisfies weak transitivity. We prove that the criterion is sound and complete. We argue that assuming weak transitivity is not too restrictive.
Causal Conclusions that Flip Repeatedly and Their Justification
Abstract
Over the past two decades, several consistent procedures have been designed to infer causal conclusions from observational data. We prove that if the true causal network might be an arbitrary, linear Gaussian network or a discrete Bayes network, then every unambiguous causal conclusion produced by a consistent method from non-experimental data is subject to reversal as the sample size increases any finite number of times. That result, called the causal flipping theorem, extends prior results to the effect that causal discovery cannot be reliable on a given sample size. We argue that since repeated flipping of causal conclusions is unavoidable in principle for consistent methods, the best possible discovery methods are consistent methods that retract their earlier conclusions no more than necessary. A series of simulations of various methods across a wide range of sample sizes illustrates concretely both the theorem and the principle of comparing methods in terms of retractions.