Results 1 – 5 of 5
A Score Based Ranking of the Edges for the PC Algorithm (2008)
in Proceedings of the Fourth European Workshop on Probabilistic Graphical Models
Abstract

Cited by 2 (0 self)
The result of applying the PC learning algorithm can depend on the order in which independence tests are carried out. Even if these tests are ordered by increasing size of the conditioning sets, the PC algorithm does not take into account which edges are weaker, so that they can be considered for removal before the stronger edges. This paper proposes a new learning algorithm which scores the edges according to a Bayesian metric and adds them to the final graph according to this score. Then, conditional independence tests are carried out to remove edges as in the PC algorithm. This algorithm is also hybridized with a variation of the PC algorithm that determines minimum-size cut sets between two nodes to decide on the deletion of an edge. Experiments are carried out to evaluate the performance of the new proposals against the PC algorithm.
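The abstract above concerns the skeleton phase of the PC algorithm, in which edges of an initially complete graph are removed by conditional independence tests of increasing conditioning-set size. As a minimal, stdlib-only sketch of that baseline phase (the oracle, node names, and function name are illustrative and not taken from the paper, whose contribution is the score-based edge ordering on top of this):

```python
from itertools import combinations

def pc_skeleton(nodes, ci_oracle, max_cond=2):
    """Skeleton phase of the PC algorithm (simplified sketch).

    ci_oracle(x, y, S) should return True when x and y are judged
    independent given the conditioning set S.  Edges are tested with
    conditioning sets of increasing size, as in standard PC; note that
    the result can depend on the order of the tests, which is the
    issue the scored variant in the abstract addresses.
    """
    adj = {v: set(nodes) - {v} for v in nodes}   # start from the complete graph
    sepset = {}
    for size in range(max_cond + 1):
        for x in nodes:
            for y in sorted(adj[x]):
                # candidate conditioning sets: neighbours of x, minus y
                others = sorted(adj[x] - {y})
                for S in combinations(others, size):
                    if ci_oracle(x, y, set(S)):
                        adj[x].discard(y)
                        adj[y].discard(x)
                        sepset[frozenset((x, y))] = set(S)
                        break
    return adj, sepset
```

For example, with an oracle encoding a chain A – B – C (A independent of C given B), the skeleton A – B – C is recovered and the edge A – C is removed with separating set {B}.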
Constraint relaxation for learning the structure of Bayesian networks (2009)
Abstract

Cited by 1 (1 self)
This paper introduces constraint relaxation, a new strategy for learning the structure of Bayesian networks. Constraint relaxation identifies and “relaxes” possibly inaccurate independence constraints on the structure of the model. We describe a heuristic algorithm for constraint relaxation that combines greedy search in the space of undirected skeletons with edge orientation based on the constraints. This approach produces significant improvements in the structural accuracy of the learned models compared to four well-known structure learning algorithms in an empirical evaluation using data sampled from both real-world and randomly generated networks.
Conservative Independence-Based Causal Structure Learning in Absence of Adjacency Faithfulness
Abstract

Cited by 1 (1 self)
This paper presents an extension to the Conservative PC algorithm which is able to detect violations of adjacency faithfulness under causal sufficiency and triangle faithfulness. Violations can be characterized by pseudo-independent relations and equivalent edges, both generating a pattern of conditional independencies that cannot be modeled faithfully. Both cases lead to uncertainty about specific parts of the skeleton of the causal graph. These ambiguities are modeled by an f-pattern. We prove that our Adjacency Conservative PC algorithm is able to correctly learn the f-pattern. We argue that the solution also applies in the finite sample case if we accept that only strong edges can be identified. Experiments based on simulations and the ALARM benchmark model show that the rate of false edge removals is significantly reduced, at the expense of uncertainty on the skeleton and a higher sensitivity to accidental correlations.
Improving Accuracy of Constraint-Based Structure Learning
Abstract
Hybrid algorithms for learning the structure of Bayesian networks combine techniques from both the constraint-based and search-and-score paradigms of structure learning. One class of hybrid approaches uses a constraint-based algorithm to learn an undirected skeleton identifying edges that should appear in the final network. This skeleton is used to constrain the model space considered by a search-and-score algorithm to orient the edges and produce a final model structure. At small sample sizes, models learned using this hybrid approach do not achieve likelihood as high as models learned by unconstrained search. Low performance is a result of errors made by the skeleton identification algorithm, particularly false negative errors, which lead to an over-constrained search space. These errors are often attributed to “noisy” hypothesis tests that are run during skeleton identification. However, at least three specific sources of error have been identified in the literature: unsuitable hypothesis tests, low-power hypothesis tests, and unexplained d-separation. No previous work has considered these sources of error in combination. We determine the relative importance of each source individually and in combination. We identify that low-power tests are the primary source of false negative errors, and show that these errors can be corrected by a novel application of statistical power analysis. The result is a new hybrid algorithm for learning the structure of Bayesian networks which produces models with equivalent likelihood to models produced by unconstrained greedy search, using only a fraction of the time.
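The abstract above identifies low-power independence tests as the primary source of false edge removals and proposes statistical power analysis as a correction. As a rough, stdlib-only illustration of such an analysis (this sketch uses the standard Fisher-z test for partial correlation; it is not the paper's specific procedure, and all function names are illustrative):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function (stdlib only)."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_ppf(p):
    """Inverse standard normal CDF by bisection (accurate enough here)."""
    lo, hi = -10.0, 10.0
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def fisher_z_power(r, n, cond_size=0, alpha=0.05):
    """Power of the two-sided Fisher-z test of (partial) correlation r
    with n samples and a conditioning set of the given size."""
    se = 1.0 / math.sqrt(n - cond_size - 3)   # SE of atanh(r_hat)
    z_crit = norm_ppf(1 - alpha / 2)
    shift = math.atanh(r) / se                # standardized effect under H1
    return (1 - norm_cdf(z_crit - shift)) + norm_cdf(-z_crit - shift)

def required_n(r, power=0.8, cond_size=0, alpha=0.05):
    """Smallest sample size whose test power reaches the target."""
    n = cond_size + 5   # Fisher-z requires n > cond_size + 3
    while fisher_z_power(r, n, cond_size, alpha) < power:
        n += 1
    return n
```

In this setting, an edge whose true (partial) correlation is weak needs far more samples to be tested reliably than a strong one, which is why under-powered tests at small sample sizes tend to produce the false negative removals described in the abstract.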