Results 1 – 5 of 5
Learning Bayesian Networks from Data: An Information-Theory Based Approach
"... This paper provides algorithms that use an informationtheoretic analysis to learn Bayesian network structures from data. Based on our threephase learning framework, we develop efficient algorithms that can effectively learn Bayesian networks, requiring only polynomial numbers of conditional indepe ..."
Abstract

Cited by 93 (5 self)
This paper provides algorithms that use an information-theoretic analysis to learn Bayesian network structures from data. Based on our three-phase learning framework, we develop efficient algorithms that can effectively learn Bayesian networks, requiring only a polynomial number of conditional independence (CI) tests in typical cases. We provide precise conditions that specify when these algorithms are guaranteed to be correct, as well as empirical evidence (from real-world applications and simulation tests) demonstrating that these systems work efficiently and reliably in practice.
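The conditional independence (CI) tests this abstract refers to can be grounded in conditional mutual information: two variables X and Y are judged independent given a conditioning set Z when the estimated I(X; Y | Z) falls below a small threshold. The sketch below is an illustration of that general idea only; the function names, data layout, and threshold value are assumptions, not the paper's code.

```python
# Minimal sketch of an information-theoretic CI test on discrete data.
# Records are dicts mapping variable names to discrete values.
from collections import Counter
from math import log2

def conditional_mutual_information(data, x, y, z=()):
    """Estimate I(X; Y | Z) in bits from a list of discrete records."""
    n = len(data)
    joint = Counter((r[x], r[y], tuple(r[k] for k in z)) for r in data)
    xz = Counter((r[x], tuple(r[k] for k in z)) for r in data)
    yz = Counter((r[y], tuple(r[k] for k in z)) for r in data)
    zc = Counter(tuple(r[k] for k in z) for r in data)
    cmi = 0.0
    for (xv, yv, zv), c in joint.items():
        # p(x,y,z) * log [ p(x,y,z) p(z) / (p(x,z) p(y,z)) ], n's cancel
        cmi += (c / n) * log2((c * zc[zv]) / (xz[(xv, zv)] * yz[(yv, zv)]))
    return cmi

def ci_test(data, x, y, z=(), threshold=0.01):
    """Declare X and Y independent given Z when the estimated CMI is tiny."""
    return conditional_mutual_information(data, x, y, z) < threshold
```

With a perfectly uniform sample over independent bits the estimate is exactly 0; when Y is a copy of X it is 1 bit, so the threshold cleanly separates the two cases.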
Learning Bayesian Networks from Data: An Efficient Approach Based on Information Theory
, 1997
"... This paper addresses the problem of learning Bayesian network structures from data by using an information theoretic dependency analysis approach. Based on our threephase construction mechanism, two efficient algorithms have been developed. One of our algorithms deals with a special case where the ..."
Abstract

Cited by 35 (0 self)
This paper addresses the problem of learning Bayesian network structures from data by using an information-theoretic dependency analysis approach. Based on our three-phase construction mechanism, two efficient algorithms have been developed. One of our algorithms deals with a special case where the node ordering is given; it requires only O(N^2) CI tests and is correct given that the underlying model is DAG-Faithful [Spirtes et al., 1996]. The other algorithm deals with the general case and requires O(N^4) conditional independence (CI) tests. It is correct given that the underlying model is monotone DAG-Faithful (see Section 4.4). A system based on these algorithms has been developed and distributed through the Internet. The empirical results show that our approach is efficient and reliable.

1 Introduction
The Bayesian network is a powerful knowledge representation and reasoning tool under conditions of uncertainty. A Bayesian network is a directed acyclic graph ...
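The O(N^2) bound for the ordered special case has a simple intuition: with the node ordering fixed, each of the N(N-1)/2 candidate edges can be decided with one CI test against the other predecessors. The sketch below illustrates that counting argument with an assumed CI oracle; it is a toy illustration of the complexity claim, not the paper's actual three-phase algorithm.

```python
# Toy illustration: with a known node ordering, one CI test per
# candidate edge suffices, giving O(N^2) tests in total.
def learn_with_ordering(nodes, ci):
    """nodes: list in causal order; ci(x, y, z) -> True iff x is
    independent of y given conditioning set z (assumed oracle)."""
    parents = {v: set() for v in nodes}
    tests = 0
    for j, y in enumerate(nodes):
        for x in nodes[:j]:
            others = set(nodes[:j]) - {x}       # condition on other predecessors
            tests += 1
            if not ci(x, y, others):            # dependence survives -> keep edge
                parents[y].add(x)
    return parents, tests

# Hand-coded d-separation oracle for the chain A -> B -> C:
# only A and C become independent once B is in the conditioning set.
def chain_oracle(x, y, z):
    return {x, y} == {"A", "C"} and "B" in z
```

Running `learn_with_ordering(["A", "B", "C"], chain_oracle)` recovers the chain with exactly 3 = N(N-1)/2 tests.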
A new approach for learning belief networks using independence criteria
 International Journal of Approximate Reasoning
, 2000
"... q ..."
An Algorithm for Finding Minimum d-Separating Sets in Belief Networks
 Proceedings of the Twelfth Conference on Uncertainty in Artificial Intelligence
, 1996
"... The criterion commonly used in directed acyclic graphs (dags) for testing graphical independence is the wellknown dseparation criterion. It allows us to build graphical representations of dependency models (usually probabilistic dependency models) in the form of belief networks, which make possibl ..."
Abstract

Cited by 14 (4 self)
The criterion commonly used in directed acyclic graphs (DAGs) for testing graphical independence is the well-known d-separation criterion. It allows us to build graphical representations of dependency models (usually probabilistic dependency models) in the form of belief networks, which make possible an easy interpretation and management of independence relationships, without reference to numerical parameters (conditional probabilities). In this paper we study the following combinatorial problem: to find the minimum d-separating set for two nodes in a DAG. This set would represent the minimum information necessary to prevent these two nodes from influencing each other. The solution of this basic problem and of some of its extensions can be useful in several ways, as we will see later. Our solution is based on a two-step process: first, we reduce the original problem to the simpler one of finding a minimum separating set in an undirected graph, and second, we develop an algorithm for solving ...
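The reduction this abstract describes is the standard one: restrict the DAG to the ancestors of the two query nodes, then "moralize" it by marrying co-parents and dropping edge directions, after which d-separation coincides with ordinary vertex separation in the resulting undirected graph. The sketch below illustrates that first step under an assumed parent-set representation; it is not the authors' code, and the second step (a minimum vertex cut, typically via max-flow) is omitted.

```python
# Sketch of the ancestral-moralization step that reduces minimum
# d-separation in a DAG to minimum vertex separation in an undirected graph.
def ancestors(dag, targets):
    """dag: {node: set of parents}; returns targets plus all ancestors."""
    seen, stack = set(), list(targets)
    while stack:
        v = stack.pop()
        if v not in seen:
            seen.add(v)
            stack.extend(dag.get(v, ()))
    return seen

def moralize(dag, x, y):
    """Undirected moral graph of the ancestral subgraph of {x, y}."""
    keep = ancestors(dag, [x, y])
    und = {v: set() for v in keep}
    for v in keep:
        ps = [p for p in dag.get(v, ()) if p in keep]
        for p in ps:                          # drop edge directions
            und[v].add(p); und[p].add(v)
        for i in range(len(ps)):              # marry co-parents of v
            for j in range(i + 1, len(ps)):
                und[ps[i]].add(ps[j]); und[ps[j]].add(ps[i])
    return und
```

For example, in the DAG A -> C <- B, C -> D, moralizing for the pair (A, D) adds the moral edge A–B, reflecting that conditioning on the collider's descendant can open the A–B path.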
A Deterministic Annealing Approach to Learning Bayesian Networks
"... Editor: Graphical Models bring together two different mathematical areas: graph theory and probability theory. Recent years have witnessed an increase in the significance of the role played by Graphical Models in solving several machine learning problems. Graphical Models can be either directed or u ..."
Abstract
Graphical Models bring together two different mathematical areas: graph theory and probability theory. Recent years have witnessed an increase in the significance of the role played by Graphical Models in solving several machine learning problems. Graphical Models can be either directed or undirected; directed Graphical Models are also called Bayesian Networks. The manual construction of Bayesian Networks is usually time-consuming and error-prone. Therefore, there has been significant interest in algorithms for the automatic induction of Bayesian Network structures from data. This paper presents a new method for the induction of Bayesian Network structures. The proposed method uses the concept of deterministic annealing to propose an iterative search-and-score learning algorithm that utilizes a global optimization technique. Deterministic annealing is a global optimization technique originally used for clustering, regression, and similar optimization problems. The experimental results show that the proposed approach achieves very promising results compared to other structure learning approaches.
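The core deterministic-annealing idea this abstract invokes can be shown in isolation: instead of greedily committing to the best-scoring candidate at every search step, the method weights candidates under a Gibbs distribution whose temperature T is gradually lowered, so early iterations explore broadly and late iterations converge on the maximizer. The toy scores and schedule below are assumptions for illustration, not the paper's scoring function.

```python
# Minimal sketch of the temperature-controlled soft selection at the
# heart of deterministic annealing.
from math import exp

def gibbs_weights(scores, T):
    """Soft assignment over candidates; collapses to argmax as T -> 0."""
    m = max(scores)                       # subtract max for numerical safety
    w = [exp((s - m) / T) for s in scores]
    z = sum(w)
    return [x / z for x in w]

# Cooling schedule: the distribution sharpens as T shrinks.
for T in (10.0, 1.0, 0.01):
    weights = gibbs_weights([1.0, 2.0, 3.0], T)
```

At high temperature the three candidates receive nearly equal weight; at T = 0.01 virtually all mass sits on the best score, mimicking the transition from exploration to greedy search.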