The max-min hill-climbing Bayesian network structure learning algorithm
Machine Learning, 2006
Abstract

Cited by 76 (7 self)
Abstract. We present a new algorithm for Bayesian network structure learning, called Max-Min Hill-Climbing (MMHC). The algorithm combines ideas from local learning, constraint-based, and search-and-score techniques in a principled and effective way. It first reconstructs the skeleton of a Bayesian network and then performs a Bayesian-scoring greedy hill-climbing search to orient the edges. In our extensive empirical evaluation MMHC outperforms, on average and in terms of various metrics, several prototypical and state-of-the-art algorithms, namely the PC, Sparse Candidate, Three Phase Dependency Analysis, Optimal Reinsertion, Greedy Equivalence Search, and Greedy Search. These are the first empirical results simultaneously comparing most of the major Bayesian network algorithms against each other. MMHC offers certain theoretical advantages, specifically over the Sparse Candidate algorithm, corroborated by our experiments. MMHC and detailed results of our study are publicly available at
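The two-phase scheme this abstract describes (learn a skeleton, then greedy hill-climb to orient edges) can be sketched as below. The `skeleton` input, the abstract `score` callback, and the add/delete/reverse move set are illustrative assumptions; the paper's actual algorithm builds the skeleton with its max-min parents-and-children procedure and scores with BDeu, neither of which is reproduced here.

```python
from itertools import permutations

def has_path(adj, src, dst):
    """DFS check: is dst reachable from src in the directed graph?"""
    stack, seen = [src], set()
    while stack:
        node = stack.pop()
        if node == dst:
            return True
        if node in seen:
            continue
        seen.add(node)
        stack.extend(adj.get(node, ()))
    return False

def mmhc_search(variables, skeleton, score):
    """Greedy hill-climbing over DAGs, restricted to skeleton edges.

    `skeleton` is a set of frozenset pairs {x, y}; `score` maps a set of
    directed edges to a number to be maximized (a stand-in for BDeu).
    """
    edges = set()
    improved = True
    while improved:
        improved = False
        best_move, best_gain = None, 0.0
        for x, y in permutations(variables, 2):
            if frozenset((x, y)) not in skeleton:
                continue  # only orient edges found in the skeleton phase
            if (x, y) in edges:
                candidate = edges - {(x, y)}              # deletion
            elif (y, x) in edges:
                candidate = edges - {(y, x)} | {(x, y)}   # reversal
            else:
                candidate = edges | {(x, y)}              # addition
            # reject moves that create a directed cycle
            adj = {}
            for a, b in candidate:
                adj.setdefault(a, []).append(b)
            if any(has_path(adj, b, a) for a, b in candidate):
                continue
            gain = score(candidate) - score(edges)
            if gain > best_gain:
                best_move, best_gain = candidate, gain
        if best_move is not None:
            edges, improved = best_move, True
    return edges
```

With a toy score rewarding the edges of a chain A -> B -> C and a matching skeleton, the search recovers exactly that chain.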
Bayesian Network Structure Learning by Recursive Autonomy Identification, Raanan Yehezkel, Video Analytics Group
Abstract

Cited by 3 (2 self)
We propose the recursive autonomy identification (RAI) algorithm for constraint-based (CB) Bayesian network structure learning. The RAI algorithm learns the structure by sequential application of conditional independence (CI) tests, edge direction, and structure decomposition into autonomous sub-structures. This sequence of operations is performed recursively for each autonomous sub-structure while the order of the CI tests is increased. Whereas other CB algorithms first d-separate structures and then orient the resulting undirected graph, the RAI algorithm combines the two processes from the outset and throughout the procedure. By this means, and thanks to structure decomposition, learning a structure using RAI requires a smaller number of high-order CI tests. This reduces the complexity and run time of the algorithm and increases its accuracy by mitigating the curse of dimensionality. When the RAI algorithm learned structures from databases representing synthetic problems, known networks, and natural problems, it demonstrated superiority in computational complexity, run time, structural correctness, and classification accuracy over the
A New Hybrid Method for Bayesian Network Learning With Dependency Constraints
Abstract

Cited by 2 (1 self)
Abstract — A Bayes net has qualitative and quantitative aspects: the qualitative aspect is its graphical structure, which corresponds to correlations among the variables in the Bayes net; the quantitative aspects are the net parameters. This paper develops a hybrid criterion for learning Bayes net structures that is based on both aspects. We combine model selection criteria measuring data fit with correlation information from statistical tests: given a sample d, search for a structure G that maximizes score(G, d) over the set of structures G that satisfy the dependencies detected in d. We rely on the statistical test only to accept conditional dependencies, not conditional independencies. We show how to adapt local search algorithms to accommodate the observed dependencies. Simulation studies with GES search and the BDeu/BIC scores provide evidence that the additional dependency information leads to Bayes nets that better fit the target model in distribution and structure.
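The constraint "maximize score(G, d) over structures satisfying the detected dependencies" can be sketched as a local search that simply rejects any move breaking a required adjacency. This is a caricature under stated assumptions: it searches over undirected adjacency sets with a generic `score`, whereas the paper adapts GES with BDeu/BIC over directed structures.

```python
from itertools import combinations

def dependency_constrained_climb(variables, score, required):
    """Greedy local search over undirected edge sets that never violates
    a detected dependency: every pair in `required` must stay adjacent.

    `score` maps an edge set (set of frozenset pairs) to a number to be
    maximized; `required` holds the pairs a statistical test flagged as
    dependent.
    """
    edges = set(required)        # start inside the constrained space
    improved = True
    while improved:
        improved = False
        for pair in map(frozenset, combinations(variables, 2)):
            cand = edges - {pair} if pair in edges else edges | {pair}
            if not required <= cand:
                continue         # move would break a detected dependency
            if score(cand) > score(edges):
                edges, improved = cand, True
    return edges
```

With a score that penalizes every edge, the search still keeps the required ones: rejecting the constraint-violating deletion is exactly what confines it to the constrained space.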
Learning the Tree Augmented Naive Bayes Classifier from incomplete datasets
Abstract

Cited by 1 (0 self)
The Bayesian network formalism is becoming increasingly popular in many areas such as decision aid and diagnosis, in particular thanks to its inference capabilities, even when data are incomplete. For classification tasks, Naive Bayes and Augmented Naive Bayes classifiers have shown excellent performance. Learning a Naive Bayes classifier from incomplete datasets is not difficult, as only parameter learning has to be performed. However, few methods exist to efficiently learn Tree Augmented Naive Bayes classifiers from incomplete datasets. In this paper, we take up the structural EM algorithm principle introduced by Friedman (1997) and propose an algorithm to answer this question.
BAYESIAN NETWORK STRUCTURAL LEARNING AND INCOMPLETE DATA
Abstract
The Bayesian network formalism is becoming increasingly popular in many areas such as decision aid, diagnosis, and complex systems control, in particular thanks to its inference capabilities, even when data are incomplete. Moreover, estimating the parameters of a fixed-structure Bayesian network is easy. However, very few methods are capable of using incomplete cases as a basis for determining the structure of a Bayesian network. In this paper, we take up the structural EM algorithm principle [9, 10] and propose an algorithm that extends the Maximum Weight Spanning Tree algorithm to deal with incomplete data. We also propose to use this extension either to (1) speed up the structural EM algorithm or (2) extend the Tree Augmented Naive Bayes classifier to deal with incomplete data in classification tasks.
Does Query-Based Diagnostics Work?
Abstract
Query-based diagnostics (Agosta, Gardos, & Druzdzel, 2008) offers passive, incremental construction of diagnostic models that rests on the interaction between a diagnostician and a computer-based diagnostic system. Effectively, this approach minimizes knowledge engineering, the main bottleneck in practical application of Bayesian networks. While this idea is appealing, it has undergone only limited testing in practice. We describe a series of experiments that subject a prototype implementing passive, incremental model construction to a rigorous practical test. We show that the prototype's diagnostic accuracy reaches reasonable levels after merely tens of cases and continues to increase with the number of cases, comparing favorably to state-of-the-art approaches based on learning.
Bayesian Network and Variable Elimination Algorithm for Reasoning under Uncertainty
Abstract
Abstract A common task for a Bayesian network is to perform inference, computing various probabilities of interest from the model. We use an algorithm to construct a Bayesian network from data drawn from several sources such as Oracle, Access, and Excel, and the variable elimination algorithm to answer probabilistic queries with respect to the Bayesian network. Our algorithm makes use of the XML Bayesian Interchange Format to support portability of the constructed network between modules of the program. The algorithm runs in time and space exponential in the treewidth of the network. The variable elimination algorithm acts on a set of factors. Each factor involves a set of variables, and each node in a Bayesian network is equipped with a conditional probability function that expresses the likelihood that the node will take on different values given the values of its parents. The initial set of factors consists of the network's conditional probability distributions (tables), and the distributions constructed during variable elimination are themselves represented as factors.
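The factor operations this abstract refers to (pointwise multiplication, and summing a variable out) are the whole of variable elimination. A minimal sketch for binary variables follows; the `(variables, table)` factor representation is our own illustrative choice, not the paper's data structure.

```python
from itertools import product

def multiply(f, g):
    """Pointwise product of two factors over the union of their variables.
    A factor is (vars, table): a tuple of names plus {assignment: value},
    with every variable assumed binary (values 0/1)."""
    fv, ft = f
    gv, gt = g
    variables = fv + tuple(v for v in gv if v not in fv)
    table = {}
    for assign in product((0, 1), repeat=len(variables)):
        env = dict(zip(variables, assign))
        table[assign] = (ft[tuple(env[v] for v in fv)]
                         * gt[tuple(env[v] for v in gv)])
    return variables, table

def sum_out(f, var):
    """Marginalize `var` out of factor f by summing over its values."""
    fv, ft = f
    keep = tuple(v for v in fv if v != var)
    table = {}
    for assign, val in ft.items():
        key = tuple(a for v, a in zip(fv, assign) if v != var)
        table[key] = table.get(key, 0.0) + val
    return keep, table

def eliminate(factors, order):
    """Variable elimination: for each variable in `order`, multiply the
    factors mentioning it and sum it out; the surviving factor, once
    normalized, is the posterior over the remaining (query) variables."""
    factors = list(factors)
    for var in order:
        bucket = [f for f in factors if var in f[0]]
        factors = [f for f in factors if var not in f[0]]
        prod = bucket[0]
        for f in bucket[1:]:
            prod = multiply(prod, f)
        factors.append(sum_out(prod, var))
    result = factors[0]
    for f in factors[1:]:
        result = multiply(result, f)
    total = sum(result[1].values())
    return result[0], {a: v / total for a, v in result[1].items()}
```

For a two-node network A -> B with P(A=1) = 0.3, P(B=1|A=1) = 0.9, and P(B=1|A=0) = 0.2, eliminating A yields P(B=1) = 0.3*0.9 + 0.7*0.2 = 0.41.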
A novel scalable and correct Markov boundary learning algorithm under faithfulness condition
Abstract
In this paper, we propose a novel constraint-based Markov boundary discovery algorithm, called MBOR, that scales up to hundreds of thousands of variables. Its correctness under the faithfulness condition is guaranteed. A thorough empirical evaluation of MBOR's robustness, efficiency, and scalability is provided on synthetic databases involving thousands of variables. Our experimental results show a clear benefit in several situations: large Markov boundaries, weak associations, and approximate functional dependencies among the variables.
Complex Activity Recognition using Granger Constrained Dynamic Bayesian Network
Abstract
Many scenes in surveillance, sports, and other video domains involve complex multi-agent activities where the agents coexist and interact in a time-varying manner. For example, in the surveillance domain one person may open a door of a vehicle so another person can load an object before they both enter the vehicle. Similarly, team sports involve multiple players acting in a coordinated manner. Our goal is to model and recognize such coordinated activities in video by capturing the most discriminative Granger-causal relationships between pairs of time sequences extracted from event-clusters. An activity is represented as a collection of event-clusters that can be instantaneous or occur over a period of time. Loosely speaking, Granger causality [1] is an explicit measure of one temporal sequence's influence on another and is therefore ideal for explicitly capturing the causal relationships between agents. The overall training approach is shown in Figure 1, where the feature data from the activity classes are automatically clustered using a hierarchical divisive clustering algorithm. Activity profiles are then extracted from each event-cluster by
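Granger causality, as used above, asks whether the lagged history of one sequence improves one-step prediction of another beyond the target's own history. A self-contained single-lag sketch with ordinary least squares is below; the feature extraction, clustering, and the actual statistic used in the paper are not reproduced, and the hand-rolled OLS is purely for illustration.

```python
def ols_rss(X, y):
    """Residual sum of squares of the least-squares fit y ~ X (with
    intercept), solving the normal equations by Gauss-Jordan elimination."""
    n, k = len(X), len(X[0]) + 1
    A = [[1.0] + list(row) for row in X]            # prepend intercept
    M = [[sum(A[i][r] * A[i][c] for i in range(n)) for c in range(k)]
         + [sum(A[i][r] * y[i] for i in range(n))] for r in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]             # partial pivoting
        for r in range(k):
            if r != col and M[col][col]:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    beta = [M[r][k] / M[r][r] for r in range(k)]
    return sum((y[i] - sum(b * a for b, a in zip(beta, A[i]))) ** 2
               for i in range(n))

def granger_gain(x, y):
    """Fraction of the restricted model's residual sum of squares that
    disappears when the lagged x is added as a predictor of y."""
    y_t = y[1:]          # targets
    y_l = y[:-1]         # own lag (restricted model)
    x_l = x[:-1]         # candidate cause's lag (added in full model)
    rss_restricted = ols_rss([[a] for a in y_l], y_t)
    rss_full = ols_rss([[a, b] for a, b in zip(y_l, x_l)], y_t)
    return 1.0 - rss_full / rss_restricted
```

When y is exactly the one-step lag of x, the full model fits perfectly and the gain approaches 1; two unrelated sequences would give a gain near 0.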
Exploring Causal Relationships with Streaming Features (doi:10.1093/comjnl/bxs032)
Abstract
Causal discovery is highly desirable in science and technology. In this paper, we study a new research problem: the discovery of causal relationships in the context of streaming features, where the features stream in one by one. Using a Bayesian network to represent causal relationships, we propose a novel algorithm called causal discovery from streaming features (CDFSF), which consists of a two-phase scheme. In the first phase, CDFSF dynamically discovers causal relationships between each feature seen so far and an arriving feature, while in the second phase CDFSF removes the false positives of each arrived feature from its current set of direct causes and effects. To improve the efficiency of CDFSF, we exploit the symmetry between parents (causes) and children (effects) in a faithful Bayesian network to derive a variant of CDFSF, S-CDFSF. Experimental results validate our algorithms in comparison with existing algorithms for causal relationship discovery.
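The two-phase scheme described above can be sketched with a conditional-independence oracle standing in for the paper's CI tests. The exhaustive subset search in the pruning phase is a naive stand-in for illustration only; the actual CDFSF is more careful about test cost.

```python
from itertools import combinations

def stream_causal_discovery(stream, independent):
    """Two-phase streaming sketch in the spirit of CDFSF.

    `independent(x, y, cond)` is a CI-test oracle. Phase 1 tentatively
    links an arriving feature to every earlier feature it depends on;
    phase 2 prunes a link when some conditioning subset of the arriving
    feature's other neighbors renders the pair independent.
    """
    seen = []
    neighbors = {}       # feature -> set of candidate causes/effects
    for f in stream:
        neighbors[f] = set()
        # phase 1: tentative links to dependent earlier features
        for g in seen:
            if not independent(f, g, frozenset()):
                neighbors[f].add(g)
                neighbors[g].add(f)
        seen.append(f)
        # phase 2: prune false positives of the arriving feature
        for g in list(neighbors[f]):
            others = neighbors[f] - {g}
            for r in range(len(others) + 1):
                if any(independent(f, g, frozenset(s))
                       for s in combinations(others, r)):
                    neighbors[f].discard(g)
                    neighbors[g].discard(f)
                    break
    return neighbors
```

On a chain X -> Y -> Z, the spurious X-Z link created in phase 1 (X and Z are marginally dependent) is removed in phase 2 once Y is available to condition on.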