### Table 1. Performance of Bayesian Belief Network

2005

"... In PAGE 15: ....3.1. Modeling IDS Using Bayesian Classifier Furthermore, Bayesian network classifier is constructed using the training data and then the classifier is used on the test data set to classify the data as an attack or normal. Table1 depicts the performance of Bayesian belief network by using the original 41 variable data set and the 17 variables reduced data set. The training and testing times for each classifier are decreased when 17 variable data set is used.... ..."

Cited by 2
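The excerpt above reports only that training and testing times drop when the feature set shrinks from 41 to 17 variables. A rough way to reproduce that kind of timing comparison is sketched below; this is not the paper's code — a hand-rolled Gaussian naive Bayes stands in for the Bayesian belief network, and the synthetic data stands in for the 41-feature intrusion data set (both assumptions):

```python
import time
import numpy as np

def fit_gnb(X, y):
    """Per-class mean, variance, and prior for a Gaussian naive Bayes model."""
    return {c: (X[y == c].mean(axis=0),
                X[y == c].var(axis=0) + 1e-9,   # smoothing avoids zero variance
                np.mean(y == c))
            for c in np.unique(y)}

def predict_gnb(params, X):
    """Pick the class with the highest Gaussian log-likelihood plus log-prior."""
    classes = list(params)
    scores = [(-0.5 * (np.log(2 * np.pi * v) + (X - m) ** 2 / v)).sum(axis=1)
              + np.log(p)
              for m, v, p in params.values()]
    return np.array(classes)[np.argmax(np.stack(scores), axis=0)]

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 41))              # synthetic stand-in for 41 features
y = (X[:, 0] + X[:, 1] > 0).astype(int)      # synthetic attack/normal label
reduced = list(range(17))                    # placeholder for a real feature-selection step

timings = {}
for name, cols in [("41 variables", list(range(41))), ("17 variables", reduced)]:
    t0 = time.perf_counter()
    params = fit_gnb(X[:, cols], y)
    train_t = time.perf_counter() - t0
    t0 = time.perf_counter()
    predict_gnb(params, X[:, cols])
    test_t = time.perf_counter() - t0
    timings[name] = (train_t, test_t)
    print(f"{name}: train {train_t:.4f}s, test {test_t:.4f}s")
```

With fewer columns there is simply less arithmetic per row, which is the effect the excerpt describes; the actual speedup reported in the paper depends on its classifier and data.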

### Table 1. Performance of Bayesian Belief Network

2004

"... In PAGE 4: ... Further Bayesian network classifier is constructed using the training data and then the classifier is used on the test data set to classify the data as an attack or normal. Table1 depicts the performance of Bayesian belief network by using the original 41 variable data set and the 17 variables reduced data set. The training and testing times for each classifier are decreased when 17 variable data set is used.... ..."

Cited by 2

### Table 2. Performance of Bayesian belief network (columns: Attack class | 41 variables | 17 variables)

2004

"... In PAGE 10: ...69 Input Feature Reduction Ensemble Based Intrusion Detection System Bayesian Network Trees Figure 1 Ensemble approach for IDS. Table2 depicts the performance of the Bayesian belief network by using the original 41-variable data set and the 17-variable reduced data set. The training and testing times for each classifier decreases when the 17-variable data set is used.... ..."

### Table 1. Data and beliefs: an overview

2004

"... In PAGE 5: ...The basic distinction between data and beliefs yields a rich picture of epistemic dynamics (Fig. 1 and Table1 ). From a computational viewpoint, such distinction opens the way for blended approaches to implementation [20]: data structures present remarkable similarities with Bayesian networks and neural networks, while belief sets are a well-known hallmark of AGM-style belief revision [13].... ..."

Cited by 4

### Table 2: Model Performance by Fold

"... In PAGE 4: ... A value of 0 indicates that the models performed similarly. Table2 shows the paired results for each fold. Fold SVM Bayesian Belief Network d 1 84.... ..."

### Table 3: Results on Bayesian Network Repository.

"... In PAGE 6: ... We compare the algorithms using the mini-bucket based heuristics generators, namely s- AOMB, d-AOMB, s-BBMB and d-BBMB. Notice that s- BBMB is currently one of the best performing complete al- gorithms for this domain [Kask and Dechter, 2001] Table3 summarizes the results for experiments on 6 real- world belief networks from the Bayesian Network Reposi- tory3. The time limit was set to 600 seconds.... ..."

### Table 3. Learning Packages and Classes Package Class Extends

"... In PAGE 10: ... 4.1 Belief Learning Belief learning is modelled by probabilistic networks (also called Bayesian networks and belief networks), and I developed the Java classes shown in Table3 using the concepts and algorithms in Cowell et alia (1999) and Shafer (1996)2. Briefly, a probabilistic network is a directed acyclic graph in which nodes represent the random variables, an arrow from node X to node Y means that X has a direct influence on Y, and each dependent node has a conditional probability table.... ..."

### Table 1. Major types of learning methods.

2000

"... In PAGE 2: ... Major types of learning include: concept learning (CL), decision trees (DT), artificial neural networks (ANN), Bayesian belief networks (BBN), reinforcement learning (RL), genetic algorithms (GA) and genetic programming (GP), instance-based learning (IBL), inductive logic programming (ILP), and analytical learning (AL). Table1 summarizes the main properties of different types of learning. Not surprisingly, machine learning methods can be (and some have already been) used in developing better tools or software products.... ..."

Cited by 3

### Table 1. Bayesian Networks Repository (left); SPOT5 benchmarks (right).

2005

"... In PAGE 9: ... The result of this process is a tree of hypergraph separators which is also a pseudo-tree of the original model since each separator corresponds to a subset of variables chained together. In Table1 we computed the height of the pseudo-tree obtained with the hypergraph and minfill heuristics for 10 belief networks from the UAI Repository2 and 10 constraint networks derived from the SPOT5 benchmark [17]. For each pseudo-tree we also com- puted the induced width of the elimination order obtained from the depth-first traversal of the tree.... ..."

Cited by 2