### Table 3: Results of the risk of desertification for the 14 test sites using Rule-based reasoning and Bayesian network inference with GIS data

1998

"... In PAGE 31: ... Table 2 shows the results of the 39 sites using GIS data with uncertainty. Table 3 shows the results of the 14 test sites using GIS data with uncertainty. Using GIS data, the system agreed with the expert on 28 out of the 39 training sites, with the majority of the misclassified sites falling into adjacent classes. ... ..."

Cited by 6

### Table 2: Results of the risk of desertification for the 39 training sites using Rule- based reasoning and Bayesian network inference with GIS data

1998

"... In PAGE 31: ... Out of the 53 available sites, 39 sites were used for training and 14 sites were used for testing the final system. Table 2 shows the results of the 39 sites using GIS data with uncertainty. Table 3 shows the results of the 14 test sites using GIS data with uncertainty. ... ..."

Cited by 6

### Table 6: Non-zero parameter with covariate (columns: Parameter Estimated, Classical Inference, Bayesian Method, 95% Credible Set)

1996

"... In PAGE 12: ... Table 5 examines the log-normal situation with covariate, and in this set-up the agreement between the classical and Bayesian estimates for the regression parameters is better. Table 6 concludes the analysis, with similar results to the above with the parameter ≠ 0. Classical estimates presented in this table use ideas to fit each ratio to normality, as opposed to yielding (joint) g-dimensional multivariate normality. ... ..."

Cited by 1

### TABLE III INCREMENTAL BAYESIAN ADAPTIVE INFERENCE PERFORMANCE ON THE COMPLETE DATA SET

### Table 2. Individual-specific parameters used in the model (Surface, Basal)

"... In PAGE 5: ... Tollit, personal communication). The values used for the individual-specific parameters are presented in Table 2 and the sources given in the next section. Where a measurement of length was available, it was used directly. ... ..."

### Table 1. Prediction errors (NRMSE) for the data using the Bayesian Network in Fig. 2

"... In PAGE 9: ... 1, and run inference on the second half of the data. Table 1 shows the normalised root mean square error (NRMSE) of the inference. NRMSE gives a useful scale-independent measure of error between data sets of different ranges. ... ..."
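
The snippet above describes NRMSE as a scale-independent error measure between data sets of different ranges. A minimal sketch of one common convention (RMSE normalised by the range of the observed data; the paper's exact normalisation is not shown in the excerpt):

```python
import numpy as np

def nrmse(actual, predicted):
    """Root mean square error normalised by the range of the actual data.

    This is one common convention for NRMSE; dividing by the range
    (max - min) makes the error comparable across variables whose
    values live on different scales.
    """
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    rmse = np.sqrt(np.mean((actual - predicted) ** 2))
    return rmse / (actual.max() - actual.min())

# Perfect prediction gives 0; a constant offset scales with the range.
y = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
print(nrmse(y, y))        # 0.0
print(nrmse(y, y + 1.0))  # 0.25 (RMSE 1.0 over range 4.0)
```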

### Table 2. Symbols Used to Annotate Flow Diagrams

1998

"... In PAGE 23: ... Table 2, indicate the source of the annotations. The example in Figure 12 uses all symbols except for the ? symbol. ... ..."

Cited by 4

### Table 5. MDL and Bayesian

2002

"... In PAGE 4: ... (7) from [3] [4]. The experimental results, as shown in Table 5, confirmed that model selection using our Bayesian criterion resulted in better word recognition rates than model selection using the MDL criterion, especially in the case of small amounts of training data. Table 4. ... ..."

Cited by 4
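
The excerpt above compares model selection under an MDL criterion with a Bayesian criterion. The paper's criteria apply to acoustic-model selection and are not reproduced in the excerpt, so as a generic illustration of how the two kinds of score differ, here is a sketch for a one-parameter Bernoulli model: the MDL score uses the usual (k/2) log n penalty, and the Bayesian score is the exact negative log marginal likelihood under a uniform prior. Both formulas are stand-in assumptions, not the paper's equations:

```python
import math

def mdl_score(heads, tails):
    """Two-part MDL code length (in nats) for a Bernoulli model with one
    free parameter: -log L(theta_hat) + (k/2) log n, with k = 1."""
    n = heads + tails
    nll = 0.0
    for count, p in ((heads, heads / n), (tails, tails / n)):
        if count:
            nll -= count * math.log(p)
    return nll + 0.5 * math.log(n)

def bayes_score(heads, tails):
    """Negative log marginal likelihood under a uniform Beta(1, 1) prior:
    -log( integral of theta^h (1-theta)^t d theta ) = -log( h! t! / (n+1)! )."""
    n = heads + tails
    return -(math.lgamma(heads + 1) + math.lgamma(tails + 1)
             - math.lgamma(n + 2))

# With little data the penalty terms dominate and the two criteria can
# rank candidate models differently; with more data they tend to agree.
for h, t in ((3, 1), (30, 10)):
    print(h, t, round(mdl_score(h, t), 3), round(bayes_score(h, t), 3))
```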

### Table 2: Bayesian model averaging, Bayesian model selection, and constraint-based results for an analysis of whether "X causes Z", given data summarized in Table 1

1997

"... In PAGE 11: ... Table 1: A summary of data used in the example.

| number of cases | x̄ȳz̄ | x̄ȳz | x̄yz̄ | x̄yz | xȳz̄ | xȳz | xyz̄ | xyz |
|---|---|---|---|---|---|---|---|---|
| 150 | 5 | 36 | 38 | 15 | 7 | 16 | 23 | 10 |
| 250 | 10 | 60 | 51 | 27 | 15 | 25 | 41 | 21 |
| 500 | 23 | 121 | 103 | 67 | 19 | 44 | 79 | 44 |
| 1000 | 44 | 242 | 222 | 152 | 51 | 80 | 134 | 75 |
| 2000 | 88 | 476 | 431 | 311 | 105 | 180 | 264 | 145 |

(The sufficient statistics are the counts for the eight joint configurations of x, y, z.) The first two columns in Table 2 show the results of applying Equation 4, under the assumptions stated above, for the first N cases in the data set. When N = 0, the data set is empty, in which case the probability of hypothesis h is just the prior probability of "X causes Z": 8/25 = 0.32. ... In PAGE 11: ... Table 2 shows that the probability that "X causes Z" increases monotonically as the number of cases in the database increases. Although not shown, the probability increases toward 1 as the number of cases increases beyond 2000. ... In PAGE 11: ... Column 3 in Table 2 shows the results of applying Bayesian model selection. Here, we list the causal relationship(s) between X and Z found in the model or models with the highest posterior probability p(m|D). ... In PAGE 11: ... Two of the models have Z as a cause of X, and one has X as a cause of Z. Column 4 in Table 2 shows the results of applying the PC constraint-based causal discovery algorithm (Spirtes et al., 1993), which is part of the Tetrad II system (Scheines et al. ... ..."

Cited by 54
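
The excerpt above walks through Bayesian model averaging and model selection over causal structures. The paper's Equation 4 and structure priors are not available in the excerpt, so the following sketch uses an illustrative stand-in: a Dirichlet-multinomial marginal likelihood (a standard ingredient of such scores) comparing "X and Z dependent" against "X and Z independent" on the counts from the N = 2000 row of Table 1. Note that with only two observed variables, the structures X→Z and Z→X are likelihood-equivalent, so dependence versus independence is what a marginal likelihood alone can distinguish here; the function and variable names are mine, not the paper's:

```python
import math
from itertools import product

def log_dm(counts, alpha=1.0):
    """Log marginal likelihood of multinomial counts under a symmetric
    Dirichlet(alpha) prior (ordered-sequence Dirichlet-multinomial)."""
    n, k = sum(counts), len(counts)
    return (math.lgamma(k * alpha) - math.lgamma(k * alpha + n)
            + sum(math.lgamma(alpha + c) - math.lgamma(alpha) for c in counts))

# Joint counts over (x, y, z) from the N = 2000 row of Table 1,
# in the order x̄ȳz̄, x̄ȳz, x̄yz̄, x̄yz, xȳz̄, xȳz, xyz̄, xyz.
joint = {(0, 0, 0): 88, (0, 0, 1): 476, (0, 1, 0): 431, (0, 1, 1): 311,
         (1, 0, 0): 105, (1, 0, 1): 180, (1, 1, 0): 264, (1, 1, 1): 145}

# Collapse over y to get the 2x2 table for (x, z), plus its margins.
xz = {(x, z): sum(joint[(x, y, z)] for y in (0, 1))
      for x, z in product((0, 1), (0, 1))}
x_marg = [sum(xz[(x, z)] for z in (0, 1)) for x in (0, 1)]
z_marg = [sum(xz[(x, z)] for x in (0, 1)) for z in (0, 1)]

# M_dep: unrestricted joint over the 4 cells; M_ind: X and Z independent,
# so the marginal likelihood factorizes into the two margins.
log_m_dep = log_dm(list(xz.values()))
log_m_ind = log_dm(x_marg) + log_dm(z_marg)

# Posterior probability of dependence under equal (1/2, 1/2) model priors.
post_dep = 1.0 / (1.0 + math.exp(log_m_ind - log_m_dep))
print(f"P(X and Z dependent | data) = {post_dep:.4f}")
```

As in the excerpt, the evidence for a link between X and Z strengthens with more data; rerunning the comparison on the smaller-N rows of Table 1 gives correspondingly weaker posteriors.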