Results 1 - 10 of 84,700
Table 3. The proportion of estimated OFV points to real OFV points after correlation-based selection and Bayesian iteration.
"... In PAGE 4: ... In correlation based method OFV N feature points were selected with maximal correlation. Table3 shows the change of this ratio after correlation based selection and Bayesian iteration. Table 2 and 3 clearly shows that after the iterations we can take a useful estimation about the OFV, while other initial statistical values do not concentrate in the OFV.... ..."
Table VII. Features Selected with a Correlation-Based Feature Selection Technique
1. Telephone (Count-30)
2003
Cited by 130
TABLE III NOTATIONS FOR CORRELATION-BASED SAMPLER SELECTION AND MODEL DERIVATION
Cited by 1
Table 5.4: Feature sets used in experiments based on LogitBoost classification with 10-fold stratified cross-validation. Correct designates the accuracy rate (%) and variance; # Fea. = number of features; Correlation-based FSS = Correlation-based Feature Subset Selection.
2005
Table 4: Experimental results for attribute weighted naive Bayes (AWNB) versus naive Bayes with gain ratio based weighting (GRW), naive Bayes with ReliefF based weighting (RW), naive Bayes with correlation-based feature selection (CFS), Selective Bayes (SB), the Selective Bayesian classifier (SBC), and NBTree: mean root relative squared error (RRSE) and standard deviation. Columns: Data, AWNB, GRW, RW, CFS, SB, SBC, NBTree.
2006
"... In PAGE 8: ... Compared to bagged unpruned decision trees, AWNB is significantly better on one data set and significantly worse on three. Table4 shows the RRSE results for the second experiment. In this exper- iment we compared AWNB against two other attribute weighting schemes for naive Bayes, three feature selection methods and naive Bayes trees.... In PAGE 8: ... For the purpose of computing these weights, all numeric attributes are discretized in a copy of each train/test split using the supervised discretization method of Fayyad and Irani [8]. Compared to GRW, we can see from Table4 that our tree-based method for determining... In PAGE 9: ... However, for the purposes of naive Bayes this can result in scores that are too high for attributes with many dependencies. Columns five through seven of Table4 show the results for naive Bayes when combined with the three feature selection algorithms. Correlation-based feature selection (CFS) [10] is particularly well suited for use with naive Bayes as its evaluation heuristic prefers subsets of attributes with low levels of redundancy.... In PAGE 9: ... The Selective Bayesian Classifier (SBC) is a bagged decision-tree based attribute selection filter for naive Bayes [21]. From Table4 we can see that SBC is significantly better than AWNB on two data sets and significantly worse on 12. The last column in the table shows the results for naive Bayes... ..."
TABLE I FEATURE RANKING FOR VARIOUS TECHNIQUES: INFORMATION GAIN (IG), GAIN RATIO (GR), SYMMETRICAL UNCERTAINTY (SU), CORRELATION-BASED FEATURE SELECTION (CFS), SUPPORT VECTOR MACHINE RECURSIVE FEATURE ELIMINATION (SVM-RFE), AND THE GENUINE FEATURE SET (REF)
Table 3. Correlation-based neighborhood of the survival time (TTS)
"... In PAGE 4: ... In contrast, a standard, naive approach, which simply selects a neighborhood on the basis of the absolute values of the correlations between gene expression profile and survival time, leads to a neigh- borhood with fewer cancer- and neuron-related genes. Out of the 20 most highly correlated probe sets in Table3 , only 4 are related to neuron cells and only 6 are related to cancer. Comparing Tables 2 and 3 provides indirect empirical evidence that the MTOM neigh- borhood analysis leads to biologically more meaningful results than the standard approach in this application.... ..."
Table 5.5: Detailed accuracy of each class analyzing the 78 features from all feature sets chosen by Correlation-based Feature Subset Selection. LogitBoost correctly classified 314 out of 343 instances (91.5%) with stratified 10-fold cross-validation. TP is the true-positive rate, also known as hits: the number of correctly classified documents within that genre. FP is the false-positive rate: the number of documents incorrectly classified to that particular class.
2005
Table 2: The basic learning schemes in Weka
"... In PAGE 2: ... The Weka system provides three feature selection systems: a locally produced correlation based technique [3], the wrapper method and Relief [4]. Learning schemes Weka contains implementations of many algorithms for classification and numeric prediction, the most important of which are listed in Table2 . Numeric prediction is interpreted as prediction of a continuous class.... ..."
Table 1: Comparison of the three recommendation methods: movie average (MA), correlation-based method (CB), and opinion diffusion (OD). Presented values are averages obtained using 10 different probes; standard deviations are approximately 0.01 in all investigated cases.
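The correlation-based recommender (CB) that the caption compares against is presumably the classic user-based collaborative filter, which weights other users' rating deviations by their Pearson correlation with the target user; the caption itself does not specify the exact variant. A minimal sketch under that assumption, with function name and matrix layout of my own choosing:

import numpy as np

def predict_cb(R, u, i):
    """User-based CF: predict user u's rating of item i as the
    similarity-weighted average of other users' deviations from
    their own mean rating. R is a (users, items) matrix with NaN
    for missing ratings."""
    means = np.nanmean(R, axis=1)
    num, den = 0.0, 0.0
    for v in range(R.shape[0]):
        if v == u or np.isnan(R[v, i]):
            continue
        both = ~np.isnan(R[u]) & ~np.isnan(R[v])   # co-rated items
        if both.sum() < 2:
            continue
        sim = np.corrcoef(R[u, both], R[v, both])[0, 1]
        if np.isnan(sim):
            continue
        num += sim * (R[v, i] - means[v])
        den += abs(sim)
    return means[u] if den == 0 else means[u] + num / den

# Example: 3 users x 3 movies, predict user 0's rating for movie 2.
R = np.array([[5.0, 3.0, np.nan],
              [4.0, 2.0, 4.0],
              [1.0, 5.0, 2.0]])
print(round(predict_cb(R, 0, 2), 2))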