### Table 5: Local linear correlations embedded in the dataset

### Table 4: Test IV, classification accuracy in the full space and in the reduced spaces obtained by the Centroid algorithm and the Orthogonal Centroid algorithm. Classification was performed using the Centroid-based classification algorithm.

2003

Cited by 26

### Table 3: Test III, classification accuracy of two classification algorithms, Centroid-based classification and k-nearest-neighbor, on the data in the full space and in the reduced spaces obtained by the Centroid algorithm and the Orthogonal Centroid algorithm.

2003

"... In PAGE 15: ... The computational time for the knn algorithm is dramatically reduced after the dimension reduction, since the comparisons between the data to be classified and all other data points are now made in the 5-dimensional space instead of in the full 22095-dimensional space. Table 3 shows that the classification results with both the L2 norm and the cosine measure are identical in the full and reduced dimensional space obtained from the Orthogonal Centroid method. The results for various values of k with the L2 norm in knn were much worse than those with the cosine measure.... ..."

Cited by 26
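The reduced-space classification described in the excerpt above can be sketched as follows. This is a minimal illustration, assuming the Orthogonal Centroid method projects data onto an orthonormal basis of the class centroids and that k-NN uses the cosine measure; the function names and the QR-based orthonormalization are illustrative, not the paper's code:

```python
import numpy as np

def orthogonal_centroid(X, y):
    """Orthogonal Centroid dimension reduction (sketch).

    X: (n_samples, n_features) data matrix, y: class labels.
    Returns Q with orthonormal columns spanning the class centroids,
    so x -> x @ Q maps to a k-dimensional space (k = number of classes).
    """
    classes = np.unique(y)
    # Centroid matrix: one column per class mean.
    C = np.column_stack([X[y == c].mean(axis=0) for c in classes])
    Q, _ = np.linalg.qr(C)  # orthonormalize the centroid matrix
    return Q

def knn_cosine(X_train, y_train, x, k=5):
    """k-NN with the cosine measure (larger similarity = closer)."""
    sims = (X_train @ x) / (
        np.linalg.norm(X_train, axis=1) * np.linalg.norm(x) + 1e-12
    )
    nearest = y_train[np.argsort(sims)[-k:]]
    vals, counts = np.unique(nearest, return_counts=True)
    return vals[np.argmax(counts)]
```

With a text collection this turns each 22095-dimensional document vector into a vector with as many entries as there are classes, which is why the k-NN comparisons become so cheap.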

### Table 1: The prediction accuracies are shown. The first part (RBF to KFD) is from [15]: classification accuracy from a single RBF classifier (RBF), AdaBoost (AB), regularized AdaBoost, SVM, and KFD. The last two columns are from the Kernel Orthogonal Centroid method using Gaussian kernels (optimal σ values shown) and a polynomial kernel of degree 3. For each test, the best prediction accuracy result is shown in boldface.

"... In PAGE 15: ...the optimal dimension reduction criterion [22]. In Table 1, we present the implementation results on seven data sets which Mika et al. have used in their tests [33].... In PAGE 15: ... Parameters for the best candidate for the kernel function and SVM are determined based on a 5-fold cross-validation using the first five training sets. We repeat their results in the first five columns of Table 1, which show the prediction accuracies in percentage (%) from the RBF classifier (RBF), AdaBoost (AB), regularized AdaBoost, SVM and KFD. For more details, see [15].... In PAGE 15: ...SVM and KFD. For more details, see [15]. The results shown in the column for KOC are obtained from the linear soft margin SVM classification using the software SVMlight [34] after dimension reduction by KOC. The test results with the polynomial kernel with degree 3 and the Gaussian kernel with an optimal σ value for each data set are presented in Table 1. The results show that our method obtained comparable accuracy to other methods in all the tests we performed.... ..."

Cited by 2
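The KOC pipeline quoted above (dimension reduction by Kernel Orthogonal Centroid, then a linear soft-margin SVM) works because the orthonormalized class centroids in feature space can be handled through kernel evaluations alone. A minimal sketch, assuming a Gaussian kernel and Cholesky-based orthonormalization of the centroid Gram matrix; function names are illustrative:

```python
import numpy as np

def rbf(A, B, sigma=1.0):
    """Gaussian kernel matrix between the rows of A and the rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def koc_fit(X, y, kernel):
    """Kernel Orthogonal Centroid (sketch): orthonormalize the class
    centroids in feature space using only kernel evaluations."""
    classes = np.unique(y)
    masks = [y == c for c in classes]
    K = kernel(X, X)
    # M[j, l] = <c_j, c_l>: mean of K over class-j rows and class-l columns.
    M = np.array([[K[np.ix_(mj, ml)].mean() for ml in masks] for mj in masks])
    L = np.linalg.cholesky(M)
    A = np.linalg.inv(L).T  # columns of C @ A are orthonormal in feature space

    def transform(Z):
        # k_c(z)[j] = <c_j, phi(z)> = mean kernel between z and class j.
        Kc = np.column_stack([kernel(Z, X[m]).mean(axis=1) for m in masks])
        return Kc @ A  # reduced representation, one dimension per class
    return transform
```

A linear classifier trained on `transform(X)` then plays the role of the soft-margin SVM in the excerpt.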

### Table 6: Statistical summary for orthogonality

"... In PAGE 9: ... Based on Table 6, whose data are again not yet disaggregated with respect to expertise, orthogonality is 97%, while variance is 2. This indicates that, on average, programmers respect ODC and tend to provide just one classification per defect, whatever their expertise.... ..."
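In ODC terms, the 97% figure measures how often a defect receives exactly one classification. A toy illustration of such a tally; the helper name and the input shape are assumptions for illustration, not the study's actual procedure:

```python
def orthogonality(classifications):
    """Percentage of defects that received exactly one ODC class.

    `classifications` maps a defect id to the list of categories that
    programmers assigned to it (illustrative data shape).
    """
    single = sum(1 for cats in classifications.values() if len(cats) == 1)
    return 100.0 * single / len(classifications)
```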

### Table 1. Linear Representations Under Orthogonality Constraints

2003

Cited by 5

### Table 5: Errors with the centroid rule

1998

"... In PAGE 26: ... is consistent with a rate of convergence of O(h^2), as predicted in Theorem 4.1. We also illustrate the convergence to be expected when the collocation method is based on the centroid rule of (38). The problem being solved is the same as when the surface is only piecewise smooth, and Table 5 contains the numerical results. Note that the linear system being solved has order n, in contrast with that based on linear interpolation and having order 3n.... ..."

Cited by 10
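The centroid rule referred to in the excerpt is the one-point quadrature that evaluates the integrand at each element's centroid, which is what yields the O(h^2) behaviour. A minimal sketch over a flat triangulation; this is an illustration of the rule itself, not the paper's collocation code:

```python
import numpy as np

def centroid_rule(f, triangles):
    """One-point (centroid) quadrature over a triangulated surface.

    Each triangle's integral is approximated by area * f(centroid),
    which is exact for integrands that are linear on the triangle.
    `triangles` is an iterable of 3-tuples of 3D vertices.
    """
    total = 0.0
    for v0, v1, v2 in triangles:
        v0, v1, v2 = map(np.asarray, (v0, v1, v2))
        area = 0.5 * np.linalg.norm(np.cross(v1 - v0, v2 - v0))
        total += area * f((v0 + v1 + v2) / 3.0)
    return total
```

Refining the triangulation so the mesh size h halves reduces the error by roughly a factor of four, consistent with the O(h^2) rate quoted above.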

### Table 5: Results of local info and confusion matrix centroid selection

1993

"... In PAGE 7: ... We considered these cases penalties because the process was unable to equate a misspelling to a single centroid. The first two columns of Table 5 show a set of multiply clustered misspellings and their corresponding centroids. One of the strategies to identify the correct cluster for such misspellings is to first locate the document in which the misspelling occurs.... In PAGE 7: ... This first strategy, which we call local info, cannot always identify the correct centroid. For example, the correct centroid for the misspelling transporation cannot be determined (Table 5). This correction cannot be made because, in the document in which the misspelling occurs, more than one of the misspelling's centroids occurs.... ..."

Cited by 18
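The "local info" strategy quoted above can be sketched as a simple filter: keep only the candidate centroids that occur in the misspelling's own document, and fail exactly when zero or more than one remain. The names here are illustrative, not the paper's implementation:

```python
def pick_centroid(candidates, document_terms):
    """'Local info' disambiguation (sketch).

    candidates: the centroids a misspelling was clustered with.
    document_terms: set of terms in the document containing it.
    Returns the unique candidate present in the document, or None
    when the strategy cannot decide (zero or several candidates match).
    """
    present = [c for c in candidates if c in document_terms]
    return present[0] if len(present) == 1 else None
```

This reproduces the failure mode in the excerpt: when more than one of a misspelling's centroids occurs in the document, no correction is made.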