### Table 6: The per-class RI measures for various data sets for supervised dimensionality reduction.

2000

"... In PAGE 14: ... To illustrate this, we used the same set of data sets as in the previous section, but this time we used the centroid of the various classes as the axes of the reduced dimensionality space. The RI measures for the different classes in each one of these data sets are shown in Table6 . Note that the number of dimension in the reduced... ..."

Cited by 50
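The centroid-based reduction described in the excerpt above (using class centroids as the axes of the reduced space) can be sketched roughly as follows; this is a minimal numpy illustration, and all names are ours, not the paper's:

```python
import numpy as np

def centroid_projection(X, y):
    """Reduce dimensionality by projecting each sample onto the class
    centroids, giving one reduced dimension per class (illustrative
    sketch, not the paper's exact formulation)."""
    classes = np.unique(y)
    # One centroid per class, normalized to unit length to act as axes.
    C = np.vstack([X[y == c].mean(axis=0) for c in classes])
    C = C / np.linalg.norm(C, axis=1, keepdims=True)
    return X @ C.T  # shape: (n_samples, n_classes)

# Toy example: two well-separated classes in a 5-dimensional space.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(5.0, 1.0, (20, 5)),
               rng.normal(-5.0, 1.0, (20, 5))])
y = np.array([0] * 20 + [1] * 20)
Z = centroid_projection(X, y)
print(Z.shape)  # one reduced dimension per class: (40, 2)
```

Note that, as the excerpt says, the reduced dimensionality here equals the number of classes, since each centroid contributes one axis.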

### Table 1: FV spaces with supervised discrimination benchmark scores (R-precision) and unsupervised image-analysis scores.

"... In PAGE 5: ... The individual FV spaces possess varying average discrimination power - some FV spaces work well for similarity searching, oth- ers perform poorer. Table1 gives the used FV space names (FV name), along with respective FV dimen- sionalities (dim.) and R-precision (R-prec.... In PAGE 5: ... 5.3 Results and Comparison Table1 lists the dtb and the E scores for each of the 12 FV space representations of the PSB-T bench- mark. By their definition, increasing score values in- dicate increasing component heterogeneity.... ..."

### Table 6: Discriminating measures

"... In PAGE 19: ...18 drop very quickly when we pass from a solution with two dimensions to a higher dimensionality solution. The analysis proceeds with the identification of the more important variables for each dimension ( Table6 ). In fact, an analysis of the discriminating measures in each dimension reveals that the OECD and Pavitt taxonomies contribute toward an explanation of the first dimension, whereas cultural proximity, geographical distance and project technology diversity explain the second one.... ..."

### Table 3. Results of Discriminant Function Analysis

2003

"... In PAGE 18: ... The seven sets of variables used for classification were: three reduced sets (six geometric variables, eight texture variables and 14 geometric + texture variables); all 102 variables (12 geometric variables and 90 texture variables) and three sets of PC scores (six geometric PC scores, eight texture PC scores and 14 geometric + texture PC scores). Table3 shows classification results. In the analysis, higher overall classification rate was observed with prior probabilities calculated from group sizes.... In PAGE 18: ... The original data set was used to observe the effect of data reduction on classification. The results are shown in Table3 . Classification with all variables (12 geometric and 90 texture) produced 73% accuracy but the classification was higher (79%) with the selected set of variables.... ..."

### Table 1: Error rates for SVM, k-NN and QDA used to compare effectiveness of PCA and Isomap for supervised classification analysis of the cytokine compendium (computed by 10-fold cross validation with 95% confidence intervals)

2007

"... In PAGE 10: ...ttp://www.biomedcentral.com/1752-0509/1/27 sity as class labels and original 19-dimesnional space of molecular signals, Isomap components space and PCA components space as input datasets. We assigned class labels representing 3 levels of apoptosis intensity (low, medium and high) to each time point using expectation maximization clustering and performed classification with support vector machines (SVM), k-NN or quadratic discriminant analysis (QDA) in three dimensional Iso- map/PCA subspace and original multidimensional space ( Table1 ). K-NN and QDA classifiers based on Isomap dimensions showed comparable performance to classifi- ers that used original dataset and in both cases were better than PCA-based classifiers (ANOVA, p lt; 0.... ..."

### Table 1: Dimension reduction strategies of the compared methods.

2007

"... In PAGE 12: ... Unsupervised approaches make no use of survival data in the dimension reduction. In Table1 we indicate which dimension reduction techniques are exploited by the methods in the comparison to handle the high-dimensionality of the data. This is done to facilitate a comparison of the methods on a conceptual level, but also to interpret performance differences on simulated and real-life data later.... In PAGE 12: ... This is done to facilitate a comparison of the methods on a conceptual level, but also to interpret performance differences on simulated and real-life data later. For some methods Table1 does need explanation. Supervised principal component analysis [14] is an element of every group, because it contains two dimension reduction steps.... In PAGE 13: ...From Table1 it is evident that all methods are (at least partially) supervised approaches, and do their dimension reduction mostly multivariately. Within those characteristics it is the feature selection/extraction that sets the methods apart.... ..."

Cited by 1
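The two-step structure attributed to supervised principal component analysis in the excerpt above (a supervised univariate screening step followed by unsupervised PCA on the retained features) can be sketched as follows. This is a rough numpy illustration under our own assumptions, not the method's reference implementation, and the outcome here is a plain continuous variable rather than survival data:

```python
import numpy as np

def supervised_pca(X, t, n_keep, n_components=1):
    """Two-step sketch: (1) supervised screening - keep the n_keep
    features most correlated with the outcome t; (2) unsupervised
    PCA on the retained features. Names are illustrative."""
    Xc = X - X.mean(axis=0)
    tc = t - t.mean()
    corr = (Xc * tc[:, None]).sum(axis=0) / (
        np.linalg.norm(Xc, axis=0) * np.linalg.norm(tc) + 1e-12)
    keep = np.argsort(-np.abs(corr))[:n_keep]   # step 1: screening
    Xs = Xc[:, keep]
    _, _, Vt = np.linalg.svd(Xs, full_matrices=False)
    return Xs @ Vt[:n_components].T, keep       # step 2: PC scores

# Toy data: 200 features, but the outcome depends only on the first 3.
rng = np.random.default_rng(3)
n, p = 100, 200
X = rng.normal(size=(n, p))
t = X[:, :3].sum(axis=1) + 0.1 * rng.normal(size=n)
scores, keep = supervised_pca(X, t, n_keep=3)
print(sorted(keep.tolist()))
```

The screening step is what makes the method "an element of every group" in the excerpt's taxonomy: it combines a supervised univariate selection with a multivariate unsupervised extraction.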

### Table 11. Discriminant Analysis

"... In PAGE 8: ... Table11 shows that the relative code churn measures have effective discriminant ability (comparable to prior studies done on industrial software [13]). We conclude that relative code churn measures can be used to discriminate between fault and not fault-prone binaries (H4).... ..."