### Table 1: A Comparison Among Dimensionality Reduction Techniques

"... In PAGE 1: ... The proposed dimension- ality reduction techniques include Singular Value Decompo- sition (SVD) [13, 18], Discrete Fourier Transform (DFT) [24], Discrete Wavelet Transform (DWT) [5, 23, 12, 27], Piecewise Linear Approximation (PLA) [20, 17], Piecewise Aggregate Approximation (PAA) [15, 29], Adaptive Piece- wise Constant Approximation (APCA) [16] and Chebyshev Polynomials (CP) [4]. We list seven popular dimensionality reduction techniques in Table1 , in terms of the time complexity, space complex- ity, and capability to be indexed in the reduced space, where n is the length of each time series, N is the total number of time series in the database, and (2m) is the reduced dimen- sionality. In terms of the time complexity, CP is more costly than PLA, DWT, PAA and APCA (Table 1); and PLA is... In PAGE 1: ... We list seven popular dimensionality reduction techniques in Table 1, in terms of the time complexity, space complex- ity, and capability to be indexed in the reduced space, where n is the length of each time series, N is the total number of time series in the database, and (2m) is the reduced dimen- sionality. In terms of the time complexity, CP is more costly than PLA, DWT, PAA and APCA ( Table1 ); and PLA is... ..."

### Table 5. Dimensionality reduction techniques that can be used with E-Index to index time series. Key: {} the introducing paper; [] extensions and follow-up work.

2001

Cited by 2

### Table 4 shows the same verification results for the second series of experiments described in Section 5.2. We again observe a quasi-linear variation of the reduction and comparison times with the number of retransmissions.

1996

"... In PAGE 19: ... Table4 : Veri cation using bisimulations for data packets with random lengths between 1 and 10 For both series of experiments, the total time required for veri cation by means of bisimulations is signi cantly smaller than the time required for Lts generation. 7 Veri cation using temporal logics As an alternative to veri cation using bisimulations, we also performed veri cation using temporal logics.... ..."

Cited by 16

### Table 4. Dimensionality reduction time taken by each manifold method (seconds)

"... In PAGE 11: ... The technique of cross validation was applied to split the microarray data sets into training and testing data sets. Table4 shows the times needed for each manifold method to reduce the dimensionality of the data sets. As seen before, the PCA method produces more dimensions than the LTSA.... ..."

### Table 1. Linear Time Series Models (columns: Model, Description)

2006

"... In PAGE 10: ... temporal reliability and measured their prediction accuracy. The tested time series models are shown in Table1 . In this experiment, we used the training and the test sets of equal size.... ..."

Cited by 2

### Table 1: Experimental results for three datasets (upper part: all variables; lower part: reduced variable set)

"... In PAGE 2: ...Table1 ); (iii) A powerful feature se- lection procedure has been implemented with K-PLS that is fully benchmarked and ranked well in the 2003 NIPS feature selection challenge [8]. 2 Variable Selection with Random Forests Dimensionality reduction is a challenging problem for supervised and unsuper- vised machine learning for classification, regression, and time series prediction.... In PAGE 5: ... Note that for both data sets only two features were dropped in order to maintain similar performance metrics for the reduced variable set. RF variable selection for both benchmark datasets was based on the linear K-PLS model as shown in Table1 . Because leave-one-out validation is used for all training models, the performance metrics have a low variance.... In PAGE 5: ... 10, 000 Random Forests models are used for 40, 50, 60, and 70 variables re- spectively. The variable ranking is relatively robust with the number of selected variables in the RF as shown in Table1 . In the final model, the 7 variables with the lowest scores are discarded, maintaining a similar Q2/q2 performance as for the original 74 variable model.... In PAGE 5: ... For the Boston Housing, South African Heart disease and MCG data, 12, 5 and 5 Latent Variables (LVs) were used. Deleted variables are listed in the last column of Table1 . Table 1 shows that Random Forests results outperform Z-scores ranking and RF are close to those obtained from Sensitivity Analysis.... ..."

### Table 2: Summary of performance results of each dimensionality reduction method

1995

"... In PAGE 32: ... 0 2 4 6 Actual distances 2 4 6 8 Figure 21: Fish data set : Scatter-plot for non-metric MDS 6 Discussion The results of the previous experiments are summarised in Table2 and depicted in Figures 22 and 23. Also included are the results for the linear PCA method.... ..."

Cited by 5