### Table 1: Implementation results for chaotic time-series prediction

1998

"... In PAGE 4: ... Matlab neural network toolbox and trained using conventional backpropagation algorithms. A summary of the implementation results obtained is presented in Table 1. Two different simulation approaches were used for the chaotic time-series prediction problem.... In PAGE 4: ... difference between the predicted and actual results, in terms of the prediction error, is illustrated in Fig. 5. This compares favourably with a conventional fuzzy approach which employed an even finer-grained partitioning strategy, ranging from 15 to 29 fuzzy sets, to achieve a similar accuracy [Wang92]. For further comparison, the results using a conventional neural network approach containing 40 nodes in the hidden layer are also included in Table 1. Previous work demonstrated that this size of network resulted in a similar degree of accuracy as a conventional fuzzy reasoning approach employing seven fuzzy sets on each input domain [Wang92].... In PAGE 4: ... Previous work demonstrated that this size of network resulted in a similar degree of accuracy as a conventional fuzzy reasoning approach employing seven fuzzy sets on each input domain [Wang92]. Table 1 illustrates that the FNN approach provides a more accurate prediction of the time series than the conventional neural network approach. However, these results do not highlight that training the conventional neural network was more than a factor of two slower than training the largest FNN employed.... ..."

Cited by 2
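The single-step prediction task described in this excerpt requires the series to be recast as (input window, next value) pairs before any network can be trained on it. A minimal sketch of that common setup; the window size and toy series are illustrative, not from the paper:

```python
def make_supervised(series, window):
    """Turn a scalar time series into (input window, next value) pairs,
    the usual setup for single-step time-series prediction."""
    pairs = []
    for t in range(len(series) - window):
        pairs.append((series[t:t + window], series[t + window]))
    return pairs

# Toy series and window size, purely for illustration.
series = [0.1, 0.4, 0.9, 0.7, 0.3, 0.6]
for x, y in make_supervised(series, 3):
    print(x, "->", y)
```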

### Table 1 Prediction results of traffic series measured in different time intervals (station 433)

"... In PAGE 7: ... and another 24-hour workday data set for testing. At station N27.9, we also use 1,440 and 480 approach-based data points for the 5-minute and 15-minute traffic series, respectively; thus a consecutive five-workday data set is selected for training and another consecutive five-workday data set for testing. The prediction results are summarized in Table 1 and Table 2. From the tables, all of the RMSEs, for both the training set and the testing set, are sufficiently small to show that the RBFNN model is highly satisfactory for predicting real-world short-interval flow, speed, and occupancy series.... ..."
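The RMSE figures summarized in these tables are presumably the usual root-mean-square error between predicted and observed series values. A hedged sketch of that metric; the flow values are hypothetical, not data from the paper:

```python
import math

def rmse(predicted, actual):
    """Root-mean-square error between two equal-length series."""
    if len(predicted) != len(actual):
        raise ValueError("series must have the same length")
    return math.sqrt(
        sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual)
    )

# Hypothetical 5-minute flow observations and model predictions.
observed  = [120.0, 135.0, 150.0, 148.0, 130.0]
predicted = [118.0, 137.0, 149.0, 150.0, 128.0]
print(f"RMSE: {rmse(predicted, observed):.4f}")
```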

### Table 2 Prediction results of traffic series measured in different time intervals (Station N27.9)

"... In PAGE 7: ... and another 24-hour workday data set for testing. At station N27.9, we also use 1,440 and 480 approach-based data points for the 5-minute and 15-minute traffic series, respectively; thus a consecutive five-workday data set is selected for training and another consecutive five-workday data set for testing. The prediction results are summarized in Table 1 and Table 2. From the tables, all of the RMSEs, for both the training set and the testing set, are sufficiently small to show that the RBFNN model is highly satisfactory for predicting real-world short-interval flow, speed, and occupancy series.... ..."

### Table 3: Response time improvement of the hybrid approach over the other policies.

1998

"... In PAGE 16: ... "random polling", and "random multicast" strategies. The improvement ratios of our method over the others are listed in Table 3 for processing request set S2. The results for S3 are similar.... ..."

Cited by 24

### Table 3: Response time improvement of the hybrid approach over the other policies.

1998

"... In PAGE 15: ... We compare this strategy with those using "load broadcast", "random polling", and "random multicast". The improvement ratios of our method over the others are listed in Table 3 for processing request set S2. The results for S3 are similar.... ..."

Cited by 24

### Table 1. Response time improvement of the hybrid approach over the other policies.

1998

"... In PAGE 7: ... We compare this strategy with those using "load broadcast", "random poll", and "random multicast". The improvement ratios of our method over the others are listed in Table 1 for processing request set S2. The results for S3 are similar.... ..."

Cited by 24
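The improvement ratio is not defined in these snippets; a plausible reading is the relative reduction in mean response time of the hybrid policy over each baseline policy. A sketch under that assumption, with hypothetical timings:

```python
def improvement_ratio(baseline_time, hybrid_time):
    """Fractional response-time improvement of the hybrid policy over a
    baseline; this definition is assumed, not taken from the paper."""
    if baseline_time <= 0:
        raise ValueError("baseline time must be positive")
    return (baseline_time - hybrid_time) / baseline_time

# Hypothetical mean response times (ms) for request set S2.
baselines = {"load broadcast": 42.0, "random polling": 55.0, "random multicast": 48.0}
hybrid = 35.0
for policy, t in baselines.items():
    print(f"{policy}: {improvement_ratio(t, hybrid):.1%} improvement")
```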

### Table 8. Learning time of the hybrid approach for different numbers of PCs

2001

"... In PAGE 12: ... mentioned in Sect. 4.4 for six hidden units. Table 8 shows the learning time for different numbers of PCs. It can be seen that the number of PCs for the best network training in our application depends on their total variance.... ..."

Cited by 6
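The excerpt ties the best number of PCs to their total variance. A common way to operationalize this, which may or may not match the paper, is to keep the smallest number of principal components whose cumulative explained variance reaches a threshold. A sketch under that assumption; the eigenvalues and the 95% threshold are illustrative:

```python
def num_components_for_variance(variances, threshold=0.95):
    """Smallest number of principal components whose cumulative share of
    the total variance reaches `threshold` (threshold is an assumption)."""
    total = sum(variances)
    cumulative = 0.0
    for k, v in enumerate(sorted(variances, reverse=True), start=1):
        cumulative += v
        if cumulative / total >= threshold:
            return k
    return len(variances)

# Hypothetical PCA eigenvalues, largest first.
eigenvalues = [4.2, 2.1, 0.9, 0.4, 0.2, 0.1]
print(num_components_for_variance(eigenvalues))
```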

### Table 1. The average values of mean squared errors of single-step prediction for the sunspot time series data for fifty times by four algorithms

in A Modified Learning Algorithm Incorporating Additional Functional Constraints Into Neural Networks

"... In PAGE 5: ... 4-6 for the above three learning algorithms. In order to statistically compare the prediction accuracies for the sunspot data with the four algorithms (listed in Table 1 and Table 2), the experiment is run fifty times for each algorithm and its average accuracy is then calculated. The corresponding results are summarized in Table 1 and Table 2 for single-step prediction and iterative-step prediction.... In PAGE 5: ... shown in Figs. 4-6 for the above three learning algorithms. In order to statistically compare the prediction accuracies for the sunspot data with the four algorithms (listed in Table 1 and Table 2), the experiment is run fifty times for each algorithm and its average accuracy is then calculated. The corresponding results are summarized in Table 1 and Table 2 for single-step prediction and iterative-step prediction. From these results, it is apparent that the proposed learning algorithm has better generalization capability than the BP algorithm as well as the two original hybrid algorithms, because the mean squared errors of the modified algorithm on the testing data set are smaller than those of the other learning algorithms.... ..."
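The protocol described (fifty independent runs per algorithm, then the mean of the per-run MSEs) can be sketched as follows. Here `train_and_evaluate` is a hypothetical stand-in for one training run, not the paper's algorithm:

```python
import random

def mse(predicted, actual):
    """Mean squared error of single-step predictions."""
    return sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual)

def train_and_evaluate(seed):
    """Hypothetical stand-in for one training run: returns a test-set MSE.
    A real run would train a network from random initial weights."""
    rng = random.Random(seed)
    actual = [rng.uniform(0.0, 1.0) for _ in range(100)]
    predicted = [a + rng.gauss(0.0, 0.05) for a in actual]
    return mse(predicted, actual)

# Average the test-set MSE over fifty independent runs, as in the excerpt.
runs = [train_and_evaluate(seed) for seed in range(50)]
average_mse = sum(runs) / len(runs)
print(f"average MSE over {len(runs)} runs: {average_mse:.6f}")
```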

### Table 6. Comparisons of the approaches (Hybrid GA: genetic algorithm + local search; Hybrid VNS: iterated variable neighborhood search, and our approaches) on the problem. Hybrid GA Hybrid VNS Decomposition

2005

"... In PAGE 5: ... penalties are re-assigned and the descent local search is started again. See [10] for more details. Using the decomposition, construction, and post-processing approach, we obtained a number of different schedules on the problem presented in Section 2. The best results out of 5 runs of each approach, namely the hybrid genetic algorithm, the variable neighborhood search, and our approach with and without the variable neighborhood search as the 3rd stage of post-processing, are presented in Table 6. The values in parentheses give the computational times of the corresponding approaches.... ..."

Cited by 3
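The best-of-5-runs comparison with computational time in parentheses can be sketched generically; `one_run` is a hypothetical stand-in for a single run of any of the compared heuristics, not an implementation of them:

```python
import random
import time

def one_run(seed):
    """Hypothetical stand-in for one run of a scheduling heuristic:
    returns (cost, elapsed seconds). A real run would search schedules."""
    rng = random.Random(seed)
    start = time.perf_counter()
    cost = min(rng.uniform(100.0, 200.0) for _ in range(1000))
    return cost, time.perf_counter() - start

# Report the best result out of 5 runs, as the excerpt does for each approach.
results = [one_run(seed) for seed in range(5)]
best_cost, elapsed = min(results)
print(f"best cost: {best_cost:.2f} (that run took {elapsed:.4f}s)")
```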