### Table 1. Algorithms for self-organizing modeling

"... In PAGE 2: ... nonparametric models. Known nonparametric model selection methods include Analog Complexing (AC), which selects nonparametric prediction models from a given data set representing one or more patterns of a trajectory of past behavior that are analogous to a chosen reference pattern, and Objective Cluster Analysis (OCA). Table 1 shows some data mining functions and the more appropriate self-organizing modeling algorithms for addressing these functions. ..."

### Table 1. Algorithms for self-organizing modeling (see [19] for this classification; columns: Data Mining functions, Algorithm)

"... In PAGE 3: ... In a wider sense, the spectrum of self-organizing modeling contains regression-based methods, rule-based methods, symbolic modeling, and nonparametric model selection methods. Table 1 shows some data mining functions and the more appropriate SOM algorithms for addressing these functions (FRI: fuzzy rule induction using GMDH; AC: Analog Complexing). ..."

### Table 1: Effectiveness of Pre-processing

"... In PAGE 8: ... Normalisation, the second stage of pre-processing, proved to have a significant impact on the results of the neural network. Without normalisation the network predicted results with a lower accuracy of around 55% and a narrow standard deviation of approximately 3% (Table 1). By contrast, with normalisation the results gave a much higher average accuracy in the range of 73%, but with a wider standard deviation of approximately 15%. ... In PAGE 12: ... 5 Mechanical Average-Based Prediction. Mechanical prediction was carried out to verify whether the network was simply carrying on the prevailing trend, as described in 3.4 (Table 10). Average Type / Average Success (%): 1-day 51.... In PAGE 12: ... 32, 15-day 57.53. Table 10: Simple Average Trend Prediction ..."
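
The normalisation step described in this excerpt can be sketched as a min-max rescaling of each input feature before it is fed to the network (the function name and the [0, 1] target range are assumptions; the excerpt does not specify the exact scheme used):

```python
import numpy as np

def min_max_normalise(x, eps=1e-12):
    """Rescale each feature column of x to [0, 1] (assumed scheme)."""
    lo = x.min(axis=0)
    hi = x.max(axis=0)
    # eps guards against a zero range when a feature is constant.
    return (x - lo) / (hi - lo + eps)

# Toy example: two input features on very different scales.
raw = np.array([[10.0, 200.0],
                [20.0, 400.0],
                [15.0, 300.0]])
scaled = min_max_normalise(raw)
```

Without a rescaling of this kind, features with large numeric ranges dominate the network's weighted sums, which is consistent with the lower accuracy the excerpt reports for the unnormalised runs.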

### Table 7: Kohonen self-organizing feature map of the iris data.

in Input Data Coding in Multivariate Data Analysis: Techniques and Practice in Correspondence Analysis

"... In PAGE 11: ..., 1997). Table 7 shows a Kohonen map of the original Fisher iris data. The user can trace a curve separating observation sequence numbers less than or equal to 50 (class 1), from 51 to 100 (class 2), and above 101 (class 3). ... In PAGE 11: ... The zero values indicate map nodes or units with no assigned observations. The map of Table 7, as for Table 8, has 20 × 20 units. The number of epochs used in training was 133 in both cases. ... version of the Fisher data. The user in this case can demarcate class 1 observations. Classes 2 and 3 are more confused, even if large contiguous "islands" can be found for both of these classes. The result shown in Table 8 is degraded compared to Table 7. It is not badly degraded, though, insofar as sub-classes of classes 2 and 3 are found in adjacent and contiguous areas. ..."
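
A minimal Kohonen SOM of the kind behind these maps can be sketched as follows. The 20 × 20 grid and 133 epochs come from the excerpt; the learning-rate and neighbourhood decay schedules are assumptions, and random data stands in for the Fisher iris measurements:

```python
import numpy as np

def train_som(data, grid=(20, 20), epochs=133, lr0=0.5, seed=0):
    """Train a Kohonen self-organizing map; returns the prototype grid."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.random((h, w, data.shape[1]))
    # Grid coordinates of every unit, used by the neighbourhood function.
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w),
                                  indexing="ij"), axis=-1)
    sigma0 = max(h, w) / 2.0
    for t in range(epochs):
        lr = lr0 * np.exp(-t / epochs)          # assumed decay schedule
        sigma = sigma0 * np.exp(-t / epochs)
        for x in data[rng.permutation(len(data))]:
            # Best-matching unit: smallest Euclidean distance to x.
            d = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # Gaussian neighbourhood around the BMU on the grid.
            g = np.exp(-((coords - np.array(bmu)) ** 2).sum(-1)
                       / (2.0 * sigma ** 2))
            weights += lr * g[..., None] * (x - weights)
    return weights

def bmu_of(weights, x):
    """Map an observation to its best-matching unit's grid location."""
    d = np.linalg.norm(weights - x, axis=-1)
    return np.unravel_index(np.argmin(d), d.shape)
```

Tracing class boundaries on the trained map, as the excerpt describes, amounts to computing `bmu_of` for each observation and inspecting which grid regions each class occupies.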

### Table 1. Packet description for the Kohonen Self-Organizing Map implementation.

2002

"... In PAGE 3: ... After having read the sensor values, each unit compares those values with its internal values, stored in the randomly initialised prototype vectors, and calculates the Euclidean distance between both vectors. A packet is then created and broadcast across the network with the elements listed in Table 1. The timestamp is provided to eliminate outdated packets. ..."
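
The per-reading behaviour of each unit can be sketched as below. The field names here are hypothetical stand-ins; the actual packet elements are those listed in Table 1 of the paper:

```python
import math
import time
from dataclasses import dataclass

@dataclass
class SomPacket:
    """Hypothetical packet layout; the real fields are given in Table 1."""
    unit_id: int
    distance: float   # Euclidean distance between sensor and prototype vectors
    timestamp: float  # lets receivers discard outdated packets

def euclidean(a, b):
    """Euclidean distance between two equal-length vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def make_packet(unit_id, sensor_values, prototype):
    """Compare a sensor reading with the unit's prototype and build a packet."""
    return SomPacket(unit_id, euclidean(sensor_values, prototype), time.time())
```

A receiver would compare the `timestamp` of an incoming packet against the current round and drop anything stale, matching the excerpt's stated purpose for the timestamp.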

Cited by 9

### Table 6. Robustness properties of distance functions used in self-organizing maps quantified in terms of the values of V (the results shown here are averaged over 100 experiments)

"... In PAGE 25: ... We regard it as a function of the standard deviation σ of the noise being added to the data: V(σ) = Σ_{k=1}^{N} |i(k) − i′(k)| + Σ_{k=1}^{N} |j(k) − j′(k)|, where (i′(k), j′(k)) is the location of the distorted (noisy) version of the k-th data point. It becomes evident from Table 6 that the Hamming distance yields a high level of robustness, as the values of V go up at a lower rate than for the two other distance functions. 5 Building information granules of software measures. In the previous section, we showed that the clusters grown through the merging of similar entries of the map consist of a certain number of software modules and exhibit a certain diversity in the values of the software measures. ..."
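
The robustness measure V sums, over all N data points, the displacement between each point's map location (i(k), j(k)) and the location (i′(k), j′(k)) of its distorted version. A minimal sketch, assuming a city-block displacement, which matches the excerpt's formula as far as it can be read:

```python
def displacement_V(original_locs, distorted_locs):
    """Sum of |i(k)-i'(k)| + |j(k)-j'(k)| over all data points (assumed reading)."""
    return sum(abs(i - ip) + abs(j - jp)
               for (i, j), (ip, jp) in zip(original_locs, distorted_locs))

# Two points: the first moves by one cell in each grid direction, the second stays put.
v = displacement_V([(0, 0), (4, 4)], [(1, 1), (4, 4)])
```

Averaging `v` over repeated noisy runs, as in Table 6 (100 experiments), gives the robustness values the caption refers to; a distance function is more robust when V grows more slowly with the noise level.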

Cited by 1

### TABLE I. THE EFFECT OF NETWORK SIZE FOR THE SELF-ORGANIZING MAP (SOM): SOM size versus detection results for three different sizes

### Table 3: Error rate of the face recognition system with varying number of dimensions in the self-organizing map. Each result given is the average of three simulations.

1997

Cited by 103

### Table 3: Error rate of the face recognition system with varying number of dimensions in the self-organizing map. Each result given is the average of three simulations.

1996

Cited by 15