### Table 1. Algorithms for self-organizing modeling

"... In PAGE 2: ... nonparametric models. Known nonparametric model selection methods include Analog Complexing (AC), which selects nonparametric prediction models from a given data set representing one or more patterns of a trajectory of past behavior that are analogous to a chosen reference pattern, and Objective Cluster Analysis (OCA). Table 1 shows some data mining functions and the more appropriate self-organizing modeling algorithms for addressing these functions. ..."

### TABLE 3. Parameters of the self-organization algorithm.

in Efficient publish/subscribe through a self-organizing broker overlay and its application to SIENA

Cited by 1

### Table I. Algorithms for self-organizing modeling

### Table 1. Packet description for the Kohonen Self-Organizing Map implementation.

2002

"... In PAGE 3: ... After reading the sensor values, each unit compares those values with its internal values, stored in the randomly initialised prototype vectors, and calculates the Euclidean distance between both vectors. A packet is then created and broadcast across the network with the elements listed in Table 1. The timestamp is provided to eliminate outdated packets. ..."
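The per-unit step the snippet describes can be sketched in a few lines: compute the Euclidean distance between the current sensor reading and the unit's prototype vector, then assemble a timestamped packet for broadcast. This is only an illustrative sketch; the field names (`unit_id`, `distance`, `timestamp`) and the dictionary packet format are assumptions, not the paper's actual packet layout from Table 1.

```python
import math
import time

def euclidean_distance(sensor_values, prototype):
    """Euclidean distance between a sensor reading and a prototype vector."""
    return math.sqrt(sum((s - p) ** 2 for s, p in zip(sensor_values, prototype)))

def build_packet(unit_id, sensor_values, prototype):
    """Assemble a broadcast packet; the timestamp lets receivers
    discard outdated packets, as the snippet describes."""
    return {
        "unit_id": unit_id,                                    # hypothetical field
        "distance": euclidean_distance(sensor_values, prototype),
        "timestamp": time.time(),                              # seconds since epoch
    }

packet = build_packet(7, [0.2, 0.5], [0.0, 0.5])
# packet["distance"] is 0.2 (within floating-point tolerance)
```

A receiver would compare the `timestamp` field against its own clock (or a sequence number, in a real deployment) before using the packet's distance value.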

Cited by 9

### Table 6.1: Contributions to the state of the art in ant-based clustering algorithms

### Table 1. Algorithms for self-organizing modeling (see [19] for this classification). Columns: Data Mining function, Algorithm

"... In PAGE 3: ... In a wider sense, the spectrum of self-organizing modeling contains regression-based methods, rule-based methods, symbolic modeling, and nonparametric model selection methods. Table 1 shows some data mining functions and the more appropriate self-organizing modeling algorithms for addressing these functions (FRI: fuzzy rule induction using GMDH; AC: Analog Complexing). ..."

### Table 4: Absolute IC difference before self-organization between sequential (SEQ) and random (RND) initialization.

"... In PAGE 7: ... 3.1 Results and Discussion: Table 4 shows a distinct improvement in absolute interference cost (IC) reduction across the wireless mesh region for different node densities. This improvement is obtained by using the proposed random initialization algorithm rather than the sequential algorithm, and it translates to an improvement in overall capacity. ..."

### Table 1: Mean square error for neuron weights and standard deviation for probability density estimates

6 Conclusions: A new integrally distributed self-organizing learning algorithm for the class of neural networks introduced by Kohonen [1] was presented. The algorithm converges to an equiprobable topology-preserving map for arbitrarily distributed input signals. It is shown that Kohonen's algorithm converges to a locally affine self-organizing map. Simulation results agree with theoretical predictions.

"... In PAGE 5: ... As expected, the results of the three algorithms are fairly similar for the case of a uniformly distributed input signal. Table 1 contains the mean square error for the neuron weights and the standard deviation of the probability density estimate vector, p̂, for both simulations. It should be noted that the improvement in performance comes at an increased computational cost. ..."
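The Kohonen-style learning that these snippets discuss can be sketched as a minimal one-dimensional SOM: for each input, find the best-matching neuron and pull it and its neighbours toward the input, with a decaying learning rate and shrinking neighbourhood. This is a textbook sketch under assumed decay schedules, not the distributed, equiprobable-convergence algorithm of the cited paper; all names and parameters here are illustrative.

```python
import random

def train_som(data, n_neurons=5, epochs=50, lr0=0.5, radius0=2.0, seed=0):
    """Minimal 1-D Kohonen SOM for scalar inputs in [0, 1].

    For each input x: find the best-matching unit (BMU), then move the
    BMU and neurons within the current neighbourhood radius toward x.
    Learning rate and radius decay linearly over epochs (assumed schedule).
    """
    rng = random.Random(seed)
    weights = [rng.random() for _ in range(n_neurons)]
    for epoch in range(epochs):
        frac = 1.0 - epoch / epochs
        lr = lr0 * frac                      # decaying learning rate
        radius = max(radius0 * frac, 0.5)    # shrinking neighbourhood
        for x in data:
            bmu = min(range(n_neurons), key=lambda i: abs(weights[i] - x))
            for i in range(n_neurons):
                if abs(i - bmu) <= radius:   # flat neighbourhood for brevity
                    weights[i] += lr * (x - weights[i])
    return sorted(weights)

rng = random.Random(1)
samples = [rng.random() for _ in range(200)]
codebook = train_som(samples)
# codebook holds 5 weights spread across [0, 1]
```

With uniformly distributed inputs, the sorted codebook approximates an equiprobable quantization of [0, 1], which is the property the cited conclusions prove for their algorithm; a Gaussian neighbourhood function is the more common choice in practice than the flat one used here.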

### TABLE III COMPARISON OF THE MOTIF SETS FROM SELF-ORGANIZING NEURAL NETWORK METHOD AND MEME METHOD, NOS = NUMBER OF SAMPLES IN THE MOTIF SET.

Cited by 1