Results 1–10 of 24
Magnification control in self-organizing maps and neural gas
 Neural Computation
, 2006
Abstract

Cited by 8 (5 self)
We consider different ways to control the magnification in self-organizing maps (SOM) and neural gas (NG). Starting from early approaches to magnification control in vector quantization, we then concentrate on different approaches for SOM and NG. We show that three structurally similar approaches can be applied to both algorithms: localized learning, concave-convex learning, and winner-relaxing learning. Thereby, the approach of concave-convex learning in SOM is extended to a more general description, whereas the concave-convex learning for NG is new. In general, the control mechanisms generate only slightly different behavior in the two neural algorithms. However, we emphasize that the NG results are valid for any data dimension, whereas in the SOM case the results hold only for the one-dimensional case.
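The three control schemes named above share the idea of biasing the update rule so that the prototype density follows a chosen power of the input density. As a rough illustration of the localized-learning variant, the sketch below scales the winner's learning rate by an estimated local density raised to a control exponent m; the win-count density estimate and all parameter values are our own assumptions, not the authors' formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D SOM with localized learning: the winner's learning rate is scaled
# by an estimated local input density raised to a control exponent m, which
# steers the magnification of the learned map (illustrative sketch only).
n_units = 20
weights = np.sort(rng.random((n_units, 1)), axis=0)
win_counts = np.ones(n_units)            # crude running density estimate

def train_step(x, eps=0.1, sigma=1.0, m=-0.5):
    d = np.linalg.norm(weights - x, axis=1)
    s = int(np.argmin(d))                # best-matching unit
    win_counts[s] += 1
    density = win_counts / win_counts.sum()
    eps_local = min(eps * density[s] ** m, 0.5)   # clipped for stability
    h = np.exp(-0.5 * ((np.arange(n_units) - s) / sigma) ** 2)
    weights[:] += eps_local * h[:, None] * (x - weights)

for _ in range(2000):
    train_step(rng.random(1))
```

With m = 0 the rule reduces to the standard SOM update; a nonzero m boosts or damps learning in rarely-winning regions and thereby shifts the map's magnification.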
Expanding Self-Organizing Map for Data Visualization and Cluster Analysis
 Information Sciences
, 2002
Abstract

Cited by 8 (6 self)
The Self-Organizing Map (SOM) is a powerful tool in the exploratory phase of data mining. It is capable of projecting high-dimensional data onto a regular, usually two-dimensional grid of neurons with good neighborhood preservation between the two spaces. However, due to the dimensional conflict, neighborhood preservation cannot always guarantee perfect topology preservation. In this paper, we establish an Expanding SOM (ESOM) that better preserves the topology between the two spaces. Besides the neighborhood relationship, our ESOM can detect and preserve an ordering relationship using an expanding mechanism. The computational complexity of the ESOM is comparable with that of the SOM. Our experimental results demonstrate that the ESOM constructs better mappings than the classic SOM, especially in terms of the topological error. Furthermore, clustering results generated by the ESOM are more accurate than those obtained by the SOM.
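The topological error used above as an evaluation criterion is commonly measured as the topographic error: the fraction of inputs whose two best-matching prototypes are not neighbors on the map grid. A minimal sketch for a one-dimensional chain of units (an assumed layout; this is the quality measure, not the ESOM algorithm itself):

```python
import numpy as np

def topographic_error(data, weights):
    """Fraction of inputs whose two closest prototypes are not adjacent
    units on a 1-D chain (a common topology-preservation measure)."""
    errors = 0
    for x in data:
        d = np.linalg.norm(weights - x, axis=1)
        first, second = np.argsort(d)[:2]
        if abs(int(first) - int(second)) > 1:   # not neighbors on the chain
            errors += 1
    return errors / len(data)

# A well-ordered chain of prototypes has zero topographic error on data
# drawn from the same interval; a scrambled chain does not.
grid = np.linspace(0.0, 1.0, 10)[:, None]
data = np.linspace(0.05, 0.95, 50)[:, None]
te_good = topographic_error(data, grid)
te_bad = topographic_error(data, grid[[0, 2, 4, 6, 8, 1, 3, 5, 7, 9]])
```

For a 2-D grid the adjacency test would compare grid coordinates instead of chain indices; the idea is unchanged.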
Mathematical Aspects of Neural Networks
 European Symposium on Artificial Neural Networks 2003
, 2003
Abstract

Cited by 6 (4 self)
In this tutorial paper on mathematical aspects of neural networks, we focus on two directions: on the one hand, we motivate standard mathematical questions and the well-studied theory of classical neural models used in machine learning; on the other hand, we collect some recent theoretical results (as of the beginning of 2003) in the respective areas. Thereby, we follow the dichotomy offered by the overall network structure and restrict ourselves to feedforward networks, recurrent networks, and self-organizing neural systems, respectively.
Mobile Robot Local Navigation with a Polar Neural Map
, 1998
Abstract

Cited by 6 (3 self)
Front-matter excerpt (lists of tables and figures): Table 1, algorithm for path construction; Table 2, algorithm for determining the next movement direction; Table 3, computational complexity of the system. Figures include: several mobile robots of the class considered in the thesis; the concept of a neural map; self-organization at the neural mapping level, with more neurons assigned to the 'important' areas of the signal space X; the basic nonlinear processing unit (neuron); the architecture of the subsystem for path planning with neural maps; different network topologies and connections for two-dimensional uniform coverage; connection weight as a function of distance; the nonlinear activation function; target and obstacle configurations with contours of the equilibrium surface; network equilibrium states of a 50×50 neural map for single and multiple targets; update rasters on a two-dimensional lattice.
AutoSOM: Recursive Parameter Estimation for Guidance of Self-Organizing Feature Maps
, 2001
Abstract

Cited by 4 (0 self)
In this article we present the AutoSOM: a method for automatic parameter estimation in the SOM, based on a linear Kalman filter extended by a recursive parameter estimation method. We demonstrate its effectiveness on examples, including a real application problem, and compare its performance with alternative versions of the SOM and with the GTM.
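For readers unfamiliar with the building block the abstract invokes, a scalar linear Kalman filter recursively fuses noisy measurements into a running estimate. The sketch below is a generic random-walk-state filter, not the authors' AutoSOM scheme; the noise variances q and r are illustrative assumptions.

```python
import numpy as np

# Generic scalar linear Kalman filter with a random-walk state model,
# of the kind the abstract invokes (not the authors' AutoSOM scheme).
# q is the assumed process-noise variance, r the measurement-noise variance.
def kalman_step(x_est, p_est, z, q=1e-4, r=1e-2):
    p_pred = p_est + q                   # predict: state is a random walk
    k = p_pred / (p_pred + r)            # Kalman gain
    x_new = x_est + k * (z - x_est)      # update with measurement z
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

# Track a constant "true parameter" of 1.0 through noisy observations.
rng = np.random.default_rng(1)
x_est, p_est = 0.0, 1.0
for _ in range(500):
    z = 1.0 + 0.1 * rng.standard_normal()
    x_est, p_est = kalman_step(x_est, p_est, z)
```

The estimate settles near the true value while the error variance p shrinks to a small steady state; a larger q would let the filter track a drifting parameter at the cost of noisier estimates.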
A Self-Organizing Map with Expanding Force for Data Clustering and Visualization
 In: Proc. of ICDM’02
, 2002
Abstract

Cited by 3 (2 self)
The Self-Organizing Map (SOM) is a powerful tool in the exploratory phase of data mining. However, due to the dimensional conflict, neighborhood preservation cannot always guarantee perfect topology preservation. In this paper, we establish an Expanding SOM (ESOM) to detect and preserve a better topological correspondence between the two spaces. Our experimental results demonstrate that the ESOM constructs better mappings than the classic SOM in terms of both the topological and the quantization errors. Furthermore, clustering results generated by the ESOM are more accurate than those obtained by the SOM.
Second-Order Learning in Self-Organizing Maps
 In: Kohonen Maps, pp. 293–302. Elsevier Science
, 1999
Abstract

Cited by 2 (0 self)
Introduction. The clever handling of control parameters plays an essential role in most learning algorithms. Manipulating the parameters in a convenient way not only may speed up the learning procedure itself but is often responsible for the success of the learning as such. In many learning procedures of practical interest, finding the correct learning parameters or cooling strategies is done by trial and error. In a few cases, strong theorems are known that formulate the parameter strategy explicitly, an example being the Robbins-Monro theorem for the case of stochastic gradient descent. Other examples may be found in [13]. Since these theorems refer to the asymptotic time behavior, they are of limited value for practical applications, where an initial learning period is often decisive for whether a meaningful solution is approached during the convergence phase. For this reason, a variety of empirical parameter-learning procedures has been invented. The self-learning of th ...
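The Robbins-Monro conditions referenced above require a learning-rate sequence whose sum diverges while the sum of its squares converges, e.g. eps_t = a/(t+b); the exponential cooling popular in practice has a summable rate sequence and so violates the divergence condition. A small sketch contrasting the two (function names and constants are our own, for illustration):

```python
# 1/t cooling satisfies the Robbins-Monro conditions: sum(eps_t) diverges
# while sum(eps_t**2) converges.  Exponential cooling, common in practice,
# has a summable rate sequence and so violates the divergence condition.
def eps_inverse(t, a=1.0, b=10.0):
    return a / (t + b)

def eps_exponential(t, eps0=0.5, eps_end=0.01, t_max=1000):
    return eps0 * (eps_end / eps0) ** (t / t_max)

# Partial sum of squared 1/t rates stays bounded (convergent series).
sq_partial = sum(eps_inverse(t) ** 2 for t in range(100_000))
```

In practice the exponential schedule is kept because training stops after finitely many steps, where its faster early decay often matters more than asymptotic guarantees.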
Evolving Trees for the Retrieval of ... (under consideration for publication in Knowledge and Information Systems)
, 2008
Abstract
In this paper we investigate the application of Evolving Trees to the analysis of mass spectrometric data of bacteria. Evolving Trees are extensions of Self-Organizing Maps developed for hierarchical classification systems; therefore, they are well suited for taxonomic problems such as the identification of bacteria. Here we focus on three topics: an appropriate preprocessing and encoding of the spectra, an adequate data model by means of a hierarchical Evolving Tree, and an interpretable visualization. First, the high dimensionality of the data is reduced to a compact representation. Here we employ sparse coding, specifically tailored to the processing of mass spectra. In the second step, the topographic information expected in the fingerprints is used for advanced tree evaluation and analysis. We adapted the original topographic product for Self-Organizing Maps to Evolving Trees to achieve a judgment of topography. Additionally, we transferred the concept of the U-matrix, used to evaluate the separability of Self-Organizing Maps, to its analog in Evolving Trees. We demonstrate these extensions on two mass spectrometric data sets of bacteria fingerprints and show their classification and evaluation capabilities in comparison with state-of-the-art techniques.
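Evolving Trees replace the SOM's flat winner search with a greedy descent through a tree of prototypes. The sketch below shows that descent in its generic form; the Node class and the toy one-dimensional "spectra" are hypothetical illustrations, not the authors' data structures.

```python
import numpy as np

# Hierarchical winner search of the kind Evolving Trees use: the best-matching
# leaf is found by descending from the root, at each level following the child
# whose prototype is closest to the input.  `Node` is a hypothetical helper.
class Node:
    def __init__(self, prototype, children=()):
        self.prototype = np.asarray(prototype, dtype=float)
        self.children = list(children)

def find_leaf(node, x):
    x = np.asarray(x, dtype=float)
    while node.children:
        node = min(node.children,
                   key=lambda c: float(np.linalg.norm(c.prototype - x)))
    return node

# A tiny two-level tree over 1-D "spectra": leaves cluster around 0 and 10.
root = Node([5.0], [
    Node([0.0], [Node([-1.0]), Node([1.0])]),
    Node([10.0], [Node([9.0]), Node([11.0])]),
])
```

Note that the greedy descent can miss the globally nearest leaf; the tree trades that exactness for a search cost logarithmic in the number of prototypes.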
unknown title
, 2001
Abstract
 Add to MetaCart
Continuous latent variable models for dimensionality reduction and sequential data reconstruction by