Results 1–10 of 140
A Growing Neural Gas Network Learns Topologies
 Advances in Neural Information Processing Systems 7
, 1995
"... An incremental network model is introduced which is able to learn the important topological relations in a given set of input vectors by means of a simple Hebblike learning rule. In contrast to previous approaches like the "neural gas" method of Martinetz and Schulten (1991, 1994), this m ..."
Abstract

Cited by 322 (5 self)
An incremental network model is introduced which is able to learn the important topological relations in a given set of input vectors by means of a simple Hebb-like learning rule. In contrast to previous approaches like the "neural gas" method of Martinetz and Schulten (1991, 1994), this model has no parameters which change over time and is able to continue learning, adding units and connections, until a performance criterion has been met. Applications of the model include vector quantization, clustering, and interpolation.
1 INTRODUCTION
In unsupervised learning settings only input data is available but no information on the desired output. What can the goal of learning be in this situation? One possible objective is dimensionality reduction: finding a low-dimensional subspace of the input vector space containing most or all of the input data. Linear subspaces with this property can be computed directly by principal component analysis or iteratively with a number of network models (S...
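The Hebb-like rule the abstract refers to can be sketched in a few lines: on each input, the two nearest units are connected by a fresh edge, the winner and its topological neighbours move toward the input, and edges that stay unused grow old and are pruned. The sketch below shows one such adaptation step; function and parameter names (`gng_step`, `eps_b`, `eps_n`, `age_max`) and all values are illustrative, not taken from the paper, and the periodic unit-insertion step is omitted.

```python
def gng_step(units, edges, errors, x, eps_b=0.2, eps_n=0.006, age_max=50):
    """One adaptation step of a growing-neural-gas-style network.

    units:  list of weight vectors (lists of floats)
    edges:  dict mapping frozenset({i, j}) -> edge age
    errors: per-unit accumulated squared error
    Parameter values are illustrative, not the paper's.
    """
    dist2 = lambda w: sum((wi - xi) ** 2 for wi, xi in zip(w, x))
    # Find the two units nearest to the input x.
    s1, s2 = sorted(range(len(units)), key=lambda i: dist2(units[i]))[:2]
    errors[s1] += dist2(units[s1])
    # Move the winner toward x.
    units[s1] = [w + eps_b * (xi - w) for w, xi in zip(units[s1], x)]
    # Age the winner's edges and drag its topological neighbours along.
    for e in list(edges):
        if s1 in e:
            edges[e] += 1
            (j,) = e - {s1}
            units[j] = [w + eps_n * (xi - w) for w, xi in zip(units[j], x)]
    # Hebb-like rule: connect the two nearest units with a fresh edge.
    edges[frozenset((s1, s2))] = 0
    # Drop edges that have grown too old.
    for e in [e for e, age in edges.items() if age > age_max]:
        del edges[e]
    return s1
```

Periodic insertion of a new unit between the highest-error unit and its worst neighbour, plus removal of units left without edges, would complete the algorithm.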
Growing Cell Structures: A Self-Organizing Network for Unsupervised and Supervised Learning
 Neural Networks
, 1993
"... We present a new selforganizing neural network model having two variants. The first variant performs unsupervised learning and can be used for data visualization, clustering, and vector quantization. The main advantage over existing approaches, e.g., the Kohonen feature map, is the ability of the m ..."
Abstract

Cited by 275 (11 self)
We present a new self-organizing neural network model having two variants. The first variant performs unsupervised learning and can be used for data visualization, clustering, and vector quantization. The main advantage over existing approaches, e.g., the Kohonen feature map, is the ability of the model to automatically find a suitable network structure and size. This is achieved through a controlled growth process which also includes occasional removal of units. The second variant of the model is a supervised learning method which results from the combination of the above-mentioned self-organizing network with the radial basis function (RBF) approach. In this model it is possible, in contrast to earlier approaches, to perform the positioning of the RBF units and the supervised training of the weights in parallel. Therefore, the current classification error can be used to determine where to insert new RBF units. This leads to small networks which generalize very well. Results on the t...
Clustering of the Self-Organizing Map
, 2000
"... The selforganizing map (SOM) is an excellent tool in exploratory phase of data mining. It projects input space on prototypes of a lowdimensional regular grid that can be effectively utilized to visualize and explore properties of the data. When the number of SOM units is large, to facilitate quant ..."
Abstract

Cited by 192 (1 self)
The self-organizing map (SOM) is an excellent tool in the exploratory phase of data mining. It projects the input space onto prototypes of a low-dimensional regular grid that can be effectively utilized to visualize and explore properties of the data. When the number of SOM units is large, similar units need to be grouped, i.e., clustered, to facilitate quantitative analysis of the map and the data. In this paper, different approaches to clustering of the SOM are considered. In particular, the use of hierarchical agglomerative clustering and partitive clustering using k-means is investigated. The two-stage procedure, in which the SOM is first used to produce the prototypes that are then clustered in the second stage, is found to perform well when compared with direct clustering of the data and to reduce the computation time.
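The second stage of the two-stage procedure can be sketched directly: assume the SOM has already produced a list of prototype vectors, then a plain k-means pass groups them. The `kmeans` helper below is a generic textbook implementation, not the paper's code, and the SOM training that produces the prototypes is omitted.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means, used here to cluster SOM prototype vectors
    (the second stage of the two-stage procedure). Illustrative only."""
    rng = random.Random(seed)
    centers = [list(c) for c in rng.sample(points, k)]

    def nearest(p):
        return min(range(k),
                   key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centers[i])))

    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:                   # assign prototypes to centers
            groups[nearest(p)].append(p)
        for i, g in enumerate(groups):     # move centers to group means
            if g:
                centers[i] = [sum(col) / len(g) for col in zip(*g)]
    return centers, [nearest(p) for p in points]
```

Because only the (few) prototypes are clustered rather than every data point, this stage is cheap, which is the computational advantage the abstract reports.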
Curvilinear Component Analysis: A Self-Organizing Neural Network for Nonlinear Mapping of Data Sets
, 1997
"... We present a new strategy called “curvilinear component analysis” (CCA) for dimensionality reduction and representation of multidimensional data sets. The principle of CCA is a selforganized neural network performing two tasks: vector quantization (VQ) of the submanifold in the data set (input spac ..."
Abstract

Cited by 168 (1 self)
We present a new strategy called “curvilinear component analysis” (CCA) for dimensionality reduction and representation of multidimensional data sets. The principle of CCA is a self-organized neural network performing two tasks: vector quantization (VQ) of the submanifold in the data set (input space) and nonlinear projection (P) of these quantizing vectors toward an output space, providing a revealing unfolding of the submanifold. After learning, the network has the ability to continuously map any new point from one space into another: forward mapping of new points in the input space, or backward mapping of an arbitrary position in the output space.
Data Exploration Using Self-Organizing Maps
 ACTA POLYTECHNICA SCANDINAVICA: MATHEMATICS, COMPUTING AND MANAGEMENT IN ENGINEERING SERIES NO. 82
, 1997
"... Finding structures in vast multidimensional data sets, be they measurement data, statistics, or textual documents, is difficult and timeconsuming. Interesting, novel relations between the data items may be hidden in the data. The selforganizing map (SOM) algorithm of Kohonen can be used to aid the ..."
Abstract

Cited by 107 (4 self)
Finding structures in vast multidimensional data sets, be they measurement data, statistics, or textual documents, is difficult and time-consuming. Interesting, novel relations between the data items may be hidden in the data. The self-organizing map (SOM) algorithm of Kohonen can be used to aid the exploration: the structures in the data sets can be illustrated on special map displays. In this work, the methodology of using SOMs for exploratory data analysis or data mining is reviewed and developed further. The properties of the maps are compared with the properties of related methods intended for visualizing high-dimensional multivariate data sets. In a set of case studies the SOM algorithm is applied to analyzing electroencephalograms, to illustrating structures of the standard of living in the world, and to organizing full-text document collections. Measures are proposed for evaluating the quality of different types of maps in representing a given data set, and for measuring the robu...
Quantifying the Neighborhood Preservation of Self-Organizing Feature Maps
 IEEE TRANSACTIONS ON NEURAL NETWORKS
, 1992
"... Neighborhood preservation from input space to output space is an essential element of selforganizing feature maps like the Kohonenmap. However, a measure for the preservation or violation of neighborhood relations, which is more systematic than just visual inspection of the map, was lacking. We sho ..."
Abstract

Cited by 90 (3 self)
Neighborhood preservation from input space to output space is an essential element of self-organizing feature maps like the Kohonen map. However, a measure for the preservation or violation of neighborhood relations, one more systematic than visual inspection of the map, was lacking. We show that a topographic product P, first introduced in nonlinear dynamics, is an appropriate measure in this regard. It is sensitive to large-scale violations of the neighborhood ordering, but does not account for neighborhood ordering distortions due to varying areal magnification factors. A vanishing value of the topographic product indicates perfect neighborhood preservation; negative (positive) values indicate an output space dimensionality that is too small (too large). In a simple example of maps from a 2D input space onto 1D, 2D, and 3D output spaces we demonstrate how the topographic product picks the correct output space dimensionality. In a second example we map 19D speech data onto various output spaces and find that a 3D output space (instead of 2D) seems to be optimally suited to the data. This is in agreement with a recent speech recognition experiment on the same data set.
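The idea of measuring neighborhood preservation can be illustrated with a much simpler proxy than the topographic product itself (P involves ratios of distance orderings in both spaces; see the paper for its exact definition). The sketch below merely checks, for each unit, how much its set of nearest neighbours in input space overlaps with its nearest neighbours in output (grid) space; a value of 1.0 means the orderings agree locally.

```python
def rank_agreement(inputs, outputs, kn=2):
    """Simplified neighbourhood-preservation check, in the spirit of
    (but not identical to) the topographic product: for each unit,
    compare its kn nearest neighbours in input space with those in
    output space and return the average overlap in [0, 1]."""
    def knn(points, i):
        # Squared distance from unit i to unit j within one space.
        d = lambda j: sum((a - b) ** 2 for a, b in zip(points[i], points[j]))
        return set(sorted((j for j in range(len(points)) if j != i), key=d)[:kn])
    n = len(inputs)
    overlap = sum(len(knn(inputs, i) & knn(outputs, i)) for i in range(n))
    return overlap / (n * kn)
```

Unlike P, this score cannot distinguish whether a mismatch stems from too small or too large an output dimensionality; it only flags that the local orderings disagree.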
Sensory-Motor Primitives as a Basis for Imitation: Linking Perception to Action and Biology to Robotics
 Imitation in Animals and Artifacts
, 2000
"... ing away from the specific coding of the spinal fields, the examples from neurobiology provide the framework for a motor control system based on a small number of additive primitives (or basis behaviors) sufficient for a rich output movement repertoire. Our previous work (Matari'c 1995, Matari& ..."
Abstract

Cited by 80 (18 self)
Abstracting away from the specific coding of the spinal fields, the examples from neurobiology provide the framework for a motor control system based on a small number of additive primitives (or basis behaviors) sufficient for a rich output movement repertoire. Our previous work (Matarić 1995, Matarić 1997), inspired by the same biological results, has successfully applied the idea of basis behaviors to control of mobile robots by fitting it directly into the modular behavior-based control paradigm. Applications of schema theory (Arbib 1992) to behavior-based mobile robots (Arkin 1987) have employed a similar notion of composable behaviors, stemming from foundations in neuroscience (Arbib 1981, Arbib 1989). The idea of using such primitives for articulator control has recently been studied in robotics. Williamson (1996) and Marjanović, Scassellati & Williamson (1996) developed a 6 DOF (degrees of freedom) robot arm controller. While in the biological and mobile robotics work primitives c...
Fast Learning with Incremental RBF Networks
 Neural Processing Letters
, 1994
"... We present a new algorithm for the construction of radial basis function (RBF) networks. The method uses accumulated error information to determine where to insert new units. The diameter of the localized units is chosen based on the mutual distances of the units. To have the distance information al ..."
Abstract

Cited by 42 (7 self)
We present a new algorithm for the construction of radial basis function (RBF) networks. The method uses accumulated error information to determine where to insert new units. The diameter of the localized units is chosen based on the mutual distances of the units. To have the distance information always available, it is kept up to date by a Hebbian learning rule adapted from the "Neural Gas" algorithm. The new method has several advantages over existing methods and is able to generate small, well-generalizing networks with comparably few sweeps through the training data.
1 Introduction
We propose a new algorithm for the construction of radial basis function (RBF) networks. An RBF network consists of a number of localized units (we use Gaussians in the following) positioned in input vector space. The localized units are completely connected by a set of weighted connections to a set of output units. The output units simply do a summation over their inputs. Networks of this type and vari...
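The two ingredients the abstract names, Gaussian units positioned in input space and widths derived from the units' mutual distances, can be sketched as follows. The paper keeps the distance information current with a Hebbian rule; the helper below simply recomputes nearest-neighbour distances, which is simpler but costlier, and both function names are our own.

```python
import math

def widths_from_neighbors(centers):
    """Size each unit by the distance to its nearest other center,
    echoing the idea of choosing diameters from mutual distances.
    (Recomputed here from scratch rather than maintained Hebbian-style.)"""
    def d(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))
    return [min(d(c, o) for j, o in enumerate(centers) if j != i)
            for i, c in enumerate(centers)]

def rbf_activations(centers, widths, x):
    """Response of each Gaussian unit to the input x; the (linear)
    output units would simply sum these, weighted."""
    return [math.exp(-sum((ci - xi) ** 2 for ci, xi in zip(c, x)) / (2 * w ** 2))
            for c, w in zip(centers, widths)]
```

Sizing a unit by its nearest-neighbour distance makes isolated units broad and densely packed units narrow, so coverage of the input space stays roughly even as units are inserted.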
Dynamic self-organizing maps with controlled growth for knowledge discovery
 IEEE Transactions on Neural Networks
, 2000
"... Abstract—The growing selforganizing map (GSOM) has been presented as an extended version of the selforganizing map (SOM), which has significant advantages for knowledge discovery applications. In this paper, the GSOM algorithm is presented in detail and the effect of a spread factor, which can be ..."
Abstract

Cited by 40 (1 self)
The growing self-organizing map (GSOM) has been presented as an extended version of the self-organizing map (SOM), which has significant advantages for knowledge discovery applications. In this paper, the GSOM algorithm is presented in detail and the effect of a spread factor, which can be used to measure and control the spread of the GSOM, is investigated. The spread factor is independent of the dimensionality of the data and as such can be used as a controlling measure for generating maps with different dimensionality, which can then be compared and analyzed with better accuracy. The spread factor is also presented as a method of achieving hierarchical clustering of a data set with the GSOM. Such hierarchical clustering allows the data analyst to identify significant and interesting clusters at a higher level of the hierarchy, and as such to continue with finer clustering of only the interesting clusters. Therefore, only a small map is created in the beginning with a low spread factor, which can be generated for even a very large data set. Further analysis is conducted on selected sections of the data, and as such of smaller volume. Thus, this method facilitates the analysis of even very large data sets.
Index Terms—Clustering methods, hierarchical systems, knowledge discovery, neural networks, self-organizing feature maps, unsupervised learning.
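In the GSOM literature the spread factor enters through a growth threshold, commonly given as GT = -D · ln(SF), where D is the data dimensionality and SF the spread factor in (0, 1): a node whose accumulated error exceeds GT triggers growth. The sketch below assumes that formula; consult the paper for the exact definition and usage.

```python
import math

def growth_threshold(dim, spread_factor):
    """Growth threshold GT = -dim * ln(SF) for a GSOM-style map
    (assumed formula). A lower spread factor yields a higher
    threshold, hence less growth and a smaller, coarser map."""
    if not 0 < spread_factor < 1:
        raise ValueError("spread factor must be in (0, 1)")
    return -dim * math.log(spread_factor)
```

This is what makes the hierarchical workflow in the abstract possible: start with a low SF (high GT) to get a small overview map, then rerun on interesting clusters with a higher SF (lower GT) to obtain finer maps.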
A Self-Organising Network That Grows When Required
, 2002
"... The ability to grow extra nodes is a potentially useful facility for a selforganising neural network. A network that can add nodes into its map space can approximate the input space more accurately, and often more parsimoniously, than a network with predefined structure and size, such as the SelfO ..."
Abstract

Cited by 36 (6 self)
The ability to grow extra nodes is a potentially useful facility for a self-organising neural network. A network that can add nodes into its map space can approximate the input space more accurately, and often more parsimoniously, than a network with predefined structure and size, such as the Self-Organising Map. In addition, a growing network can deal with dynamic input distributions. Most of the growing networks that have been proposed in the literature add new nodes to support the node that has accumulated the highest error during previous iterations or to support topological structures. This usually means that new nodes are added only when the number of iterations is an integer multiple of some predefined constant,
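The contrast the abstract draws, growing when required rather than at fixed iteration intervals, comes down to an input-driven insertion test: add a node when the best-matching unit responds weakly to the current input even though it is already well trained. The sketch below follows that general grow-when-required scheme; the exponential activity function, the habituation counter, and both thresholds are illustrative stand-ins, not the paper's exact formulation.

```python
import math

def should_grow(best_weight, x, habituation, a_thresh=0.8, h_thresh=0.1):
    """Grow-when-required-style insertion test (illustrative values):
    grow when the best-matching unit both matches the input poorly
    (low activity) and has already fired often (low habituation)."""
    dist = math.sqrt(sum((w - xi) ** 2 for w, xi in zip(best_weight, x)))
    activity = math.exp(-dist)          # near 1 for a close match
    return activity < a_thresh and habituation < h_thresh
```

Because the test fires on any sufficiently novel input, nodes can appear at any step, not just every N iterations, which is what lets the network track dynamic input distributions.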