Results 1–10 of 17
Self-Organizing Maps: Ordering, Convergence Properties and Energy Functions
Biological Cybernetics, 1992
Abstract

Cited by 100 (2 self)
We investigate the convergence properties of the self-organizing feature map algorithm for a simple but very instructive case: the formation of a topographic representation of the unit interval [0, 1] by a linear chain of neurons. We extend the proofs of convergence of Kohonen and of Cottrell and Fort to hold in any case where the neighborhood function, which is used to scale the change in the weight values at each neuron, is a monotonically decreasing function of distance from the winner neuron. We prove that the learning dynamics cannot be described by gradient descent on a single energy function, but may be described using a set of potential functions, one for each neuron, which are independently minimized following a stochastic gradient descent. We derive the correct potential functions for the one- and multi-dimensional cases, and show that the energy functions given by Tolat (1990) are an approximation which is no longer valid in the case of highly disordered maps or steep neig...
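The setting this abstract describes, a linear chain of neurons mapping the unit interval with updates scaled by a neighborhood function that decreases monotonically with distance from the winner, can be sketched as follows (a minimal illustration; the parameter values and function name are hypothetical, not from the paper):

```python
import numpy as np

def train_1d_som(n_neurons=10, n_steps=5000, lr=0.1, sigma=2.0, seed=0):
    """Linear chain of neurons mapping [0, 1]; the neighborhood h is a
    monotonically decreasing (Gaussian) function of grid distance from
    the winner neuron. Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    w = rng.random(n_neurons)                  # one scalar weight per neuron
    for _ in range(n_steps):
        x = rng.random()                       # uniform sample from [0, 1]
        winner = int(np.argmin(np.abs(w - x)))
        d = np.abs(np.arange(n_neurons) - winner)
        h = np.exp(-(d ** 2) / (2 * sigma ** 2))
        w += lr * h * (x - w)                  # neighborhood-scaled update
    return w

weights = train_1d_som()
```

After training, the weights typically become nearly monotonically ordered along the chain, which is the topographic ordering whose emergence the paper analyzes.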
Self-Organizing Maps in Natural Language Processing
, 1997
Abstract

Cited by 38 (2 self)
Kohonen's Self-Organizing Map (SOM) is one of the most popular artificial neural network algorithms. Word category maps are SOMs that have been organized according to word similarities, measured by the similarity of the short contexts of the words. Conceptually interrelated words tend to fall into the same or neighboring map nodes. Nodes may thus be viewed as word categories. Although no a priori information about classes is given, a model of the word classes emerges during the self-organizing process. The central topic of the thesis is the use of the SOM in natural language processing. The approach based on word category maps is compared with methods that are widely used in artificial intelligence research. Modeling gradience, conceptual change, and the subjectivity of natural language interpretation are considered. The main application area is information retrieval and textual data mining, for which a specific SOM-based method called the WEBSOM has been developed. The WEBSOM metho...
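The short-context similarity behind word category maps can be sketched roughly like this: each word receives a fixed random code, and its context representation averages the codes of its left and right neighbors across the corpus. This is an illustrative simplification, not the thesis's exact construction; the function name is hypothetical:

```python
import numpy as np

def word_context_codes(tokens, dim=5, seed=0):
    """For each word, concatenate the average random code of its left
    neighbors with that of its right neighbors. Words appearing in
    similar short contexts get similar codes."""
    rng = np.random.default_rng(seed)
    vocab = sorted(set(tokens))
    code = {w: rng.standard_normal(dim) for w in vocab}
    left = {w: [] for w in vocab}
    right = {w: [] for w in vocab}
    for i, w in enumerate(tokens):
        if i > 0:
            left[w].append(code[tokens[i - 1]])
        if i < len(tokens) - 1:
            right[w].append(code[tokens[i + 1]])

    def avg(vs):
        return np.mean(vs, axis=0) if vs else np.zeros(dim)

    return {w: np.concatenate([avg(left[w]), avg(right[w])]) for w in vocab}
```

Feeding these context codes to a SOM is what groups conceptually interrelated words into the same or neighboring map nodes: words sharing identical contexts (e.g. "cat" and "dog" in "the cat sat / the dog sat") get identical codes.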
Exploration of Text Collections with Hierarchical Feature Maps
, 1997
Abstract

Cited by 38 (14 self)
Document classification is one of the central issues in information retrieval research. The aim is to uncover similarities between text documents; in other words, classification techniques are used to gain insight into the structure of the various data items contained in the text archive. In this paper we show the results of using a hierarchy of self-organizing maps to perform the text classification task. Each of the individual self-organizing maps is trained independently and becomes specialized to a subset of the input data. As a consequence, the choice of this particular artificial neural network model enables the establishment of a true document taxonomy. The benefit of this approach is a straightforward representation of document similarities combined with dramatically reduced training time. In particular, the hierarchical representation of document collections is appealing because it is the underlying organizational principle used by librarians, providing the necessary familiarity...
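The independent training of specialized maps can be sketched as a two-level hierarchy: a top map partitions the data, and a separate child map is trained on each top node's subset. This is a toy illustration with a tiny SOM as the building block, not the paper's actual architecture; names and parameters are hypothetical:

```python
import numpy as np

def train_som(data, n_nodes=2, steps=2000, lr=0.1, sigma=1.0, seed=0):
    """Tiny 1-D-grid SOM over vector data; building block only."""
    rng = np.random.default_rng(seed)
    w = data[rng.integers(0, len(data), n_nodes)].astype(float)
    for _ in range(steps):
        x = data[rng.integers(0, len(data))]
        win = int(np.argmin(np.linalg.norm(w - x, axis=1)))
        h = np.exp(-((np.arange(n_nodes) - win) ** 2) / (2 * sigma ** 2))
        w += lr * h[:, None] * (x - w)
    return w

def hierarchical_som(data, n_top=2, n_child=2):
    """Two-level hierarchy: assign each sample to its nearest top node,
    then train an independent child map on each node's subset."""
    top = train_som(data, n_top)
    assign = np.argmin(
        np.linalg.norm(data[:, None, :] - top[None, :, :], axis=2), axis=1)
    children = [train_som(data[assign == i], n_child, seed=i + 1)
                for i in range(n_top) if np.any(assign == i)]
    return top, children
```

Because each child map only ever sees its own subset, training is parallelizable and much shorter per map, which is the training-time reduction the abstract points to.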
On the Analysis of Pattern Sequences by Self-Organizing Maps
, 1994
Abstract

Cited by 29 (0 self)
This thesis is organized in three parts. In the first part, the Self-Organizing Map algorithm is introduced. The discussion focuses on the analysis of the Self-Organizing Map algorithm. It is shown that the nonlinear nature of the algorithm makes it difficult to analyze except in some trivial cases. In the second part the Self-Organizing Map algorithm is applied to several pattern-sequence analysis tasks. The first application is a voice quality analysis system. It is shown that the Self-Organizing Map algorithm can be applied to voice analysis by providing visualization of certain deviations. The key point in the applicability of the Self-Organizing Map algorithm is the topological nature of the mapping; similar voice samples are mapped to nearby locations on the map. The second application is a speech recognition system. Through several experiments it is demonstrated that by collecting some time-dependent features and using them in conjunction with the basic Self-Organ...
Using Self-Organizing Maps and Learning Vector Quantization for Mixture Density Hidden Markov Models
, 1997
Abstract

Cited by 20 (8 self)
This work presents experiments on recognizing pattern sequences using hidden Markov models (HMMs). The pattern sequences in the experiments are computed from speech signals, and the recognition task is to decode the corresponding phoneme sequences. Training the HMMs of the phonemes on the collected speech samples is a difficult task because of the natural variation in speech. Two neural computing paradigms, the Self-Organizing Map (SOM) and Learning Vector Quantization (LVQ), are used in the experiments to improve the recognition performance of the models. An HMM consists of sequential states which are trained to model the feature changes in the signal produced during the modeled process. The output densities applied in this work are mixtures of Gaussian density functions. SOMs are applied to initialize and train the mixtures to give a smooth and faithful representation of the feature vector space defined by the corresponding training samples. The SOM maps similar feature vect...
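The LVQ side of this approach relies on supervised codebook updates. A minimal LVQ1 step, in its standard formulation (not necessarily the exact variant used in the paper; names are illustrative), looks like:

```python
import numpy as np

def lvq1_step(codebook, labels, x, y, lr=0.05):
    """One LVQ1 update: move the nearest codebook vector toward the
    sample x if its label matches y, and away from it otherwise."""
    i = int(np.argmin(np.linalg.norm(codebook - x, axis=1)))
    sign = 1.0 if labels[i] == y else -1.0
    codebook[i] += sign * lr * (x - codebook[i])
    return i

cb = np.array([[0.0, 0.0], [1.0, 1.0]])
winner = lvq1_step(cb, [0, 1], np.array([0.1, 0.1]), 0)
```

The supervised pull/push refines class boundaries of a codebook that, in the paper's pipeline, would first be initialized by a SOM.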
Linear Geometric ICA: Fundamentals and Algorithms
, 2003
Abstract

Cited by 19 (10 self)
Geometric algorithms for linear independent component analysis (ICA) have recently received some attention due to their pictorial description and their relative ease of implementation. The geometric approach to ICA was first proposed by Puntonet and Prieto (1995). We reconsider geometric ICA in a theoretical framework, showing that fixed points of geometric ICA fulfill a geometric convergence condition (GCC), which the mixed images of the unit vectors also satisfy. This leads to a conjecture claiming that in the non-Gaussian, unimodal, symmetric case there is only one stable fixed point, implying the uniqueness of geometric ICA after convergence. Guided by the principles of ordinary geometric ICA, we then present a new approach to linear geometric ICA based on histograms, observing a considerable improvement in the separation quality of different distributions and a sizable reduction in computational cost, by a factor of 100, compared to the ordinary geometric approach. Furthermore, we explore the accuracy of the algorithm depending on the number of samples and the choice of the mixing matrix, and compare geometric algorithms with classical ICA algorithms, namely Extended Infomax and FastICA. Finally, we discuss the problem of high-dimensional data sets within the realm of geometric ICA algorithms.
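A rough sketch of the competitive update underlying ordinary geometric ICA in two dimensions, assuming each mixed sample is projected onto the unit circle and antipodal directions are identified: this is an illustrative simplification of the "pictorial" learning rule, not the paper's histogram-based algorithm, and the function name is hypothetical.

```python
import numpy as np

def geo_ica_step(w, x, lr=0.05):
    """One competitive update: project the mixed sample x onto the unit
    circle, pick the stored unit direction most aligned with it (up to
    sign), and rotate that direction slightly toward the sample."""
    x = x / np.linalg.norm(x)
    i = int(np.argmax(np.abs(w @ x)))      # nearest direction, sign-blind
    target = x if w[i] @ x >= 0 else -x    # identify antipodal points
    w[i] += lr * (target - w[i])
    w[i] /= np.linalg.norm(w[i])           # stay on the unit circle
    return i
```

At a fixed point of this iteration the stored directions stop moving on average, which is the kind of condition the geometric convergence condition (GCC) formalizes.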
Organization measures and representations of the Kohonen maps
, 1992
Abstract

Cited by 5 (2 self)
Kohonen's algorithm is known and used to map an input space automatically with a grid of neurons. When this space is high-dimensional, it becomes difficult to analyse the state of the network, because it is no longer possible to represent the network in the weight space. In this paper, we survey some existing techniques used to analyse the network state and show their limitations. We also present a new method that provides useful information about the degree of organization even in high-dimensional spaces.

Keywords: Kohonen network, high-dimensional spaces, organization.

1 Introduction. In 1982, T. Kohonen proposed an original neural algorithm which realizes the quantization of an input space that can be continuous or discrete and of arbitrary dimension [4]. Inspired by biological observations on self-organization (especially on the mechanisms of retinotopic map formation), this algorithm does not attempt to model all the phenomena that produce this self-organiza...
SOMICA: an application of self-organizing maps to geometric independent component analysis
In Proc. of IJCNN 2003
Abstract

Cited by 3 (3 self)
Guided by the principles of geometric independent component analysis (ICA), we present a new approach (SOMICA) to linear geometric ICA using a self-organizing map (SOM). We observe a considerable improvement in the separation quality of different distributions, albeit at high computational cost. The SOMICA algorithm is therefore primarily interesting from a theoretical point of view, bringing together ICA and SOMs; this intersection could lead to new proofs in geometric ICA based on similar theorems in SOM theory.
A Theoretical Framework for Overcomplete Geometric BMMR
In Proc. of SIP 2002
Abstract

Cited by 3 (3 self)
Geometric algorithms for linear quadratic independent component analysis (ICA) have recently received some attention due to their pictorial description and their relative ease of implementation. The geometric approach to ICA was first proposed by Puntonet and Prieto [15] [17] in order to separate linear mixtures. Recently it has been generalized to overcomplete cases (overcomplete geoICA), with more sources than sensors [21]. Here, we put this algorithm in the two-step framework from [20]. We generalize the geometric theory of the quadratic case from [19] to the overcomplete case, showing that fixed points of geometric ICA fulfill a so-called geometric convergence condition, which the mixed images of the unit vectors also satisfy. This leads to a conjecture claiming that in the super-Gaussian, unimodal, symmetric case there is only one stable fixed point, thus demonstrating the uniqueness of overcomplete geoICA after convergence.
Document Classification with Unsupervised Artificial Neural Networks
In F. Crestani & G. Pasi (Eds.), Soft Computing in Information Retrieval (pp. 102–121). Würzburg (Wien): Physica-Verlag
, 2000
Abstract

Cited by 3 (0 self)
Text collections may be regarded as an almost perfect application arena for unsupervised neural networks, because many operations computers have to perform on text documents are classification tasks based on noisy patterns. In particular we rely on self-organizing maps, which produce a map of the document space after their training process. From geography, however, it is known that maps are not always the best way to represent information spaces. For most applications it is better to provide a hierarchical view of the underlying data collection in the form of an atlas where, starting from a map representing the complete data collection, different regions are shown at finer levels of granularity. Using an atlas, the user can easily "zoom" into regions of particular interest while still having general maps for overall orientation. We show that a similar display can be obtained by using hierarchical feature maps to represent the contents of a document archive. These neural networks have a layered architecture where each layer consists of a number of individual self-organizing maps. In this way, the contents of the text archive may be represented at arbitrary levels of detail while the general maps remain available for global orientation.