Curvilinear Component Analysis: A Self-Organizing Neural Network for Nonlinear Mapping of Data Sets
, 1997
Cited by 148 (1 self)
We present a new strategy called “curvilinear component analysis” (CCA) for dimensionality reduction and representation of multidimensional data sets. The principle of CCA is a self-organized neural network performing two tasks: vector quantization (VQ) of the submanifold in the data set (input space) and nonlinear projection (P) of these quantizing vectors toward an output space, providing a revealing unfolding of the submanifold. After learning, the network has the ability to continuously map any new point from one space into another: forward mapping of new points in the input space, or backward mapping of an arbitrary position in the output space.
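The two-stage design described in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: plain k-means stands in for the VQ stage, and the projection stage uses CCA's pinned-point update with an illustrative fixed neighbourhood radius and a linearly decaying step size.

```python
# Sketch of CCA's two stages: vector quantization, then a projection that
# preserves small input-space distances.  All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a noisy 1-D curve embedded in 3-D.
t = rng.uniform(0, 3 * np.pi, 400)
X = np.column_stack([np.cos(t), np.sin(t), t / 3]) + 0.01 * rng.normal(size=(400, 3))

# Stage 1: vector quantization (k-means stands in for the paper's VQ).
k = 30
centroids = X[rng.choice(len(X), k, replace=False)]
for _ in range(20):
    labels = ((X[:, None, :] - centroids[None]) ** 2).sum(-1).argmin(axis=1)
    for j in range(k):
        if np.any(labels == j):
            centroids[j] = X[labels == j].mean(axis=0)

# Stage 2: nonlinear projection of the quantizing vectors to 2-D.  In each
# step one unit is pinned and all others move so their output distance d
# approaches the input distance D, weighted by F(d) = 1 if d < radius.
D = np.sqrt(((centroids[:, None, :] - centroids[None]) ** 2).sum(-1))
Y = rng.normal(size=(k, 2))
radius = np.median(D)
for epoch in range(200):
    alpha = 0.3 * (1.0 - epoch / 200)
    i = rng.integers(k)
    d = np.sqrt(((Y - Y[i]) ** 2).sum(-1))
    d[i] = 1.0                          # avoid division by zero for the pinned unit
    w = (d < radius).astype(float)      # neighbourhood weighting F
    step = alpha * w[:, None] * ((D[i] - d) / d)[:, None] * (Y - Y[i])
    step[i] = 0.0
    Y += step
```

After training, `Y` holds 2-D coordinates for the quantizing vectors; the abstract's forward/backward mapping of new points would interpolate between these.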
Vector Quantization with Complexity Costs
, 1993
Cited by 54 (18 self)
Vector quantization is a data compression method where a set of data points is encoded by a reduced set of reference vectors, the codebook. We discuss a vector quantization strategy which jointly optimizes distortion errors and the codebook complexity, thereby determining the size of the codebook. A maximum entropy estimation of the cost function yields an optimal number of reference vectors, their positions, and their assignment probabilities. The dependence of the codebook density on the data density for different complexity functions is investigated in the limit of asymptotic quantization levels. How different complexity measures influence the efficiency of vector quantizers is studied for the task of image compression, i.e., we quantize the wavelet coefficients of gray-level images and measure the reconstruction error. Our approach establishes a unifying framework for different quantization methods like K-means clustering and its fuzzy version, entropy-constrained vector quantizati...
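One special case the abstract mentions, entropy-constrained vector quantization, illustrates how a complexity cost determines codebook size: the assignment rule charges each codeword its codelength, so rarely used codewords are priced out. A minimal sketch with an illustrative rate penalty `lam`:

```python
# Entropy-constrained VQ sketch: cost = distortion + lam * codelength.
# The penalty prunes rarely used codewords, shrinking the effective codebook.
import numpy as np

rng = np.random.default_rng(1)
X = np.concatenate([rng.normal(m, 0.1, size=(200, 2)) for m in (-1.0, 0.0, 1.0)])

k, lam = 8, 0.5
C = X[rng.choice(len(X), k, replace=False)]      # initial codebook
p = np.full(k, 1.0 / k)                          # codeword probabilities

for _ in range(30):
    # Assign each point to the codeword minimizing distortion - lam * log p.
    cost = ((X[:, None, :] - C[None]) ** 2).sum(-1) - lam * np.log(p + 1e-12)
    a = cost.argmin(axis=1)
    for j in range(k):
        if np.any(a == j):
            C[j] = X[a == j].mean(axis=0)        # centroid update
    p = np.bincount(a, minlength=k) / len(X)     # re-estimate probabilities

active = int((p > 0).sum())   # codewords that survived the complexity cost
```

Setting `lam = 0` recovers plain K-means; increasing it trades distortion for a smaller effective codebook, which is the trade-off the paper formalizes.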
Controlling the Magnification Factor of Self-Organizing Feature Maps
, 1995
Cited by 41 (7 self)
The magnification exponents μ occurring in adaptive map formation algorithms like Kohonen's self-organizing feature map deviate from the information-theoretically optimal value μ = 1 as well as from the values which optimize, e.g., the mean square distortion error (μ = 1/3 for one-dimensional maps). At the same time, models for categorical perception such as the "perceptual magnet" effect, which are based on topographic maps, require negative magnification exponents μ < 0. We present an extension of the self-organizing feature map algorithm which utilizes adaptive local learning step sizes to actually control the magnification properties of the map. By change of a single parameter, maps with optimal information transfer, with various minimal reconstruction errors, or with an inverted magnification can be generated. Analytic results on this new algorithm are complemented by numerical simulations. 1. Introduction The representation of information in topographic maps is a common property of...
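The mechanism of adaptive local learning step sizes can be sketched as follows. This is an illustrative toy, not the paper's algorithm: the winner's step size is scaled by a power of a leaky firing-rate estimate (a crude stand-in for the local stimulus density), and the exponent `m_exp`, decay constants, and input density are all arbitrary choices.

```python
# 1-D SOM with locally adaptive step sizes: the winner's learning rate is
# scaled by (estimated firing rate)^m_exp, steering the magnification.
import numpy as np

rng = np.random.default_rng(2)
n, m_exp = 50, -0.5                    # number of units; steering exponent
w = np.sort(rng.uniform(0, 1, n))      # 1-D codebook for a 1-D input space
rate = np.full(n, 1.0 / n)             # leaky firing-rate estimate per unit

sigma, eps0 = 2.0, 0.1
for _ in range(5000):
    x = rng.beta(2, 5)                 # non-uniform input density on [0, 1]
    win = int(np.argmin(np.abs(w - x)))
    rate *= 0.999
    rate[win] += 0.001                 # update the density proxy first
    h = np.exp(-((np.arange(n) - win) ** 2) / (2 * sigma ** 2))
    # Local step size; clipped so updates stay contractive.
    eps = min(eps0 * (rate[win] / rate.mean()) ** m_exp, 0.5)
    w += eps * h * (x - w)
```

With `m_exp = 0` this reduces to the standard Kohonen update; varying the single exponent is the paper's route to information-optimal, distortion-optimal, or inverted magnification.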
A Theory of Proximity Based Clustering: Structure Detection by Optimization
 Pattern Recognition
, 1999
Cited by 34 (8 self)
In this paper, a systematic optimization approach for clustering proximity or similarity data is developed. Starting from fundamental invariance and robustness properties, a set of axioms is proposed and discussed to distinguish different cluster compactness and separation criteria. The approach covers the case of sparse proximity matrices, and is extended to nested partitionings for hierarchical data clustering. To solve the associated optimization problems, a rigorous mathematical framework for deterministic annealing and mean-field approximation is presented. Efficient optimization heuristics are derived in a canonical way, which also clarifies the relation to stochastic optimization by Gibbs sampling. Similarity-based clustering techniques have a broad range of possible applications in computer vision, pattern recognition, and data analysis. As a major practical application we present a novel approach to the problem of unsupervised texture segmentation, which relies on statistical...
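The deterministic-annealing / mean-field machinery is easiest to see in the simpler central-clustering case (the paper develops the harder pairwise-proximity version). Assignments become Gibbs distributions at a temperature that is lowered on a schedule; the schedule and data below are illustrative.

```python
# Mean-field deterministic annealing, sketched for central clustering:
# soft assignments at temperature T, with T annealed geometrically.
import numpy as np

rng = np.random.default_rng(3)
X = np.concatenate([rng.normal(c, 0.15, size=(100, 2))
                    for c in ([0, 0], [2, 0], [1, 2])])

k = 3
C = X.mean(axis=0) + 0.1 * rng.normal(size=(k, 2))  # break symmetry slightly
T = 4.0
while T > 0.01:
    for _ in range(10):                              # mean-field fixed point
        d2 = ((X[:, None, :] - C[None]) ** 2).sum(-1)
        logits = -d2 / T                             # Gibbs assignment weights
        logits -= logits.max(axis=1, keepdims=True)  # numerical stability
        P = np.exp(logits)
        P /= P.sum(axis=1, keepdims=True)
        C = (P.T @ X) / P.sum(axis=0)[:, None]       # soft centroid update
    T *= 0.8                                         # annealing schedule

hard = P.argmax(axis=1)   # hard clustering recovered at low temperature
```

At high T all clusters coincide at the global centroid; as T drops through critical values they split, which is the annealing behavior that avoids many poor local optima. As T → 0 the soft assignments harden and the update reduces to K-means.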
Neural network approaches to image compression
 Proc. IEEE
, 1995
Cited by 34 (1 self)
Abstract — This paper presents a tutorial overview of neural networks as signal processing tools for image compression. They are well suited to the problem of image compression due to their massively parallel and distributed architecture. Their characteristics are analogous to some of the features of our own visual system, which allow us to process visual information with much ease. For example, multilayer perceptrons can be used as nonlinear predictors in differential pulse-code modulation (DPCM). Such predictors have been shown to increase the predictive gain relative to a linear predictor. Another active area of research is in the application of Hebbian learning to the extraction of principal components, which are the basis vectors for the optimal linear Karhunen-Loève transform (KLT). These learning algorithms are iterative, have some computational advantages over standard eigendecomposition techniques, and can be made to adapt to changes in the input signal. Yet another model, the self-organizing feature map (SOFM), has been used with a great deal of success in the design of codebooks for vector quantization (VQ). The resulting codebooks are less sensitive to initial conditions than the standard LBG algorithm, and the topological ordering of the entries can be exploited to further increase coding efficiency and reduce computational complexity.
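The Hebbian route to the KLT basis mentioned above is classically realized by Oja's rule, which drives a single neuron's weight vector toward the leading principal component. A minimal sketch (learning rate and iteration count are illustrative):

```python
# Oja's rule: Hebbian PCA.  The weight vector converges to the leading
# eigenvector of the input covariance, i.e. the first KLT basis vector.
import numpy as np

rng = np.random.default_rng(4)
# Zero-mean data with one dominant direction.
A = np.array([[3.0, 1.0], [1.0, 1.0]])
X = rng.normal(size=(2000, 2)) @ A.T

w = rng.normal(size=2)
w /= np.linalg.norm(w)
eta = 0.001
for x in X:
    y = w @ x
    w += eta * y * (x - y * w)   # Hebbian term y*x plus implicit normalization

# Check against the eigendecomposition the rule is meant to replace.
cov = X.T @ X / len(X)
evals, evecs = np.linalg.eigh(cov)
pc1 = evecs[:, -1]                       # leading eigenvector
align = abs(w @ pc1) / np.linalg.norm(w)  # |cos angle| to the true PC
```

The `-y * w` term keeps `w` near unit norm without an explicit normalization step, which is the rule's computational advantage over repeated eigendecomposition for adaptive settings.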
Curvilinear Distance Analysis versus Isomap
 Proceedings of ESANN’2002, 10th European Symposium on Artificial Neural Networks
, 2002
Cited by 25 (11 self)
Dimension reduction techniques are widely used for the analysis and visualization of complex sets of data. This paper compares two nonlinear projection methods: Isomap and Curvilinear Distance Analysis.
A multilayer self-organizing feature map for range image segmentation
 Neural Networks 8(1) (1995) 67-86
, 1995
Cited by 23 (0 self)
Abstract — This paper proposes and describes a hierarchical self-organizing neural network for range image segmentation. The multilayer self-organizing feature map (MLSOFM), which is an extension of the traditional (single-layer) self-organizing feature map (SOFM), is seen to alleviate the shortcomings of the latter in the context of range image segmentation. The problem of range image segmentation is formulated as one of vector quantization and is mapped onto the MLSOFM. The MLSOFM combines the ideas of self-organization and topographic mapping with those of multiscale image segmentation. Experimental results using real range images are presented. Keywords — Range image segmentation, self-organizing feature map, neural networks, computer vision.
Local Dynamic Modeling with Self-Organizing Maps and Applications to Nonlinear System Identification and Control
 Proceedings of the IEEE
, 1998
Cited by 23 (2 self)
The technique of local linear models is appealing for modeling complex time series due to the weak assumptions required and its intrinsic simplicity. Here, instead of deriving the local models from the data, we propose to estimate them directly from the weights of a self-organizing map (SOM), which functions as a dynamic-preserving model of the dynamics. We introduce one modification to the Kohonen learning to ensure good representation of the dynamics and use weighted least squares to ensure continuity among the local models. The proposed scheme is tested using synthetic chaotic time series and real-world data. The practicality of the method is illustrated in the identification and control of the NASA Langley wind tunnel during aerodynamic tests of model aircraft. Modeling the dynamics with a SOM leads to a predictive multiple model control strategy (PMMC). Comparison of the new controller against the existing controller in test runs shows the superiority of our method. 1. Introducti...
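The local-linear-model idea can be sketched in simplified form. This is not the paper's method: plain k-means on delay-embedded vectors stands in for the SOM (so there is no topological ordering), and ordinary least squares stands in for the weighted variant; the chaotic series, embedding dimension, and `k` are illustrative.

```python
# Local linear time-series modeling: quantize the delay-embedding space,
# then fit one linear one-step predictor around each prototype.
import numpy as np

rng = np.random.default_rng(6)
# Chaotic logistic-map series as toy data.
s = np.empty(1000)
s[0] = 0.3
for i in range(999):
    s[i + 1] = 3.9 * s[i] * (1 - s[i])

d = 2                                          # embedding dimension
Z = np.column_stack([s[i:len(s) - d + i] for i in range(d)])  # past vectors
y = s[d:]                                      # next value to predict

k = 10
C = Z[rng.choice(len(Z), k, replace=False)]
for _ in range(20):                            # k-means on embedded vectors
    lab = ((Z[:, None, :] - C[None]) ** 2).sum(-1).argmin(axis=1)
    for j in range(k):
        if np.any(lab == j):
            C[j] = Z[lab == j].mean(axis=0)

# One linear model (with bias) per prototype, fit by least squares.
models = []
for j in range(k):
    mask = lab == j
    if not np.any(mask):
        models.append(np.zeros(d + 1))
        continue
    A = np.column_stack([Z[mask], np.ones(mask.sum())])
    coef, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
    models.append(coef)

# One-step prediction: route each point to its prototype's local model.
pred = np.array([models[l] @ np.append(z, 1.0) for z, l in zip(Z, lab)])
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
```

Even this crude version captures the locally linear structure of the nonlinear map far better than a single global linear predictor could, which is the appeal the abstract describes.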
Competitive Learning Algorithms for Robust Vector Quantization
, 1998
Cited by 19 (0 self)
The efficient representation and encoding of signals with limited resources, e.g., finite storage capacity and restricted transmission bandwidth, is a fundamental problem in technical as well as biological information processing systems. Typically, under realistic circumstances, the encoding and communication of messages has to deal with different sources of noise and disturbances. In this paper, we propose a unifying approach to data compression by robust vector quantization, which explicitly deals with channel noise, bandwidth limitations, and random elimination of prototypes. The resulting algorithm is able to limit the detrimental effect of noise in a very general communication scenario. In addition, the presented model allows us to derive a novel competitive neural network algorithm, which covers topology-preserving feature maps, the so-called neural-gas algorithm, and the maximum entropy softmax rule as special cases. Furthermore, continuation methods based on these noise models impr...
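A minimal sketch of the channel-noise idea: when the transmitted codeword index can be flipped, the encoder should minimize the *expected* distortion, and the codebook update spreads each data point over the codewords the channel can confuse it with. The symmetric flip probability below is an illustrative channel model, not the paper's general scenario; note that the confusion matrix plays the role a neighbourhood function plays in a SOM.

```python
# Channel-robust vector quantization under a symmetric index-flip channel.
import numpy as np

rng = np.random.default_rng(7)
X = rng.normal(size=(500, 2))

k, flip = 8, 0.05
# Confusion matrix P[i, j]: probability index i sent, index j received.
P = np.full((k, k), flip / (k - 1))
np.fill_diagonal(P, 1.0 - flip)

C = X[rng.choice(len(X), k, replace=False)]
for _ in range(30):
    d2 = ((X[:, None, :] - C[None]) ** 2).sum(-1)    # (n, k) distortions
    # Encoder: winner minimizes expected distortion under channel noise.
    win = (d2 @ P.T).argmin(axis=1)
    # Batch update: responsibilities given by the confusion matrix, so each
    # codeword is pulled toward data whose winner the channel maps onto it.
    R = P[win]                                       # (n, k)
    C = (R.T @ X) / R.sum(axis=0)[:, None]

d2 = ((X[:, None, :] - C[None]) ** 2).sum(-1)
win = (d2 @ P.T).argmin(axis=1)
exp_dist = float(np.mean((d2 @ P.T)[np.arange(len(X)), win]))
```

With `flip = 0` the confusion matrix is the identity and the procedure reduces to ordinary batch K-means, illustrating how the noise model generalizes the standard quantizer.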
On scaling up balanced clustering algorithms
 In Proceedings of the SIAM International Conference on Data Mining
, 2002
Cited by 18 (5 self)