Results 1–10 of 76
Independent Component Analysis
Neural Computing Surveys, 2001
Abstract

Cited by 1507 (93 self)
A common problem encountered in such disciplines as statistics, data analysis, signal processing, and neural network research is finding a suitable representation of multivariate data. For computational and conceptual simplicity, such a representation is often sought as a linear transformation of the original data. Well-known linear transformation methods include, for example, principal component analysis, factor analysis, and projection pursuit. A recently developed linear transformation method is independent component analysis (ICA), in which the desired representation is the one that minimizes the statistical dependence of the components of the representation. Such a representation seems to capture the essential structure of the data in many applications. In this paper, we survey the existing theory and methods for ICA.
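The linear-transformation view described in this abstract can be illustrated with a minimal two-source sketch: whiten the mixed signals, then search for the rotation that maximizes non-Gaussianity (here, the magnitude of excess kurtosis). The sources, mixing matrix, and grid search below are illustrative assumptions, not an algorithm taken from the survey.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent, non-Gaussian (uniform) sources and a linear mixture.
n = 5000
S = rng.uniform(-1.0, 1.0, size=(2, n))          # sources (excess kurtosis < 0)
A = np.array([[2.0, 1.0], [1.0, 2.0]])           # hypothetical mixing matrix
X = A @ S                                        # observed signals

# Step 1: whiten the data (zero mean, identity covariance).
X = X - X.mean(axis=1, keepdims=True)
cov = X @ X.T / n
d, E = np.linalg.eigh(cov)
Z = (E / np.sqrt(d)) @ E.T @ X                   # whitened signals

def excess_kurtosis(y):
    return np.mean(y**4) - 3.0                   # y already has unit variance

# Step 2: after whitening, ICA reduces to finding a rotation; search the
# angle that maximizes total non-Gaussianity of the rotated components.
def rotated(theta):
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return R @ Z

angles = np.linspace(0.0, np.pi / 2, 180)        # objective has period pi/2
best = max(angles, key=lambda t: sum(abs(excess_kurtosis(y)) for y in rotated(t)))
Y = rotated(best)                                # estimated independent components

# Each estimate should match one true source up to sign and permutation.
corr = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
print(corr.max(axis=1))
```

Kurtosis is only one of several non-Gaussianity measures used in the ICA literature; practical algorithms such as FastICA use fixed-point iterations instead of an angle grid.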
Task Decomposition Through Competition in a Modular Connectionist Architecture
COGNITIVE SCIENCE, 1990
Abstract

Cited by 180 (5 self)
A novel modular connectionist architecture is presented in which the networks composing the architecture compete to learn the training patterns. As a result of the competition, different networks learn different training patterns and, thus, learn to compute different functions. The architecture performs task decomposition in the sense that it learns to partition a task into two or more functionally independent tasks and allocates distinct networks to learn each task. In addition, the architecture tends to allocate to each task the network whose topology is most appropriate to that task, and tends to allocate the same network to similar tasks and distinct networks to dissimilar tasks. Furthermore, it can be easily modified so as to...
The neural basis of cognitive development: A constructivist manifesto
Behavioral and Brain Sciences, 1997
Abstract

Cited by 128 (2 self)
Quartz, S. & Sejnowski, T.J. (1997). The neural basis of cognitive development: A constructivist manifesto.
Self-Organizing Maps: Ordering, Convergence Properties and Energy Functions
Biological Cybernetics, 1992
Abstract

Cited by 100 (2 self)
We investigate the convergence properties of the self-organizing feature map algorithm for a simple, but very instructive case: the formation of a topographic representation of the unit interval [0, 1] by a linear chain of neurons. We extend the proofs of convergence of Kohonen and of Cottrell and Fort to hold in any case where the neighborhood function, which is used to scale the change in the weight values at each neuron, is a monotonically decreasing function of distance from the winner neuron. We prove that the learning dynamics cannot be described by a gradient descent on a single energy function, but may be described using a set of potential functions, one for each neuron, which are independently minimized following a stochastic gradient descent. We derive the correct potential functions for the one- and multi-dimensional cases, and show that the energy functions given by Tolat (1990) are an approximation which is no longer valid in the case of highly disordered maps or steep neig...
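The setting analyzed in this abstract, a linear chain of neurons forming a topographic map of [0, 1] under a monotonically decreasing neighborhood function, can be sketched as follows. All parameter values (chain length, decay schedules) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Linear chain of neurons learning a topographic map of [0, 1].
n_units = 10
w = rng.uniform(0.0, 1.0, n_units)       # random initial weights
positions = np.arange(n_units)           # fixed positions on the chain

n_steps = 4000
for t in range(n_steps):
    x = rng.uniform(0.0, 1.0)            # input drawn from [0, 1]
    winner = np.argmin(np.abs(w - x))    # best-matching unit
    # Neighborhood function: monotonically decreasing in chain distance
    # from the winner, with width and learning rate shrinking over time.
    sigma = 5.0 * (0.1 / 5.0) ** (t / n_steps)   # width: 5.0 -> 0.1
    lr = 0.5 * (0.02 / 0.5) ** (t / n_steps)     # rate:  0.5 -> 0.02
    h = np.exp(-((positions - winner) ** 2) / (2 * sigma**2))
    w += lr * h * (x - w)

# After training, the weights are topographically ordered along the chain.
diffs = np.diff(w)
ordered = bool(np.all(diffs > 0) or np.all(diffs < 0))
print(ordered, w.round(2))
```

The ordered final state (weights monotone along the chain) is exactly the topographic representation whose emergence the paper's convergence proofs address.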
Quantifying the Neighborhood Preservation of Self-Organizing Feature Maps
IEEE TRANSACTIONS ON NEURAL NETWORKS, 1992
Abstract

Cited by 82 (3 self)
Neighborhood preservation from input space to output space is an essential element of self-organizing feature maps like the Kohonen map. However, a measure for the preservation or violation of neighborhood relations, which is more systematic than just visual inspection of the map, was lacking. We show that a topographic product P, first introduced in nonlinear dynamics, is an appropriate measure in this regard. It is sensitive to large-scale violations of the neighborhood ordering, but does not account for neighborhood ordering distortions due to varying areal magnification factors. A vanishing value of the topographic product indicates perfect neighborhood preservation; negative (positive) values indicate a too small (too large) output space dimensionality. In a simple example of maps from a 2D input space onto 1D, 2D and 3D output spaces, we demonstrate how the topographic product picks the correct output space dimensionality. In a second example, we map 19D speech data onto various output spaces and find that a 3D output space (instead of 2D) seems to be optimally suited to the data. This is in agreement with a recent speech recognition experiment on the same data set.
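A direct sketch of the topographic product as we understand the published formula (the toy maps and variable names are ours) shows how it distinguishes a neighborhood-preserving map from a scrambled one:

```python
import numpy as np

def topographic_product(weights, positions):
    """Topographic product P (after Bauer & Pawelzik, 1992).

    weights:   (N, d_in)  weight vector of each unit in input space
    positions: (N, d_out) location of each unit in the output grid
    P ~ 0: neighborhood-preserving; P < 0 (> 0) suggests the output
    space dimensionality is too small (too large).
    """
    N = len(weights)
    dV = np.linalg.norm(weights[:, None] - weights[None, :], axis=-1)
    dA = np.linalg.norm(positions[:, None] - positions[None, :], axis=-1)
    total = 0.0
    for j in range(N):
        nV = np.argsort(dV[j])[1:]   # neighbor ranking in input space
        nA = np.argsort(dA[j])[1:]   # neighbor ranking in output space
        log_q = 0.0
        for k in range(1, N):
            # Ratios comparing the k-th neighbor under each ranking.
            q1 = dV[j, nA[k - 1]] / dV[j, nV[k - 1]]
            q2 = dA[j, nA[k - 1]] / dA[j, nV[k - 1]]
            log_q += np.log(q1 * q2)
            total += log_q / (2 * k)
    return total / (N * (N - 1))

# A perfectly ordered 1-D chain versus a scrambled copy of it.
pos = np.arange(10, dtype=float)[:, None]
w_ordered = pos.copy()
w_scrambled = w_ordered[np.random.default_rng(2).permutation(10)]
p_ordered = topographic_product(w_ordered, pos)      # 0: rankings agree
p_scrambled = topographic_product(w_scrambled, pos)  # clearly nonzero
print(p_ordered, p_scrambled)
```

For the ordered chain the input- and output-space neighbor rankings coincide, so every ratio is 1 and P vanishes, matching the abstract's interpretation of a vanishing topographic product.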
A Two-Layer Sparse Coding Model Learns Simple and Complex Cell Receptive Fields and Topography From Natural Images
VISION RESEARCH, 2001
Abstract

Cited by 69 (18 self)
The classical receptive fields of simple cells in the visual cortex have been shown to emerge from the statistical properties of natural images by forcing the cell responses to be maximally sparse, i.e. significantly activated only rarely. Here, we show that this single principle of sparseness can also lead to the emergence of topography (columnar organization) and of complex cell properties. These are obtained by maximizing the sparsenesses of locally pooled energies, which correspond to complex cell outputs. Thus we obtain a highly parsimonious model of how these properties of the visual cortex are adapted to the characteristics of the natural input.
Functional microorganization of primary visual cortex: Receptive field analysis of nearby neurons
Journal of Neuroscience, 1999
Abstract

Cited by 38 (0 self)
It is well established that multiple stimulus dimensions (e.g., orientation and spatial frequency) are mapped onto the surface of striate cortex. However, the detailed organization of neurons within a local region of striate cortex remains unclear. Within a vertical column, do all neurons have the same response selectivities? And if not, how do they most commonly differ and why? To address these questions, we recorded from nearby pairs of simple cells and made detailed spatiotemporal maps of their receptive fields. From these maps, we extracted and analyzed a variety of response metrics. Our results provide new insights into the local organization of striate cortex. First, we show that nearby neurons seldom have very similar receptive fields, when these fields are characterized in space and time. Thus, there may be less redundancy within a column than previously thought. Moreover, we show that correlated discharge in...

Columnar organization is a common feature of cortical architecture (Mountcastle, 1997). Neurons along a path perpendicular to the cortical surface often have similar functional properties, and these properties often vary systematically across the surface of the cortex. In primary visual (or striate) cortex, systems of columns are well documented for orientation preference, ocular dominance, and retinotopic location (Hubel and Wiesel, 1977). Preferred spatial frequency also has an orderly representation, although the details of this organization remain controversial (Maffei and Fiorentini, 1977; Tootell et al., 1981; Berardi et al.,
Bubbles: A Unifying Framework for Low-Level Statistical Properties of Natural Image Sequences
2003
Abstract

Cited by 35 (7 self)
This paper proposes a unifying framework for several models of the statistical structure of natural image sequences. The framework combines three properties: sparseness, temporal coherence, and energy correlations; these will be reviewed below. It leads to models where the joint activation of the linear filters (simple cells) takes the form of "bubbles," which are regions of activity that are localized both in time and in space, space meaning the cortical surface or a grid on which the filters are arranged. The paper is organized as follows. First, we discuss the principal statistical properties of natural images investigated so far, and we examine how these can be used in the estimation of a linear image model (Section 2). Then we show how sparseness and temporal coherence can be combined in a single model, which is based on the concept of temporal bubbles, and attempt to demonstrate that this gives a better model of the outputs of Gabor-like linear filters than either of the criteria alone (Section 3). We extend the model to include topography as well, leading to the intuitive notion of spatiotemporal bubbles (Section 4). We also discuss the extensions of the framework to spatiotemporal receptive fields (Section 5). Finally, we discuss the utility of our model and its relation to other models (Section 6)
A Unifying Objective Function for Topographic Mappings
1997
Abstract

Cited by 30 (4 self)
Many different algorithms and objective functions for topographic mappings have been proposed. We show that several of these approaches can be seen as particular cases of a more general objective function. Consideration of a very simple mapping problem reveals large differences in the form of the map that each particular case favors. These differences have important consequences for the practical application of topographic mapping methods.
Some Extensions of the K-Means Algorithm for Image Segmentation and Pattern Classification
Technical Report, MIT Artificial Intelligence Laboratory, 1993
Abstract

Cited by 26 (0 self)
In this paper we present some extensions to the k-means algorithm for vector quantization that permit its efficient use in image segmentation and pattern classification tasks. It is shown that by introducing state variables that correspond to certain statistics of the dynamic behavior of the algorithm, it is possible to find the representative centers of the lower-dimensional manifolds that define the boundaries between classes, for clouds of multidimensional, multiclass data; this permits one, for example, to find class boundaries directly from sparse data (e.g., in image segmentation tasks) or to efficiently place centers for pattern classification (e.g., with local Gaussian classifiers). The same state variables can be used to define algorithms for determining adaptively the optimal number of centers for clouds of data with space-varying density. Some examples of the application of these extensions are also given. Copyright © Massachusetts Institute of Technology, 1993. This rep...
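For reference, the baseline these extensions build on, plain k-means (Lloyd's algorithm), can be sketched in a few lines; the data and parameters below are illustrative, and none of the paper's state-variable extensions are included.

```python
import numpy as np

def kmeans(X, k, n_iter=50, seed=0):
    """Plain k-means: alternate nearest-center assignment and mean update."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assign each point to its nearest center.
        labels = np.argmin(
            np.linalg.norm(X[:, None] - centers[None, :], axis=-1), axis=1)
        # Move each center to the mean of its assigned points.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

# Two well-separated Gaussian clouds.
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0.0, 0.5, (100, 2)),
               rng.normal(5.0, 0.5, (100, 2))])
centers, labels = kmeans(X, k=2)
print(np.sort(centers[:, 0]).round(1))
```

The extensions described in the abstract augment exactly this loop: additional per-center statistics of the update dynamics are used to pull centers toward class boundaries or to adapt the number of centers.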