Results 1–10 of 46
An information-maximization approach to blind separation and blind deconvolution
 Neural Computation, 1995
"... ..."
Learning Distance Functions Using Equivalence Relations
 In Proceedings of the Twentieth International Conference on Machine Learning, 2003
"... We address the problem of learning distance metrics using sideinformation in the form of groups of "similar" points. We propose to use the RCA algorithm, which is a simple and e#cient algorithm for learning a full ranked Mahalanobis metric (Shental et al., 2002). ..."
Abstract

Cited by 137 (5 self)
 Add to MetaCart
We address the problem of learning distance metrics using side-information in the form of groups of "similar" points. We propose to use the RCA algorithm, which is a simple and efficient algorithm for learning a full-rank Mahalanobis metric (Shental et al., 2002).
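The core of RCA is compact enough to sketch: pool the within-chunklet scatter and invert it to obtain the Mahalanobis matrix. The following is a minimal illustration of that one step, assuming unregularized inversion; the function name and the synthetic chunklets are ours, not the paper's:

```python
import numpy as np

def rca_metric(chunklets):
    """Relevant Component Analysis, core step: the Mahalanobis matrix is
    the inverse of the pooled within-chunklet covariance, estimated from
    groups of points known to be similar (illustrative, unregularized)."""
    d = chunklets[0].shape[1]
    scatter = np.zeros((d, d))
    n = 0
    for c in chunklets:
        centered = c - c.mean(axis=0)   # remove each chunklet's own mean
        scatter += centered.T @ centered
        n += len(c)
    cov = scatter / n                   # pooled within-chunklet covariance
    return np.linalg.inv(cov)           # M, with d(x, y)^2 = (x - y)^T M (x - y)

# illustrative chunklets: same within-group shape, shifted means
rng = np.random.default_rng(0)
base = rng.normal(size=(30, 3)) @ np.diag([1.0, 2.0, 0.5])
chunklets = [base[:10] + 5.0, base[10:20] - 2.0, base[20:]]
M = rca_metric(chunklets)
```

The associated whitening transform is the inverse matrix square root of the pooled covariance; applying it makes within-chunklet variation isotropic.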
Multichannel Blind Deconvolution: FIR Matrix Algebra and Separation of Multipath Mixtures
, 1996
"... A general tool for multichannel and multipath problems is given in FIR matrix algebra. With Finite Impulse Response (FIR) filters (or polynomials) assuming the role played by complex scalars in traditional matrix algebra, we adapt standard eigenvalue routines, factorizations, decompositions, and mat ..."
Abstract

Cited by 74 (0 self)
 Add to MetaCart
A general tool for multichannel and multipath problems is given in FIR matrix algebra. With Finite Impulse Response (FIR) filters (or polynomials) assuming the role played by complex scalars in traditional matrix algebra, we adapt standard eigenvalue routines, factorizations, decompositions, and matrix algorithms for use in multichannel/multipath problems. Using abstract algebra/group-theoretic concepts, information-theoretic principles, and the Bussgang property, methods of single-channel filtering and source separation of multipath mixtures are merged into a general FIR matrix framework. Techniques developed for equalization may be applied to source separation and vice versa. Potential applications of these results lie in neural networks with feedforward memory connections, wideband array processing, and in problems with a multi-input, multi-output network having channels between each source and sensor, such as source separation. Particular applications of FIR polynomial matrix alg...
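As a flavor of what "FIR filters assuming the role of scalars" means, here is a minimal sketch of an FIR matrix product, where entry-wise multiplication becomes convolution and addition becomes coefficient-wise summation of filter polynomials; representing each filter as a NumPy coefficient array is our illustrative choice, not the paper's notation:

```python
import numpy as np

def fir_add(a, b):
    """Add two FIR filters (coefficient arrays), zero-padding the shorter."""
    L = max(len(a), len(b))
    return np.pad(a, (0, L - len(a))) + np.pad(b, (0, L - len(b)))

def fir_matmul(A, B):
    """Product of two 'FIR matrices': entries are FIR filters, the scalar
    product is replaced by convolution and the scalar sum by fir_add."""
    n, k, m = len(A), len(B), len(B[0])
    C = [[np.zeros(1) for _ in range(m)] for _ in range(n)]
    for i in range(n):
        for j in range(m):
            for l in range(k):
                C[i][j] = fir_add(C[i][j], np.convolve(A[i][l], B[l][j]))
    return C

# the FIR identity (unit impulses on the diagonal) leaves a mixture unchanged
I = [[np.array([1.0]), np.array([0.0])],
     [np.array([0.0]), np.array([1.0])]]
H = [[np.array([1.0, 0.5]), np.array([0.2])],
     [np.array([0.3]), np.array([1.0, -0.4])]]
P = fir_matmul(I, H)
```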
Temporal Decorrelation: A Theory of Lagged and Nonlagged Responses in the Lateral Geniculate Nucleus
 Network, 1995
"... Natural timevarying images possess significant temporal correlations when sampled frame by frame by the photoreceptors. These correlations persist even after retinal processing and hence, under natural activation conditions, the signal sent to the lateral geniculate nucleus is temporally redundant ..."
Abstract

Cited by 44 (0 self)
 Add to MetaCart
Natural time-varying images possess significant temporal correlations when sampled frame by frame by the photoreceptors. These correlations persist even after retinal processing and hence, under natural activation conditions, the signal sent to the lateral geniculate nucleus is temporally redundant or inefficient. We explore the hypothesis that the LGN is concerned, among other things, with improving the efficiency of visual representation through active temporal decorrelation of the retinal signal, much in the same way that the retina improves efficiency by spatially decorrelating incoming images. Using some recently measured statistical properties of time-varying images, we predict the spatiotemporal receptive fields that achieve this decorrelation. It is shown that, because of neuronal nonlinearities, temporal decorrelation requires two response types, the lagged and nonlagged, just as spatial decorrelation requires on and off response types. The tuning and response properties of the p...
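The decorrelation idea itself can be shown with a toy spectral-whitening filter: an amplitude response proportional to 1/sqrt(S(f)) flattens the power spectrum of a temporally correlated signal, collapsing its autocorrelation toward a delta. This sketch deliberately ignores noise, causality, and the lagged/nonlagged decomposition that are the paper's actual subject:

```python
import numpy as np

# Temporal decorrelation by spectral whitening: scale each frequency by
# 1/sqrt(S(f)) so the output power spectrum becomes flat (illustrative
# only; realistic receptive fields must also handle noise and causality).
rng = np.random.default_rng(1)
n = 4096
x = np.convolve(rng.normal(size=n), np.ones(8) / 8, mode="same")  # correlated signal
X = np.fft.rfft(x)
W = 1.0 / np.sqrt(np.abs(X) ** 2 + 1e-12)   # whitening amplitude response
Y = W * X                                   # flattened spectrum
# Wiener-Khinchin: circular autocorrelation = inverse FFT of the power spectrum
ac = np.fft.irfft(np.abs(Y) ** 2, n=n)      # ~ delta at lag 0 after whitening
```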
Computational genetics, physiology, metabolism, neural systems, learning, vision, and behavior or PolyWorld: Life in a new context
 Artificial Life III, Vol. XVII of SFI Studies in the Sciences of Complexity, Santa Fe Institute, 1993
"... This paper discusses a computer model of living organisms and the ecology they exist in called PolyWorld. PolyWorld attempts to bring together all the principle components of real living systems into a single artificial (manmade) living system. PolyWorld brings together biologically motivated genet ..."
Abstract

Cited by 38 (3 self)
 Add to MetaCart
This paper discusses a computer model of living organisms and the ecology they exist in, called PolyWorld. PolyWorld attempts to bring together all the principal components of real living systems into a single artificial (man-made) living system. PolyWorld brings together biologically motivated genetics, simple simulated physiologies and metabolisms, Hebbian learning in arbitrary neural network architectures, a visual perceptive mechanism, and a suite of primitive behaviors in artificial organisms grounded in an ecology just complex enough to foster speciation and interspecies competition. Predation, mimicry, sexual reproduction, and even communication are all supported in a straightforward fashion. The resulting survival strategies, both individual and group, are purely emergent, as are the functionalities embodied in their neural network "brains". Complex behaviors resulting from the simulated neural activity are unpredictable, and change as natural selection acts over multiple generations. In many ways, PolyWorld may be thought of as a sort of electronic primordial soup experiment, in the vein of Urey and Miller's [33] classic experiment, only commencing at a much higher level of organization. While one could claim that Urey and Miller really just threw a bunch of ingredients in a pot and watched to see what happened, the reason these men made a contribution to science rather than ratatouille is that they put the right ingredients in the right pot ... and watched to see what happened. Here we start with software-coded genetics and various simple nerve cells (light-sensitive, motor, and unspecified neuronal) as the ingredients, and place them in a competitive ecological crucible which subjects them to an internally consistent physics and the process of natural selectio...
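Of the components listed, the Hebbian learning rule is the one most easily shown in miniature: weights grow in proportion to correlated pre- and postsynaptic activity. The rule below is a generic textbook version with an illustrative learning rate and norm constraint, not PolyWorld's exact implementation:

```python
import numpy as np

def hebbian_step(w, pre, eta=0.1):
    """One generic Hebbian update: the weight change is proportional to
    presynaptic activity times postsynaptic activity; renormalizing the
    weight vector keeps it from growing without bound (illustrative)."""
    post = np.tanh(w @ pre)          # postsynaptic activation
    w = w + eta * post * pre         # Hebb's rule: delta-w ~ pre * post
    return w / np.linalg.norm(w)

rng = np.random.default_rng(2)
w = rng.normal(size=4)
w /= np.linalg.norm(w)
for _ in range(100):
    w = hebbian_step(w, rng.normal(size=4))
```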
Survey of Sparse and Non-Sparse Methods in Source Separation
, 2005
"... Source separation arises in a variety of signal processing applications, ranging from speech processing to medical image analysis. The separation of a superposition of multiple signals is accomplished by taking into account the structure of the mixing process and by making assumptions about the sour ..."
Abstract

Cited by 35 (1 self)
 Add to MetaCart
Source separation arises in a variety of signal processing applications, ranging from speech processing to medical image analysis. The separation of a superposition of multiple signals is accomplished by taking into account the structure of the mixing process and by making assumptions about the sources. When the information about the mixing process and sources is limited, the problem is called ‘blind’. By assuming that the sources can be represented sparsely in a given basis, recent research has demonstrated that solutions to previously problematic blind source separation problems can be obtained. In some cases, solutions are possible to problems intractable by previous non-sparse methods. Indeed, sparse methods provide a powerful approach to the separation of linear mixtures of independent data. This paper surveys the recent arrival of sparse blind source separation methods and the previously existing non-sparse methods, providing insights and appropriate hooks into the literature along the way.
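The sparsity assumption has a concrete computational payoff: with a known mixing matrix and sources that are sparse in the analysis basis, the underdetermined system x = As can be resolved by minimizing the l1 norm of s. A minimal sketch using the standard linear-programming reformulation (the split s = u - v is the usual trick; the 2-sensor/3-source demo is our toy example, not from the survey):

```python
import numpy as np
from scipy.optimize import linprog

def l1_recover(A, x):
    """Recover a sparse source vector s from x = A s (fewer sensors than
    sources) by minimizing ||s||_1 subject to A s = x. Splitting
    s = u - v with u, v >= 0 turns this into a linear program."""
    m, n = A.shape
    c = np.ones(2 * n)                       # sum(u) + sum(v) = ||s||_1
    res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=x, bounds=(0, None))
    return res.x[:n] - res.x[n:]

# demo: 2 sensors, 3 sources, only one source active
A = np.array([[1.0, 0.0, 0.7],
              [0.0, 1.0, 0.7]])
s_true = np.array([0.0, 0.0, 2.0])
s_hat = l1_recover(A, A @ s_true)
```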
A Novel Measure for Independent Component Analysis (ICA)
"... Measures of independence (and dependence) are fundamental in many areas of engineering and signal processing. Shannon introduced the idea of Information Entropy which has a sound theoretical foundation but sometimes is not easy to implement in engineering applications. In this paper, Renyi's Entropy ..."
Abstract

Cited by 24 (10 self)
 Add to MetaCart
Measures of independence (and dependence) are fundamental in many areas of engineering and signal processing. Shannon introduced the idea of information entropy, which has a sound theoretical foundation but is sometimes not easy to implement in engineering applications. In this paper, Renyi's entropy is used and a novel independence measure is proposed. When integrated with a nonparametric estimator of the probability density function (the Parzen window), the measure can be related to the "potential energy of the samples", which is easy to understand and implement. Experimental results on blind source separation confirm the theory. Although the work is preliminary, the "potential energy" method is rather general and will have many applications.
1. INTRODUCTION Information theory is a powerful tool in communication, signal processing, and even machine learning. The parallel between information and energy is well known, and here their measures will also be linked. This paper shows that our...
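The "potential energy" quantity is straightforward to compute: with a Gaussian Parzen window, Renyi's quadratic entropy reduces to the negative log of a pairwise sum of Gaussian kernels, the information potential. A minimal 1-D sketch, where the kernel width and the demo data are illustrative choices:

```python
import numpy as np

def information_potential(x, sigma=0.5):
    """The 'potential energy of the samples': mean of pairwise Gaussian
    kernels. With a Parzen density estimate of width sigma, Renyi's
    quadratic entropy is exactly -log of this quantity (1-D sketch)."""
    d = x[:, None] - x[None, :]                    # pairwise differences
    k = np.exp(-d**2 / (4 * sigma**2)) / np.sqrt(4 * np.pi * sigma**2)
    return k.mean()

def renyi_quadratic_entropy(x, sigma=0.5):
    return -np.log(information_potential(x, sigma))

rng = np.random.default_rng(3)
tight = rng.normal(scale=0.1, size=200)   # concentrated samples: high potential
wide = rng.normal(scale=2.0, size=200)    # spread-out samples: low potential
```

Widely spread samples interact weakly (low potential, high entropy), while concentrated samples interact strongly, matching the physical analogy in the abstract.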
A Geometric Algorithm for Overcomplete Linear ICA
 Neurocomputing, 2003
"... Geometric algorithms for linear quadratic independent component analysis (ICA) have recently received some attention due to their pictorial description and their relative ease of implementation. The geometric approach to ICA has been proposed first by Puntonet and Prieto [1] [2] in order to separate ..."
Abstract

Cited by 23 (11 self)
 Add to MetaCart
Geometric algorithms for linear quadratic independent component analysis (ICA) have recently received some attention due to their pictorial description and their relative ease of implementation. The geometric approach to ICA was first proposed by Puntonet and Prieto [1] [2] in order to separate linear mixtures. We generalize these algorithms to overcomplete cases with more sources than sensors. With geometric ICA we get an efficient method for the matrix-recovery step in the framework of a two-step approach to the source separation problem. The second step, source recovery, uses a maximum-likelihood approach. There we prove that the shortest-path algorithm as proposed by Bofill and Zibulevsky in [3] indeed solves the maximum-likelihood conditions.
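For the two-sensor case, the shortest-path source-recovery rule admits a compact sketch: each observation is decomposed along the two mixing-matrix columns whose angles bracket it, and all other sources are set to zero. The version below assumes the observation lies strictly inside the fan of column directions and omits all edge cases:

```python
import numpy as np

def shortest_path_recover(A, x):
    """Decompose a 2-D observation x = A s along the two mixing-matrix
    columns whose angles bracket x; all other sources are set to zero.
    Assumes x falls strictly inside the fan of column directions."""
    ang = np.arctan2(A[1], A[0])             # angle of each column
    order = np.argsort(ang)
    i = np.searchsorted(ang[order], np.arctan2(x[1], x[0])) - 1
    cols = order[[i, i + 1]]                 # the two bracketing columns
    s = np.zeros(A.shape[1])
    s[cols] = np.linalg.solve(A[:, cols], x)  # 2x2 system for the active pair
    return s

# demo: columns at 0, 45, and 90 degrees; the true support is a bracketing pair
A = np.array([[1.0, np.sqrt(0.5), 0.0],
              [0.0, np.sqrt(0.5), 1.0]])
s_true = np.array([0.0, 1.0, 0.5])
s_hat = shortest_path_recover(A, A @ s_true)
```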
Linear Geometric ICA: Fundamentals and Algorithms
, 2003
"... Geometric algorithms for linear independent component analysis (ICA) have recently received some attention due to their pictorial description and their relative ease of implementation. The geometric approach to ICA was proposed first by Puntonet and Prieto (1995). We will reconsider geometric ICA in ..."
Abstract

Cited by 19 (10 self)
 Add to MetaCart
Geometric algorithms for linear independent component analysis (ICA) have recently received some attention due to their pictorial description and their relative ease of implementation. The geometric approach to ICA was first proposed by Puntonet and Prieto (1995). We reconsider geometric ICA in a theoretical framework, showing that fixed points of geometric ICA fulfill a geometric convergence condition (GCC), which the mixed images of the unit vectors also satisfy. This leads to a conjecture claiming that in the non-Gaussian unimodal symmetric case there is only one stable fixed point, implying the uniqueness of geometric ICA after convergence. Guided by the principles of ordinary geometric ICA, we then present a new approach to linear geometric ICA based on histograms, observing a considerable improvement in the separation quality of different distributions and a sizable reduction in computational cost, by a factor of 100, compared to the ordinary geometric approach. Furthermore, we explore the accuracy of the algorithm depending on the number of samples and the choice of the mixing matrix, and compare geometric algorithms with classical ICA algorithms, namely Extended Infomax and FastICA. Finally, we discuss the problem of high-dimensional data sets within the realm of geometric ICA algorithms.
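The histogram idea can be illustrated in the two-dimensional case: for super-Gaussian (sparse) sources, the angles of the mixed samples concentrate around the mixing-matrix column directions, so peaks of an angular histogram estimate the mixing directions. The naive peak picking below is our simplification, not the paper's algorithm:

```python
import numpy as np

# Angle-histogram sketch: sparse (Laplacian) sources make the mixed
# samples cluster around the mixing-matrix column directions, visible
# as peaks of the angular histogram on the half circle [0, pi).
rng = np.random.default_rng(4)
s = rng.laplace(size=(2, 50000))                   # super-Gaussian sources
true_angles = np.array([0.3, 1.2])                 # column directions (radians)
A = np.vstack([np.cos(true_angles), np.sin(true_angles)])
x = A @ s
theta = np.arctan2(x[1], x[0]) % np.pi             # fold onto the half circle
hist, edges = np.histogram(theta, bins=180)
centers = 0.5 * (edges[:-1] + edges[1:])
# naive peak picking: take the strongest bins at least 0.2 rad apart
est = []
for t in centers[np.argsort(hist)[::-1]]:
    if all(min(abs(t - e), np.pi - abs(t - e)) > 0.2 for e in est):
        est.append(t)
    if len(est) == 2:
        break
est = np.sort(est)
```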