Results 1–10 of 35
Independent Component Analysis
Neural Computing Surveys, 2001
Cited by 1488 (93 self)
A common problem encountered in such disciplines as statistics, data analysis, signal processing, and neural network research is finding a suitable representation of multivariate data. For computational and conceptual simplicity, such a representation is often sought as a linear transformation of the original data. Well-known linear transformation methods include, for example, principal component analysis, factor analysis, and projection pursuit. A recently developed linear transformation method is independent component analysis (ICA), in which the desired representation is the one that minimizes the statistical dependence of the components of the representation. Such a representation seems to capture the essential structure of the data in many applications. In this paper, we survey the existing theory and methods for ICA.
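The kind of linear representation the survey discusses can be sketched in a few lines of NumPy (my own illustration, not code from the paper): PCA-style whitening followed by a one-unit fixed-point search for an independent direction, here with a cubic nonlinearity and an arbitrary toy mixing matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent non-Gaussian (uniform) sources, linearly mixed.
S = rng.uniform(-1, 1, size=(2, 5000))
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = A @ S

# PCA-style whitening: decorrelate and normalize variances.
Xc = X - X.mean(axis=1, keepdims=True)
cov = Xc @ Xc.T / Xc.shape[1]
d, E = np.linalg.eigh(cov)
Z = E @ np.diag(d ** -0.5) @ E.T @ Xc

# One-unit fixed-point iteration on the unit sphere (cubic nonlinearity,
# i.e. a kurtosis-based contrast): w <- E{z (w.z)^3} - 3 w, renormalized.
w = rng.normal(size=2)
w /= np.linalg.norm(w)
for _ in range(100):
    y = w @ Z
    w = (Z * y ** 3).mean(axis=1) - 3 * w
    w /= np.linalg.norm(w)

y = w @ Z  # estimate of one independent component (up to sign and scale)
```

The recovered component should correlate strongly with one of the two sources, up to the inherent sign and permutation ambiguities of ICA.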
Blind Signal Separation: Statistical Principles, 2003
Cited by 388 (4 self)
Blind signal separation (BSS) and independent component analysis (ICA) are emerging techniques of array processing and data analysis, aiming at recovering unobserved signals or 'sources' from observed mixtures (typically, the output of an array of sensors), exploiting only the assumption of mutual independence between the signals. The weakness of the assumptions makes this a powerful approach, but requires venturing beyond familiar second-order statistics. The objective of this paper is to review some of the approaches that have been recently developed to address this exciting problem, to show how they stem from basic principles, and how they relate to each other.
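The remark about second-order statistics can be checked directly: decorrelation pins down the unmixing matrix only up to an orthogonal rotation, because any rotation of white data is still white. A small NumPy verification (my illustration, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

# Start from data whitened exactly: zero mean, identity covariance.
Z = rng.normal(size=(2, 4000))
Z -= Z.mean(axis=1, keepdims=True)
cov = Z @ Z.T / Z.shape[1]
d, E = np.linalg.eigh(cov)
Z = E @ np.diag(d ** -0.5) @ E.T @ Z

# Rotate the white data by an arbitrary angle.
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
cov_rot = (R @ Z) @ (R @ Z).T / Z.shape[1]
# cov_rot is still (numerically) the identity: second-order statistics
# cannot distinguish the rotation, so BSS must use higher-order structure.
```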
Multichannel Blind Deconvolution: FIR Matrix Algebra and Separation of Multipath Mixtures, 1996
Cited by 73 (0 self)
A general tool for multichannel and multipath problems is given in FIR matrix algebra. With Finite Impulse Response (FIR) filters (or polynomials) assuming the role played by complex scalars in traditional matrix algebra, we adapt standard eigenvalue routines, factorizations, decompositions, and matrix algorithms for use in multichannel/multipath problems. Using abstract algebra/group-theoretic concepts, information-theoretic principles, and the Bussgang property, methods of single-channel filtering and source separation of multipath mixtures are merged into a general FIR matrix framework. Techniques developed for equalization may be applied to source separation and vice versa. Potential applications of these results lie in neural networks with feedforward memory connections, wideband array processing, and in problems with a multi-input, multi-output network having channels between each source and sensor, such as source separation. Particular applications of FIR polynomial matrix alg...
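The core idea, FIR filters taking the place of scalar matrix entries, can be sketched as follows: multiplying two FIR matrices means convolving entries and summing over the inner index. The function name and array layout below are my own hypothetical choices, not the paper's notation.

```python
import numpy as np

def fir_matmul(A, B):
    """Multiply two FIR matrices.

    A has shape (m, k, La): entry A[i, j] is a length-La FIR filter
    (a polynomial in z^-1). B has shape (k, n, Lb). Entry (i, j) of
    the product is sum over l of conv(A[i, l], B[l, j]), exactly as
    scalar products become filter convolutions."""
    m, k, La = A.shape
    k2, n, Lb = B.shape
    assert k == k2, "inner dimensions must match"
    C = np.zeros((m, n, La + Lb - 1))
    for i in range(m):
        for j in range(n):
            for l in range(k):
                C[i, j] += np.convolve(A[i, l], B[l, j])
    return C

# The FIR identity matrix has unit impulses (delta filters) on the diagonal.
I_fir = np.zeros((2, 2, 1))
I_fir[0, 0, 0] = 1.0
I_fir[1, 1, 0] = 1.0
B = np.arange(12.0).reshape(2, 2, 3)   # an arbitrary 2x2 FIR matrix
C = fir_matmul(I_fir, B)               # identity times B recovers B
```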
Independent Component Analysis by General Nonlinear Hebbian-like Learning Rules
Signal Processing, 1998
Cited by 56 (11 self)
A number of neural learning rules have been recently proposed... In this paper, we show that in fact ICA can be performed by very simple Hebbian or anti-Hebbian learning rules, which may have only weak relations to such information-theoretic quantities. Rather surprisingly, practically any nonlinear function can be used in the learning rule, provided only that the sign of the Hebbian/anti-Hebbian term is chosen correctly. In addition to the Hebbian-like mechanism, the weight vector is here constrained to have unit norm, and the data is preprocessed by prewhitening, or sphering. These results imply that one can choose the nonlinearity so as to optimize desired statistical or numerical criteria.
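The sign condition can be probed numerically. For a nonlinearity g, the quantity E{y g(y) - g'(y)} is exactly zero for a Gaussian (Stein's identity), and its sign for a given source indicates which sign of the Hebbian/anti-Hebbian update is the stable choice; with g(u) = u^3 the quantity reduces to the kurtosis itself. The helper below is my own illustration, not code or notation from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def sign_factor(y, g, gprime):
    """Sample estimate of E{y g(y) - g'(y)}: its sign tells which sign
    of the Hebbian-like update is stable for this source distribution
    (a hypothetical helper for illustration)."""
    return np.mean(y * g(y) - gprime(y))

n = 200_000
# Three unit-variance sources with different kurtosis signs.
uniform = rng.uniform(-np.sqrt(3), np.sqrt(3), n)    # sub-Gaussian (kurtosis < 0)
laplace = rng.laplace(scale=1 / np.sqrt(2), size=n)  # super-Gaussian (kurtosis > 0)
gauss = rng.normal(size=n)                           # Gaussian (kurtosis = 0)

g = np.tanh
gp = lambda u: 1 - np.tanh(u) ** 2

f_uni = sign_factor(uniform, g, gp)   # positive for this sub-Gaussian source
f_lap = sign_factor(laplace, g, gp)   # negative for this super-Gaussian source
f_gau = sign_factor(gauss, g, gp)     # approximately zero, by Stein's identity
```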
Free deconvolution for signal processing applications
 IEEE Trans. Inform. Theory
Cited by 23 (14 self)
Situations in many fields of research, such as digital communications, nuclear physics, and mathematical finance, can be modelled with random matrices. When the matrices get large, free probability theory is an invaluable tool for describing the asymptotic behaviour of many systems. It will be explained how free probability can be used to estimate covariance matrices. Multiplicative free deconvolution is shown to be a method which can aid in expressing limit eigenvalue distributions for sample covariance matrices, and to simplify estimators for eigenvalue distributions of covariance matrices. Index Terms—Free probability theory, random matrices, deconvolution, limiting eigenvalue distribution, G-analysis.
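A minimal illustration of why sample covariance estimates need such corrections: for white data, the empirical eigenvalues spread over the Marchenko-Pastur support even though the true covariance is the identity. This is a standard random-matrix fact sketched in NumPy; the dimensions are arbitrary demo choices.

```python
import numpy as np

rng = np.random.default_rng(4)
N, n = 100, 1000          # matrix dimension and sample count; aspect ratio c = N/n
c = N / n

X = rng.normal(size=(N, n))        # white data: true covariance is the identity
sample_cov = X @ X.T / n
eig = np.linalg.eigvalsh(sample_cov)

# Marchenko-Pastur support for identity covariance:
# eigenvalues concentrate on [(1 - sqrt(c))^2, (1 + sqrt(c))^2],
# not at 1, even though every true eigenvalue equals 1.
lo = (1 - np.sqrt(c)) ** 2
hi = (1 + np.sqrt(c)) ** 2
```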
Simple Neuron Models for Independent Component Analysis
Int. Journal of Neural Systems, 1997
Cited by 22 (3 self)
Recently, several neural algorithms have been introduced for Independent Component Analysis. Here we approach the problem from the point of view of a single neuron. First, simple Hebbian-like learning rules are introduced for estimating one of the independent components from sphered data. Some of the learning rules can be used to estimate an independent component which has a negative kurtosis, and the others estimate a component of positive kurtosis. Next, a two-unit system is introduced to estimate an independent component of any kurtosis. The results are then generalized to estimate independent components from non-sphered (raw) mixtures. To separate several independent components, a system of several neurons with linear negative feedback is used. The convergence of the learning rules is rigorously proven without any unnecessary hypotheses on the distributions of the independent components.
Independent component analysis based on nonparametric density estimation
IEEE Trans. Neural Netw., 2004
Cited by 21 (0 self)
In this paper, we introduce a novel independent component analysis (ICA) algorithm, which is truly blind to the particular underlying distribution of the mixed signals. Using a nonparametric kernel density estimation technique, the algorithm simultaneously estimates the unknown probability density functions of the source signals and the unmixing matrix. Following the proposed approach, the blind signal separation framework can be posed as a nonlinear optimization problem, where a closed-form expression of the cost function is available and only the elements of the unmixing matrix appear as unknowns. We conducted a series of Monte Carlo simulations involving linear mixtures of various source signals with different statistical characteristics and sample sizes. The new algorithm not only consistently outperformed all state-of-the-art ICA methods, but also demonstrated the following properties: 1) only a flexible model, capable of learning the source statistics, can consistently achieve an accurate separation of all the mixed signals; 2) adopting a suitably designed optimization framework, it is possible to derive a flexible ICA algorithm that matches the stability and convergence properties of conventional algorithms; 3) a nonparametric approach does not necessarily require large sample sizes in order to outperform methods with fixed or partially adaptive contrast functions. Index Terms—Independent component analysis (ICA), kernel density estimation, nonlinear optimization, nonparametric methods.
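The nonparametric building block here is an ordinary Gaussian kernel density estimate of a source density. A small sketch, with bandwidth chosen by Silverman's rule of thumb; the function and variable names are mine, not the paper's algorithm.

```python
import numpy as np

def kde(samples, x, h):
    """Gaussian kernel density estimate of the sample density at points x,
    with bandwidth h: average of Gaussians centred at each sample."""
    diffs = (x[:, None] - samples[None, :]) / h
    return np.exp(-0.5 * diffs ** 2).sum(axis=1) / (len(samples) * h * np.sqrt(2 * np.pi))

rng = np.random.default_rng(5)
# Unit-variance uniform source: true density is 1/(2*sqrt(3)) on its support.
s = rng.uniform(-np.sqrt(3), np.sqrt(3), 2000)
h = 1.06 * s.std() * len(s) ** -0.2    # Silverman's rule-of-thumb bandwidth
p0 = kde(s, np.array([0.0]), h)[0]     # estimated density at the origin
```

In a KDE-based ICA scheme, estimates like this (and the derived score function) stand in for a fixed parametric contrast function.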
Blind System Identification, 1997
Cited by 21 (1 self)
Blind system identification is a fundamental signal processing technology aimed at retrieving unknown information about a system from its output only. This technology has a wide range of possible applications, such as mobile communications, speech reverberation cancellation, and blind image restoration. This paper reviews a number of recently developed concepts and techniques for blind system identification, including the concept of blind system identifiability in a deterministic framework, the blind techniques of maximum likelihood and subspace for estimating the system's impulse response, and other techniques for direct estimation of the system input. Keywords: System identification, blind techniques, multichannels, equalization, source separation. This work has been supported by the Australian Research Council and the Australian Cooperative Research Centre for Sensor Signal and Information Processing.
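One classic deterministic idea in this literature is the two-channel cross-relation: if y1 = h1*s and y2 = h2*s, then h2*y1 = h1*y2 regardless of the unknown input, so the stacked channel vector spans the null space of a data matrix built from the outputs alone. The sketch below is my own toy illustration (noiseless, length-2 coprime channels), not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(7)
s = rng.normal(size=200)           # unknown input, never used by the estimator
h1 = np.array([1.0, 0.5])          # two coprime FIR channels (toy values)
h2 = np.array([1.0, -0.3])
y1 = np.convolve(s, h1)            # the only observed data
y2 = np.convolve(s, h2)

# Cross-relation h2*y1 = h1*y2 means [h1, h2] annihilates this matrix:
# each row computes (a1*y2)[t] - (a2*y1)[t] for unknowns [a1, a2].
rows = [[y2[t], y2[t - 1], -y1[t], -y1[t - 1]] for t in range(1, len(y1))]
A = np.array(rows)

# The channel estimate is the right singular vector of the smallest
# singular value (the one-dimensional null space, in the noiseless case).
_, _, Vh = np.linalg.svd(A)
v = Vh[-1]
v = v / v[0]                       # fix the inherent scale ambiguity
h1_est, h2_est = v[:2], v[2:]
```

With noisy data the same construction is used, with the smallest singular vector serving as a least-squares channel estimate.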
Blind Carrier Frequency Offset Estimation in SISO, MIMO and Multiuser OFDM Systems
Cited by 16 (1 self)
Relying on a kurtosis-type criterion, we develop a low-complexity blind carrier frequency offset (CFO) estimator for orthogonal frequency-division multiplexing (OFDM) systems. We demonstrate analytically how identifiability and performance of this blind CFO estimator depend on the channel's frequency selectivity and the input distribution. We show that this approach can be applied to blind CFO estimation in multi-input multi-output (MIMO) and multiuser OFDM systems. The issues of channel nulls, multiuser interference, and effects of multiple antennas are addressed analytically and tested via simulations.
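The flavour of a kurtosis-type CFO criterion can be sketched with a simplified toy model (my own construction, not the paper's estimator): a residual CFO leaks inter-carrier interference into the FFT outputs, making constant-modulus QPSK subcarriers look more Gaussian, so a fourth-over-second moment ratio is minimized at the true offset. The model below ignores noise, the channel, and phase accumulation across symbols.

```python
import numpy as np

rng = np.random.default_rng(6)
N, M = 64, 40                 # subcarriers and OFDM symbols (toy sizes)
eps_true = 0.3                # CFO as a fraction of the subcarrier spacing

# QPSK data on every subcarrier of every symbol (constant modulus).
bits = rng.integers(0, 4, size=(M, N))
X = np.exp(1j * (np.pi / 4 + np.pi / 2 * bits))
x = np.fft.ifft(X, axis=1)    # time-domain OFDM symbols
n = np.arange(N)
x_cfo = x * np.exp(2j * np.pi * eps_true * n / N)   # channel applies the CFO

def cost(eps_hat):
    # Kurtosis-type constant-modulus cost E|Y|^4 / (E|Y|^2)^2:
    # equals 1 when the FFT outputs are ICI-free QPSK symbols,
    # and grows as residual CFO mixes the subcarriers.
    y = np.fft.fft(x_cfo * np.exp(-2j * np.pi * eps_hat * n / N), axis=1)
    m2 = np.mean(np.abs(y) ** 2)
    return np.mean(np.abs(y) ** 4) / m2 ** 2

grid = np.round(np.arange(-0.5, 0.51, 0.05), 2)     # candidate fractional CFOs
eps_est = grid[np.argmin([cost(e) for e in grid])]  # grid-search estimate
```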
Simulation-Based Methods for Blind Maximum-Likelihood Filter Identification, 1999
Cited by 13 (9 self)
Blind linear system identification consists of estimating the parameters of a linear time-invariant ...