Results 1 - 6 of 6
Independent Component Analysis
Neural Computing Surveys, 2001
Abstract

Cited by 1492 (93 self)
A common problem encountered in such disciplines as statistics, data analysis, signal processing, and neural network research is finding a suitable representation of multivariate data. For computational and conceptual simplicity, such a representation is often sought as a linear transformation of the original data. Well-known linear transformation methods include, for example, principal component analysis, factor analysis, and projection pursuit. A recently developed linear transformation method is independent component analysis (ICA), in which the desired representation is the one that minimizes the statistical dependence of the components of the representation. Such a representation seems to capture the essential structure of the data in many applications. In this paper, we survey the existing theory and methods for ICA.
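The linear data model this survey is built around can be illustrated with a small sketch (our own toy example; the sources, mixing matrix, and variable names are hypothetical, not taken from the paper):

```python
import numpy as np

# Linear ICA data model x = A s: independent non-gaussian sources s are
# mixed by an unknown matrix A, and ICA looks for an unmixing matrix W
# such that W @ x recovers s (up to permutation and scaling).
rng = np.random.default_rng(0)

n = 10_000
s = np.vstack([
    rng.uniform(-1.0, 1.0, n),   # sub-gaussian source
    rng.laplace(0.0, 1.0, n),    # super-gaussian source
])
A = np.array([[1.0, 0.5],
              [0.3, 1.0]])       # mixing matrix (unknown in practice)
x = A @ s                        # observed mixtures

# With A known (as here, purely for illustration), unmixing is just
# inversion; ICA algorithms must estimate W = A^{-1} from x alone.
W = np.linalg.inv(A)
s_hat = W @ x
print(np.allclose(s_hat, s))     # True
```

In a real application only `x` is observed, which is why the contrast functions and estimation principles surveyed in the paper are needed.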
Fast and robust fixed-point algorithms for independent component analysis
IEEE Trans. Neural Netw., 1999
Abstract

Cited by 511 (34 self)
Independent component analysis (ICA) is a statistical method for transforming an observed multidimensional random vector into components that are statistically as independent from each other as possible. In this paper, we use a combination of two different approaches for linear ICA: Comon's information-theoretic approach and the projection pursuit approach. Using maximum entropy approximations of differential entropy, we introduce a family of new contrast (objective) functions for ICA. These contrast functions enable both the estimation of the whole decomposition by minimizing mutual information, and estimation of individual independent components as projection pursuit directions. The statistical properties of the estimators based on such contrast functions are analyzed under the assumption of the linear mixture model, and it is shown how to choose contrast functions that are robust and/or of minimum variance. Finally, we introduce simple fixed-point algorithms for practical optimization of the contrast functions. These algorithms optimize the contrast functions quickly and reliably.
Independent Component Analysis by Minimization of Mutual Information
1997
Abstract

Cited by 15 (0 self)
Independent component analysis (ICA) is a statistical method for transforming an observed multidimensional random vector into components that are statistically as independent from each other as possible. In this paper, the linear version of the ICA problem is approached from an information-theoretic viewpoint, using Comon's framework of minimizing mutual information of the components. Using maximum entropy approximations of differential entropy, we introduce a family of new contrast (objective) functions for ICA, which can also be considered 1-D projection pursuit indexes. The statistical properties of the estimators based on such contrast functions are analyzed under the assumption of the linear mixture model. It is shown how to choose optimal contrast functions according to different criteria. Novel algorithms for maximizing the contrast functions are then introduced. Hebbian-like learning rules are shown to result from gradient descent methods. Finally, in order to speed up the conv...
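For orientation, the maximum-entropy approximation of negentropy underlying this family of contrast functions (in the notation standard in this line of work, with $y$ standardized and $\nu$ a standard gaussian variable) has the form:

```latex
J(y) \;\approx\; c \,\bigl[\mathrm{E}\{G(y)\} - \mathrm{E}\{G(\nu)\}\bigr]^2
```

where $c$ is a positive constant and $G$ is a nonquadratic function, e.g. $G(u) = \frac{1}{a}\log\cosh(au)$ or $G(u) = -\exp(-u^2/2)$. Maximizing $J$ over projection directions yields both the one-unit contrast functions and the projection pursuit interpretation mentioned in the abstract.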
On Existence and Uniqueness of Solutions in Nonlinear Independent Component Analysis
Proceedings of the 1998 IEEE International Joint Conference on Neural Networks (IJCNN'98), 1998
Abstract

Cited by 5 (1 self)
The question of existence and uniqueness of solutions for nonlinear independent component analysis is addressed. It is shown that if the space of mixing functions (processes) is not limited, there always exists an infinity of solutions. In particular, it is shown how to construct parametrized families of solutions. Unlike in the linear case, the indeterminacies involved are not trivial. Next, it is shown how to utilize some results of complex analysis to obtain uniqueness of solutions. We show that in two dimensions, the solution is unique up to a rotation, provided the mixing function is constrained to be a conformal mapping and some further assumptions hold. We also conjecture that the solution is then strictly unique except in some degenerate cases, since resolving the indeterminacy implied by the rotation is essentially similar to solving the linear ICA problem.
Comparison of Adaptive Independent Component Analysis Algorithms
1998
Abstract

Cited by 5 (3 self)
Independent Component Analysis (ICA) is a recent method for data analysis based on statistical properties of multidimensional data. Many algorithms performing ICA have appeared in recent years, and there is a need to compare the different methods proposed. In this work, five algorithms performing ICA have been set into a common framework for comparing their results on the same inputs. In the first part, basic theoretical concepts are reviewed and ICA is presented. Each algorithm is described with emphasis on the points they have in common. In the first part of the experiments, comparisons are made on a simple, artificial blind source separation problem. Parameters of the problem that have a great influence on the process are given a range of different values, and the consequences for the ICA solution are examined. In the second part, ICA is used for obtaining meaningful projections of a multidimensional data set (Projection Pursuit). The quality of the projection is assessed in various ways. This version ...
Fast ICA for Noisy Data using Gaussian Moments
1999
Abstract

Cited by 2 (0 self)
A novel approach to the problem of estimating the data model of independent component analysis (or blind source separation) in the presence of gaussian noise is introduced. We define the gaussian moments of a random variable as the expectations of the gaussian function (and some related functions) with different scale parameters, and show how the gaussian moments of a random variable can be estimated from noisy observations. This enables us to use gaussian moments as one-unit contrast functions that have no asymptotic bias even in the presence of noise, and that are robust against outliers. To efficiently implement the maximization of the contrast functions based on gaussian moments, a modification of our FastICA algorithm is introduced.
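The key property behind this approach can be demonstrated numerically (a hedged sketch of the idea, not the paper's algorithm; the scale names `b`, `c`, `sigma` are ours): because a gaussian kernel convolved with gaussian noise is again a gaussian kernel with a larger scale, the gaussian moment E[exp(-x^2/(2b^2))] of the clean signal can be estimated from noisy observations alone.

```python
import numpy as np

# For y = x + n with n ~ N(0, sigma^2) and c^2 = b^2 - sigma^2:
#   E_y[exp(-y^2 / (2 c^2))] = (c / b) * E_x[exp(-x^2 / (2 b^2))],
# so the clean gaussian moment is recoverable from noisy data when the
# noise level sigma is known.
rng = np.random.default_rng(2)

n_samples = 2_000_000
sigma = 0.5                       # known noise standard deviation
b = 1.5                           # scale of the desired gaussian moment
c = np.sqrt(b**2 - sigma**2)      # adjusted scale used on the noisy data

x = rng.laplace(size=n_samples)             # clean (unobserved) signal
y = x + rng.normal(0.0, sigma, n_samples)   # what we actually observe

direct = np.exp(-x**2 / (2 * b**2)).mean()                 # needs clean x
from_noisy = (b / c) * np.exp(-y**2 / (2 * c**2)).mean()   # needs only y

print(direct, from_noisy)  # the two estimates agree to sampling error
```

This noise-compensated expectation is what makes gaussian-moment contrast functions asymptotically unbiased under additive gaussian noise.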