Results 1–10 of 43
Independent Component Analysis
Neural Computing Surveys, 2001
Abstract

Cited by 1488 (93 self)
A common problem encountered in such disciplines as statistics, data analysis, signal processing, and neural network research is finding a suitable representation of multivariate data. For computational and conceptual simplicity, such a representation is often sought as a linear transformation of the original data. Well-known linear transformation methods include, for example, principal component analysis, factor analysis, and projection pursuit. A recently developed linear transformation method is independent component analysis (ICA), in which the desired representation is the one that minimizes the statistical dependence of the components of the representation. Such a representation seems to capture the essential structure of the data in many applications. In this paper, we survey the existing theory and methods for ICA.
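To make the survey's central idea concrete, the following is a minimal NumPy sketch of a one-unit fixed-point ICA iteration with a kurtosis-based (cubic) nonlinearity, recovering one independent component from a linear mixture of two uniform sources. The mixing matrix, sample size, and nonlinearity are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent, non-Gaussian sources (uniform, hence sub-Gaussian).
n = 5000
S = rng.uniform(-1.0, 1.0, size=(2, n))
A = np.array([[2.0, 1.0], [1.0, 1.5]])   # hypothetical mixing matrix
X = A @ S                                # observed linear mixtures

# Whiten: zero mean, identity covariance.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = (E @ np.diag(d ** -0.5) @ E.T) @ X

# One-unit fixed-point iteration: w <- E{z (w.z)^3} - 3 w, then renormalize.
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(200):
    u = w @ Z
    w_new = (Z * u ** 3).mean(axis=1) - 3.0 * w
    w_new /= np.linalg.norm(w_new)
    if abs(abs(w_new @ w) - 1.0) < 1e-10:
        w = w_new
        break
    w = w_new

# The estimated component should match one source up to sign and scale.
y = w @ Z
corr = np.abs(np.corrcoef(y, S)[0, 1:]).max()
```

On whitened data the iteration typically settles in a few steps; the recovered component matches one of the sources up to sign and permutation, which is the inherent ambiguity of the ICA representation.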
Blind Beamforming for Non-Gaussian Signals
IEE Proceedings-F, 1993
Abstract

Cited by 490 (31 self)
This paper considers an application of blind identification to beamforming. The key point is to use estimates of directional vectors rather than resorting to their hypothesized values. By using estimates of the directional vectors obtained via blind identification, i.e. without knowing the array manifold, beamforming is made robust with respect to array deformations, distortion of the wave front, pointing errors, etc., so that neither array calibration nor physical modeling is necessary. Rather surprisingly, `blind beamformers' may outperform `informed beamformers' in a plausible range of parameters, even when the array is perfectly known to the informed beamformer. The key assumption blind identification relies on is the statistical independence of the sources, which we exploit using fourth-order cumulants. A computationally efficient technique is presented for the blind estimation of directional vectors, based on joint diagonalization of fourth-order cumulant matrices.
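The independence assumption exploited above has a checkable fourth-order signature: cross-cumulants of independent sources vanish, while linear mixing re-introduces them. A small NumPy sketch illustrates this (the Laplacian sources and mixing coefficients are illustrative assumptions, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

def cum4(a, b, c, d):
    """Fourth-order cross-cumulant of zero-mean signals."""
    return (np.mean(a * b * c * d)
            - np.mean(a * b) * np.mean(c * d)
            - np.mean(a * c) * np.mean(b * d)
            - np.mean(a * d) * np.mean(b * c))

# Two independent, non-Gaussian (Laplacian) sources.
s1 = rng.laplace(size=n)
s2 = rng.laplace(size=n)

# Independence => the cross-cumulant is (near) zero ...
c_src = cum4(s1, s1, s1, s2)

# ... but linear mixing re-introduces fourth-order dependence.
x1 = s1 + 0.5 * s2
x2 = 0.5 * s1 + s2
c_mix = cum4(x1, x1, x1, x2)
```

By multilinearity of cumulants, `c_mix` picks up contributions from the (nonzero) auto-cumulants of both sources, which is exactly the structure joint diagonalization of cumulant matrices exploits.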
A fast fixed-point algorithm for independent component analysis
Neural Computation, 1997
Abstract

Cited by 428 (19 self)
Abstract. Independent Subspace Analysis (ISA; Hyvarinen & Hoyer, 2000) is an extension of ICA. In ISA, the components are divided into subspaces; components in different subspaces are assumed independent, whereas components in the same subspace have dependencies. In this paper we describe a fixed-point algorithm for ISA estimation, formulated in analogy to FastICA. In particular, we give a proof of the quadratic convergence of the algorithm, and present simulations that confirm the fast convergence, but also show that the method is prone to convergence to local minima.
Equivariant Adaptive Source Separation
IEEE Trans. on Signal Processing, 1996
Abstract

Cited by 378 (10 self)
Source separation consists in recovering a set of independent signals when only mixtures with unknown coefficients are observed. This paper introduces a class of adaptive algorithms for source separation which implements an adaptive version of equivariant estimation and is henceforth called EASI (Equivariant Adaptive Separation via Independence). The EASI algorithms are based on the idea of serial updating: this specific form of matrix update systematically yields algorithms with a simple, parallelizable structure, for both real and complex mixtures. Most importantly, the performance of an EASI algorithm does not depend on the mixing matrix. In particular, convergence rates, stability conditions, and interference rejection levels depend only on the (normalized) distributions of the source signals. Closed-form expressions of these quantities are given via an asymptotic performance analysis. This is completed by some numerical experiments illustrating the effectiveness of the proposed ap...
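A hedged sketch of a serial update of the EASI form, B <- B - lam * (y y' - I + g(y) y' - y g(y)') B with y = B x, run one sample at a time. The cubic nonlinearity (a common choice for sub-Gaussian sources), step size, and uniform sources are illustrative choices, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Unit-variance, sub-Gaussian (uniform) sources and a fixed mixing matrix.
S = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=(2, n))
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = A @ S

def g(y):
    return y ** 3          # cubic nonlinearity (suits sub-Gaussian sources)

B = np.eye(2)
lam = 0.005                # small constant step size
for t in range(n):
    x = X[:, t]
    y = B @ x
    # Symmetric part (y y' - I) drives whitening; the skew-symmetric
    # part (g(y) y' - y g(y)') drives the separating rotation.
    H = np.outer(y, y) - np.eye(2) + np.outer(g(y), y) - np.outer(y, g(y))
    B -= lam * (H @ B)

# The global system C = B A should approach a scaled permutation matrix.
C = np.abs(B @ A)
Cn = C / C.max(axis=1, keepdims=True)
```

Note the equivariance property described in the abstract: the update multiplies B on the left by a term depending only on y, so the trajectory of C = B A does not depend on the mixing matrix itself.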
Blind Separation of Mixture of Independent Sources Through a Maximum Likelihood Approach
In Proc. EUSIPCO, 1997
Abstract

Cited by 99 (8 self)
In this paper we propose two methods for separating mixtures of independent sources without any precise knowledge of their probability distributions. They are obtained by considering a maximum likelihood solution corresponding to some given distributions of the sources and relaxing this assumption afterward. The first method is especially suited to temporally independent non-Gaussian sources and is based on the use of nonlinear separating functions. The second method is especially suited to correlated sources with distinct spectra and is based on the use of linear separating filters. A theoretical analysis of the performance of the methods has been made. A simple procedure for optimally choosing the separating functions from a given linear space of functions is proposed. Further, in the second method, a simple implementation based on the simultaneous diagonalization of two symmetric matrices is provided. Finally, some numerical and simulation results are given illustrating the performan...
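The second method's simultaneous diagonalization of two symmetric matrices can be sketched as follows: whiten with the zero-lag covariance, then eigendecompose the symmetrized lag-1 covariance, in the spirit of AMUSE-style second-order separation. The AR(1) sources, the lag choice, and the mixing matrix are illustrative assumptions, not this paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000

def ar1(rho, n, rng):
    """AR(1) process; distinct rho values give distinct spectra."""
    e = rng.standard_normal(n)
    s = np.empty(n)
    s[0] = e[0]
    for t in range(1, n):
        s[t] = rho * s[t - 1] + e[t]
    return s

S = np.vstack([ar1(0.9, n, rng), ar1(-0.5, n, rng)])
A = np.array([[1.0, 0.8], [0.3, 1.0]])
X = A @ S
X = X - X.mean(axis=1, keepdims=True)

# Zero-lag and (symmetrized) lag-1 covariance matrices.
R0 = X @ X.T / n
R1 = X[:, 1:] @ X[:, :-1].T / (n - 1)
R1 = (R1 + R1.T) / 2.0

# Whiten with R0, then diagonalize the whitened lag-1 covariance:
# one orthogonal transform diagonalizes both symmetric matrices.
d0, E0 = np.linalg.eigh(R0)
W = np.diag(d0 ** -0.5) @ E0.T
M = W @ R1 @ W.T
M = (M + M.T) / 2.0
_, U = np.linalg.eigh(M)
B = U.T @ W                 # separating matrix

C = np.abs(B @ A)
Cn = C / C.max(axis=1, keepdims=True)
```

The identifiability condition is visible here: the eigenvalues of M are the sources' lag-1 autocorrelations, so the rotation is recovered only when the spectra (and hence those autocorrelations) are distinct.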
A Fast Fixed-Point Algorithm for Independent Component Analysis of Complex-Valued Signals
2000
Abstract

Cited by 84 (1 self)
Separation of complex-valued signals is a frequently arising problem in signal processing. For example, separation of convolutively mixed source signals involves computations on complex-valued signals. In this article it is assumed that the original, complex-valued source signals are mutually statistically independent, and the problem is solved by the independent component analysis (ICA) model. ICA is a statistical method for transforming an observed multidimensional random vector into components that are mutually as independent as possible. In this article, a fast fixed-point type algorithm that is capable of separating complex-valued, linearly mixed source signals is presented and its computational efficiency is shown by simulations. Also, the local consistency of the estimator given by the algorithm is proved.
Blind Separation of Instantaneous Mixtures of Sources Based on Order Statistics
IEEE Trans. Signal Processing, 1996
Abstract

Cited by 62 (11 self)
In this paper we introduce a novel procedure for separating an instantaneous mixture of sources based on order statistics. The method is derived in the general context of independent component analysis, using a contrast function defined in terms of the Kullback-Leibler divergence or of the mutual information. We introduce a discretized form of this contrast permitting its easy estimation through the order statistics. We show that the local contrast property is preserved, and also derive a global contrast exploiting only the information of the support of the distribution (in the case where this support is finite). Some simulations are given illustrating the good performance of the method. 1 Introduction. The problem of separation of sources has recently been the subject of rapid development in the signal processing literature (see, for example, [2]–[5], [7]–[12], [14], [15], ...). We consider here the simplest case where one observes K sequences X1(t), ..., XK(t), each being a li...
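A sketch of estimating a marginal-entropy contrast through order statistics, here via the Vasicek m-spacing entropy estimator (one standard order-statistics estimator; the paper's discretized contrast may differ). For two independent uniform sources, the summed marginal entropies are minimized at the separating rotation, since the joint entropy is rotation-invariant:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 20_000

def spacing_entropy(x, m=None):
    """Vasicek m-spacing entropy estimate built from order statistics."""
    k = len(x)
    if m is None:
        m = int(np.sqrt(k))
    xs = np.sort(x)
    return np.mean(np.log((k / m) * (xs[m:] - xs[:-m])))

S = rng.uniform(-1.0, 1.0, size=(2, n))   # independent uniform sources

def contrast(theta):
    """Sum of marginal entropy estimates after rotating the source pair."""
    c, s = np.cos(theta), np.sin(theta)
    Y = np.array([[c, -s], [s, c]]) @ S
    return spacing_entropy(Y[0]) + spacing_entropy(Y[1])

h_sep = contrast(0.0)          # separated coordinates
h_mix = contrast(np.pi / 4.0)  # maximally mixing rotation
# h_mix > h_sep: the contrast is minimized at the separating rotation.
```

For uniform sources the gap is easy to check analytically: a 45-degree rotation turns each uniform marginal into a triangular one, whose differential entropy is strictly larger.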
A first application of independent component analysis to extracting structure from stock returns
International Journal of Neural Systems, 1997
Abstract

Cited by 57 (1 self)
This paper discusses the application of a modern signal processing technique known as independent component analysis (ICA), or blind source separation, to multivariate financial time series such as a portfolio of stocks. The key idea of ICA is to linearly map the observed multivariate time series into a new space of statistically independent components (ICs). This can be viewed as a factorization of the portfolio, since joint probabilities become simple products in the coordinate system of the ICs. We apply ICA to three years of daily returns of the 28 largest Japanese stocks and compare the results with those obtained using principal component analysis. The results indicate that the estimated ICs fall into two categories: (i) infrequent but large shocks, responsible for the major changes in the stock prices, and (ii) frequent smaller fluctuations, contributing little to the overall level of the stocks. We show that the overall stock price can be reconstructed surprisingly well by using a small number of thresholded weighted ICs. In contrast, when using shocks derived from principal components instead of independent components, the reconstructed price is less similar to the original one. Independent component analysis is a potentially powerful method of analyzing and understanding driving mechanisms in financial markets. There are further ...
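The thresholded-IC reconstruction idea can be sketched on synthetic data: keep only the large-amplitude values of each component (the "shocks") and re-mix. The sparse-shock source model, mixing matrix, and threshold are illustrative assumptions, and the true components are used directly rather than estimated ones.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1000

# Source 1: infrequent large shocks; source 2: frequent small fluctuations.
shocks = rng.standard_normal(n) * (rng.random(n) < 0.05)
noise = 0.1 * rng.standard_normal(n)
S = np.vstack([shocks, noise])
A = np.array([[1.0, 1.0], [0.7, 1.0]])   # hypothetical mixing
X = A @ S                                # two synthetic "return" series

# Keep only component values above a threshold, then re-mix.
thr = 0.3
S_thr = np.where(np.abs(S) > thr, S, 0.0)
X_hat = A @ S_thr

# The thresholded reconstruction tracks the original series closely,
# because the retained shocks carry most of the variance.
corr = np.corrcoef(X[0], X_hat[0])[0, 1]
```

Under this toy model, thresholding discards mostly the small-fluctuation component, which mirrors the paper's observation that a few large IC shocks dominate the price path.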
Neural Approaches to Independent Component Analysis and Source Separation
1996
Abstract

Cited by 56 (9 self)
Independent Component Analysis (ICA) is a recently developed technique that in many cases characterizes the data in a natural way. The main application area of the linear ICA model is blind source separation. Here, unknown source signals are estimated from their unknown linear mixtures using the strong assumption that the sources are mutually independent. In practice, separation can be achieved by using suitable higher-order statistics or nonlinearities. Various neural approaches have recently been proposed for blind source separation and ICA. In this paper, these approaches and the respective learning algorithms are briefly reviewed, and some extensions of the basic ICA model are discussed. 1. Introduction. A recent trend in neural network research is to study various forms of unsupervised learning beyond standard Principal Component Analysis (PCA). Such techniques are often called nonlinear PCA methods. They can be developed from various starting points, usually leading to different ...
Signal Separation by Nonlinear Hebbian Learning
1995
Abstract

Cited by 31 (1 self)
In this paper, we introduce a neural network that can be used for both source separation and the estimation of the basis vectors of ICA. The remainder of the paper is organized as follows. The next section presents the necessary background on ICA and source separation. In the third section, we introduce and justify the basic neural network learning algorithms for signal separation. The fourth section provides mathematical analysis justifying the separation ability of the nonlinear PCA type learning algorithm. The fifth section then introduces the ICA neural network, a three-layer network whose layers perform input data whitening, separation, and ICA basis vector estimation, respectively. In the sixth section, we present experimental results. In the last section, the conclusions of this study are presented, and some possibilities for extending the data model are outlined.
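The first (whitening) layer of such a network can be written down directly from the eigendecomposition of the input covariance, V = D^{-1/2} E^T. A minimal NumPy sketch follows; the fixed mixing matrix and mean offset are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 10_000

# Correlated, nonzero-mean input data.
A = np.array([[1.0, 0.5, 0.2],
              [0.3, 1.0, 0.4],
              [0.1, 0.2, 1.0]])
X = A @ rng.standard_normal((3, n)) + np.array([[1.0], [2.0], [3.0]])

# Whitening layer: V = D^{-1/2} E^T from the input covariance eigensystem.
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(Xc))
V = np.diag(d ** -0.5) @ E.T
Z = V @ Xc

cov_Z = np.cov(Z)   # identity covariance after the whitening layer
```

After this layer the separation problem reduces to finding an orthogonal rotation, which is what the subsequent separation layer learns.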