Results 1–10 of 404
Independent Component Analysis
 Neural Computing Surveys
, 2001
"... A common problem encountered in such disciplines as statistics, data analysis, signal processing, and neural network research, is nding a suitable representation of multivariate data. For computational and conceptual simplicity, such a representation is often sought as a linear transformation of the ..."
Abstract

Cited by 1492 (93 self)
A common problem encountered in such disciplines as statistics, data analysis, signal processing, and neural network research is finding a suitable representation of multivariate data. For computational and conceptual simplicity, such a representation is often sought as a linear transformation of the original data. Well-known linear transformation methods include, for example, principal component analysis, factor analysis, and projection pursuit. A recently developed linear transformation method is independent component analysis (ICA), in which the desired representation is the one that minimizes the statistical dependence of the components of the representation. Such a representation seems to capture the essential structure of the data in many applications. In this paper, we survey the existing theory and methods for ICA.
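The estimation idea behind ICA can be sketched numerically. The following is a minimal, illustrative one-unit fixed-point iteration in the style of FastICA, one of the algorithm families such surveys cover; the function names, test signals, and mixing matrix here are invented for the example, not taken from the paper:

```python
import numpy as np

def whiten(x):
    """Center and whiten observations (rows = samples, columns = mixtures)."""
    x = x - x.mean(axis=0)
    d, E = np.linalg.eigh(np.cov(x, rowvar=False))
    return x @ E @ np.diag(d ** -0.5) @ E.T

def one_unit_ica(z, n_iter=200):
    """Estimate one independent direction by a FastICA-style
    fixed-point iteration with a tanh contrast function."""
    rng = np.random.default_rng(0)
    w = rng.standard_normal(z.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        y = z @ w
        g, g_prime = np.tanh(y), 1.0 - np.tanh(y) ** 2
        # Fixed-point update: w <- E[z g(w'z)] - E[g'(w'z)] w, then renormalize
        w_new = (z * g[:, None]).mean(axis=0) - g_prime.mean() * w
        w_new /= np.linalg.norm(w_new)
        if abs(abs(w_new @ w) - 1.0) < 1e-12:
            return w_new
        w = w_new
    return w

# Two independent sources (a sinusoid and a square wave), mixed linearly.
t = np.linspace(0.0, 8.0, 4000)
s = np.c_[np.sin(2.0 * t), np.sign(np.sin(3.0 * t))]
x = s @ np.array([[1.0, 0.5], [0.5, 1.0]]).T
z = whiten(x)
w = one_unit_ica(z)
y = z @ w   # one recovered component, up to sign and scale
```

The recovered component matches one of the original sources up to the usual sign and scale indeterminacies of ICA.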
Blind Signal Separation: Statistical Principles
, 2003
"... Blind signal separation (BSS) and independent component analysis (ICA) are emerging techniques of array processing and data analysis, aiming at recovering unobserved signals or `sources' from observed mixtures (typically, the output of an array of sensors), exploiting only the assumption of mutual i ..."
Abstract

Cited by 390 (4 self)
Blind signal separation (BSS) and independent component analysis (ICA) are emerging techniques of array processing and data analysis that aim at recovering unobserved signals or 'sources' from observed mixtures (typically, the output of an array of sensors), exploiting only the assumption of mutual independence between the signals. The weakness of the assumptions makes this a powerful approach, but it requires venturing beyond familiar second-order statistics. The objective of this paper is to review some of the approaches that have recently been developed to address this exciting problem, to show how they stem from basic principles, and how they relate to each other.
Equivariant Adaptive Source Separation
 IEEE Trans. on Signal Processing
, 1996
"... Source separation consists in recovering a set of independent signals when only mixtures with unknown coefficients are observed. This paper introduces a class of adaptive algorithms for source separation which implements an adaptive version of equivariant estimation and is henceforth called EASI (Eq ..."
Abstract

Cited by 381 (10 self)
Source separation consists in recovering a set of independent signals when only mixtures with unknown coefficients are observed. This paper introduces a class of adaptive algorithms for source separation which implement an adaptive version of equivariant estimation and are henceforth called EASI (Equivariant Adaptive Separation via Independence). The EASI algorithms are based on the idea of serial updating: this specific form of matrix update systematically yields algorithms with a simple, parallelizable structure, for both real and complex mixtures. Most importantly, the performance of an EASI algorithm does not depend on the mixing matrix. In particular, convergence rates, stability conditions, and interference rejection levels depend only on the (normalized) distributions of the source signals. Closed-form expressions of these quantities are given via an asymptotic performance analysis. This is completed by some numerical experiments illustrating the effectiveness of the proposed ap...
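The serial updating idea has a compact form. The sketch below is an illustrative batch version of an EASI-style update, W ← W − λ H(y) W with H(y) = yyᵀ − I + g(y)yᵀ − y g(y)ᵀ; the cubic nonlinearity and all variable names are assumptions for the example, not the paper's code. It checks one consequence of the equivariant fixed point: at a separating solution (independent, zero-mean, unit-variance outputs) the expected update vanishes.

```python
import numpy as np

def easi_step(W, x, lam=0.01, g=lambda y: y ** 3):
    """One batch EASI-style update, W <- W - lam * H(y) W, with
       H(y) = y y^T - I + g(y) y^T - y g(y)^T
    estimated over a block of samples (one observation per row of x).
    The cubic nonlinearity is one common illustrative choice."""
    y = x @ W.T
    n, k = y.shape
    H = (y.T @ y) / n - np.eye(k)      # second-order (whitening) part
    G = (g(y).T @ y) / n
    H += G - G.T                       # skew-symmetric higher-order part
    return W - lam * H @ W

# At a separating solution -- here W = I applied directly to
# independent, zero-mean, unit-variance sources -- the expected
# update is zero, so W barely moves.
rng = np.random.default_rng(1)
s = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=(50_000, 2))
W = np.eye(2)
W_next = easi_step(W, s)
```

In an actual adaptive setting the same update would be applied sample by sample (or block by block) to the observed mixtures, and its performance would be independent of the mixing matrix, which is the equivariance property the abstract emphasizes.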
EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis
 J. Neurosci. Methods
"... Abstract: We have developed a toolbox and graphic user interface, EEGLAB, running under the crossplatform MATLAB environment (The Mathworks, Inc.) for processing collections of singletrial and/or averaged EEG data of any number of channels. Available functions include EEG data, channel and event i ..."
Abstract

Cited by 307 (32 self)
Abstract: We have developed a toolbox and graphical user interface, EEGLAB, running under the cross-platform MATLAB environment (The Mathworks, Inc.) for processing collections of single-trial and/or averaged EEG data of any number of channels. Available functions include EEG data, channel, and event information importing; data visualization (scrolling, scalp map and dipole model plotting, plus multi-trial ERP-image plots); preprocessing (including artifact rejection, filtering, epoch selection, and averaging); Independent Component Analysis (ICA); and time/frequency decompositions, including channel and component cross-coherence supported by bootstrap statistical methods based on data resampling. EEGLAB functions are organized into three layers. Top-layer functions allow users to interact with the data through the graphical interface without needing to use MATLAB syntax. Menu options allow users to tune the behavior of EEGLAB to available memory. Middle-layer functions allow users to customize data processing using command history and interactive 'pop' functions. Experienced MATLAB users can use EEGLAB data structures and stand-alone signal processing functions to write custom and/or batch analysis scripts. Extensive function help and tutorial information are included. A 'plug-in' facility allows easy incorporation of new EEG modules into the main menu. EEGLAB is freely available
Independent Component Analysis Using an Extended Infomax Algorithm for Mixed Sub-Gaussian and Super-Gaussian Sources
, 1999
"... An extension of the infomax algorithm of Bell and Sejnowski (1995) is presented that is able to blindly separate mixed signals with sub and superGaussian source distributions. This was achieved by using a simple type of learning rule first derived by Girolami (1997) by choosing negentropy as a pro ..."
Abstract

Cited by 202 (21 self)
An extension of the infomax algorithm of Bell and Sejnowski (1995) is presented that is able to blindly separate mixed signals with sub- and super-Gaussian source distributions. This was achieved by using a simple type of learning rule first derived by Girolami (1997) by choosing negentropy as a projection pursuit index. Parameterized probability distributions that have sub- and super-Gaussian regimes were used to derive a general learning rule that preserves the simple architecture proposed by Bell and Sejnowski (1995), is optimized using the natural gradient by Amari (1998), and uses the stability analysis of Cardoso and Laheld (1996) to switch between sub- and super-Gaussian regimes. We demonstrate that the extended infomax algorithm is able to easily separate 20 sources with a variety of source distributions. Applied to high-dimensional data from electroencephalographic (EEG) recordings, it is effective at separating artifacts such as eye blinks and line noise from weaker electrical ...
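The switching idea can be sketched in a batch form of the published update, ΔW ∝ (I − K tanh(y)yᵀ − yyᵀ)W, where K = diag(±1) selects the super-Gaussian (+1) or sub-Gaussian (−1) regime. Here K is set from the sign of each component's sample kurtosis; the step size, iteration count, sources, and helper names are invented for the illustration:

```python
import numpy as np

def ext_infomax_step(W, z, lr=0.01):
    """One batch step of an extended-infomax-style rule:
       W <- W + lr * (I - K tanh(y) y^T/n - y y^T/n) W,
    with K = diag(+1/-1) chosen per component from the sign of the
    sample (excess) kurtosis: +1 super-Gaussian, -1 sub-Gaussian."""
    y = z @ W.T
    n, k = y.shape
    kurt = (y ** 4).mean(axis=0) / (y ** 2).mean(axis=0) ** 2 - 3.0
    K = np.diag(np.sign(kurt))
    grad = np.eye(k) - (K @ (np.tanh(y).T @ y) + y.T @ y) / n
    return W + lr * grad @ W

# One sub-Gaussian (uniform) and one super-Gaussian (Laplace) source,
# both with unit variance, mixed instantaneously.
rng = np.random.default_rng(3)
n = 20_000
s = np.c_[rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), n),
          rng.laplace(0.0, 1.0 / np.sqrt(2.0), n)]
x = s @ np.array([[1.0, 0.5], [0.6, 1.0]]).T
x -= x.mean(axis=0)
d, E = np.linalg.eigh(np.cov(x, rowvar=False))
z = x @ E @ np.diag(d ** -0.5) @ E.T       # whitened observations

W = np.eye(2)
for _ in range(3000):
    W = ext_infomax_step(W, z)
y = z @ W.T   # recovered sources, up to order, sign, and scale
```

A plain tanh infomax rule (K fixed at +1) would handle only the super-Gaussian source; the kurtosis-sign switch is what lets this mixture of regimes separate.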
A Blind Source Separation Technique Using Second Order Statistics
, 1997
"... Separation of sources consists in recovering a set of signals of which only instantaneous linear mixtures are observed. In many situations, no a priori information on the mixing matrix is available: the linear mixture should be `blindly' processed. This typically occurs in narrowband array processi ..."
Abstract

Cited by 201 (6 self)
Separation of sources consists in recovering a set of signals of which only instantaneous linear mixtures are observed. In many situations, no a priori information on the mixing matrix is available: the linear mixture should be 'blindly' processed. This typically occurs in narrowband array processing applications when the array manifold is unknown or distorted. This paper introduces a new source separation technique exploiting the time coherence of the source signals. In contrast to other previously reported techniques, the proposed approach relies only on stationary second-order statistics, being based on a joint diagonalization of a set of covariance matrices. Asymptotic performance analysis of this method is carried out; some numerical simulations are provided to illustrate the effectiveness of the proposed method. I. Introduction. In many situations of practical interest, one has to process multidimensional observations of the form: x(t) = y(t) + n(t) = A s(t) + n(t) (1), i.e. x...
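The second-order principle, diagonalizing time-lagged covariance matrices, can be illustrated in its simplest one-lag special case (the approach known as AMUSE); the signals and function name below are assumptions for the sketch, not this paper's full joint-diagonalization algorithm:

```python
import numpy as np

def amuse(x, tau=1):
    """Second-order separation in the one-lag special case (AMUSE):
    whiten, then diagonalize a symmetrized time-lagged covariance.
    Works when the sources have distinct autocorrelations at lag tau."""
    x = x - x.mean(axis=0)
    d, E = np.linalg.eigh(np.cov(x, rowvar=False))
    z = x @ E @ np.diag(d ** -0.5) @ E.T            # whitened data
    R = (z[:-tau].T @ z[tau:]) / (len(z) - tau)     # lagged covariance
    R = (R + R.T) / 2.0                             # symmetrize
    _, U = np.linalg.eigh(R)                        # remaining rotation
    return z @ U

# Two sinusoids with different frequencies (hence different lagged
# autocorrelations), mixed instantaneously.
t = np.arange(10_000) * 0.01
s = np.c_[np.sin(2 * np.pi * 0.5 * t), np.sin(2 * np.pi * 4.0 * t)]
x = s @ np.array([[1.0, 0.6], [0.4, 1.0]]).T
y = amuse(x)   # recovered sources, up to order, sign, and scale
```

Using a single lag requires the lagged autocorrelations to be distinct; jointly diagonalizing covariance matrices at several lags, as this paper proposes, removes that fragility.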
From theory to practice: an overview of MIMO space-time coded wireless systems
 IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS
, 2003
"... This paper presents an overview of recent progress in the area of multipleinput–multipleoutput (MIMO) space–time coded wireless systems. After some background on the research leading to the discovery of the enormous potential of MIMO wireless links, we highlight the different classes of technique ..."
Abstract

Cited by 199 (5 self)
This paper presents an overview of recent progress in the area of multiple-input–multiple-output (MIMO) space–time coded wireless systems. After some background on the research leading to the discovery of the enormous potential of MIMO wireless links, we highlight the different classes of techniques and algorithms proposed that attempt to realize the various benefits of MIMO, including spatial multiplexing and space–time coding schemes. These algorithms are often derived and analyzed under ideal independent fading conditions. We present the state of the art in channel modeling and measurements, leading to a better understanding of actual MIMO gains. Finally, the paper addresses current questions regarding the integration of MIMO links in practical wireless systems and standards.
High-Order Contrasts for Independent Component Analysis
"... This article considers highorder measures of independence for the independent component analysis problem and discusses the class of Jacobi algorithms for their optimization. Several implementations are discussed. We compare the proposed approaches with gradientbased techniques from the algorithmic ..."
Abstract

Cited by 187 (4 self)
This article considers high-order measures of independence for the independent component analysis problem and discusses the class of Jacobi algorithms for their optimization. Several implementations are discussed. We compare the proposed approaches with gradient-based techniques from the algorithmic point of view and also on a set of biomedical data.
Blind Separation of Instantaneous Mixtures of Nonstationary Sources
 IEEE Trans. Signal Processing
, 2000
"... Most ICA algorithms are based on a model of stationary sources. This paper considers exploiting the (possible) nonstationarity of the sources to achieve separation. We introduce two objective functions based on the likelihood and on mutual information in a simple Gaussian non stationary model and w ..."
Abstract

Cited by 126 (11 self)
Most ICA algorithms are based on a model of stationary sources. This paper considers exploiting the (possible) nonstationarity of the sources to achieve separation. We introduce two objective functions based on the likelihood and on mutual information in a simple Gaussian nonstationary model, and we show how they can be optimized, off-line or on-line, by simple yet remarkably efficient algorithms (one is based on a novel joint diagonalization procedure, the other on a Newton-like technique). The paper also includes (limited) numerical experiments and a discussion contrasting non-Gaussian and nonstationary models. 1. INTRODUCTION. The aim of this paper is to develop a blind source separation procedure adapted to source signals with time-varying intensity (such as speech signals). For simplicity, we shall restrict ourselves to the simplest mixture model: X(t) = A S(t) (1), where X(t) = [X1(t), ..., XK(t)]^T is the vector of observations (at time t), A is a fixed unknown K x K inver...
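The nonstationarity principle can be illustrated in a minimal two-block special case: whiten globally, then rotate to diagonalize the covariance of one block. Notably, the sources below are Gaussian, so non-Gaussianity-based ICA could not separate them, while their time-varying variance can; the function name, envelopes, and parameters are invented for this sketch:

```python
import numpy as np

def two_block_separation(x):
    """Nonstationarity-based separation in a minimal two-block form:
    whiten globally, then rotate to diagonalize the covariance of the
    first half of the record. Works when the sources' variance
    profiles differ across the two blocks."""
    x = x - x.mean(axis=0)
    d, E = np.linalg.eigh(np.cov(x, rowvar=False))
    z = x @ E @ np.diag(d ** -0.5) @ E.T
    _, U = np.linalg.eigh(np.cov(z[: len(z) // 2], rowvar=False))
    return z @ U

# Gaussian sources (so non-Gaussianity is of no help) whose variances
# change between the two halves of the record, as with speech-like
# signals of time-varying intensity.
rng = np.random.default_rng(2)
n = 20_000
env1 = np.where(np.arange(n) < n // 2, 0.3, 1.5)
env2 = np.where(np.arange(n) < n // 2, 1.5, 0.3)
s = np.c_[rng.standard_normal(n) * env1, rng.standard_normal(n) * env2]
x = s @ np.array([[1.0, 0.7], [0.3, 1.0]]).T
y = two_block_separation(x)   # recovered up to order, sign, and scale
```

With many blocks instead of two, this becomes a joint diagonalization of block-wise covariance matrices, which is the structure the paper's algorithms exploit.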
Jacobi Angles For Simultaneous Diagonalization.
 SIAM J. Mat. Anal. Appl
, 1996
"... . Simultaneous diagonalization of several matrices can be implemented by a Jacobilike technique. This note gives the required Jacobi angles in close form. Key words. Simultaneous diagonalization, Jacobi iterations, eigenvalues, eigenvectors, structured eigenvalue problem. AMS subject classificati ..."
Abstract

Cited by 120 (3 self)
Simultaneous diagonalization of several matrices can be implemented by a Jacobi-like technique. This note gives the required Jacobi angles in closed form. Key words. Simultaneous diagonalization, Jacobi iterations, eigenvalues, eigenvectors, structured eigenvalue problem. AMS subject classifications. 65F15, 65-04. Introduction. Simultaneous diagonalization of several commuting matrices has recently been considered in [1], mainly motivated by stability and convergence concerns. Exact or approximate simultaneous diagonalization was also independently introduced as a solution to a statistical identification problem [2] (see [3] for a later paper in English). The simultaneous diagonalization algorithm described in these papers is an extension of the Jacobi technique: a joint diagonality criterion is iteratively optimized under plane rotations. The purpose of this note is to complement [1] by giving a closed-form expression for the optimal Jacobi angles. 1. Jacobi angles in closed form. C...
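The one-matrix special case of the closed-form angle is the classical Jacobi rotation, θ = ½ arctan2(2a_pq, a_qq − a_pp), which zeroes one off-diagonal entry per plane rotation; the note derives the analogous closed form for several matrices at once. A sketch of the one-matrix sweep (illustrative names, not the note's algorithm):

```python
import numpy as np

def jacobi_angle(a, p, q):
    """Closed-form rotation angle that zeroes entry (p, q) of a
    symmetric matrix: the one-matrix special case."""
    return 0.5 * np.arctan2(2.0 * a[p, q], a[q, q] - a[p, p])

def jacobi_sweep(a, sweeps=10):
    """Diagonalize a symmetric matrix by cyclic plane rotations."""
    a = a.copy()
    k = a.shape[0]
    V = np.eye(k)
    for _ in range(sweeps):
        for p in range(k - 1):
            for q in range(p + 1, k):
                theta = jacobi_angle(a, p, q)
                c, sn = np.cos(theta), np.sin(theta)
                G = np.eye(k)                 # plane rotation in (p, q)
                G[p, p] = G[q, q] = c
                G[p, q], G[q, p] = sn, -sn
                a = G.T @ a @ G               # zeroes a[p, q]
                V = V @ G                     # accumulate eigenvectors
    return a, V

rng = np.random.default_rng(4)
M = rng.standard_normal((4, 4))
A0 = M + M.T                 # symmetric test matrix
D, V = jacobi_sweep(A0)      # D ~ diagonal, V orthogonal, V^T A0 V = D
```

In the several-matrix setting the same sweep structure is kept, but each angle is chosen in closed form to reduce the off-diagonal energy of all matrices jointly rather than to zero a single entry.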