Results 1–10 of 303

Independent component analysis: algorithms and applications
 Neural Networks, 2000
Kernel independent component analysis
 Journal of Machine Learning Research, 2002
 Cited by 465 (27 self)
We present a class of algorithms for independent component analysis (ICA) which use contrast functions based on canonical correlations in a reproducing kernel Hilbert space. On the one hand, we show that our contrast functions are related to mutual information and have desirable mathematical properties as measures of statistical dependence. On the other hand, building on recent developments in kernel methods, we show that these criteria can be computed efficiently. Minimizing these criteria leads to flexible and robust algorithms for ICA. We illustrate with simulations involving a wide variety of source distributions, showing that our algorithms outperform many of the presently known algorithms.
Face recognition by independent component analysis
 IEEE Transactions on Neural Networks, 2002
 Cited by 333 (5 self)
A number of current face recognition algorithms use face representations found by unsupervised statistical methods. Typically these methods find a set of basis images and represent faces as a linear combination of those images. Principal component analysis (PCA) is a popular example of such methods. The basis images found by PCA depend only on pairwise relationships between pixels in the image database. In a task such as face recognition, in which important information may be contained in the high-order relationships among pixels, it seems reasonable to expect that better basis images may be found by methods sensitive to these high-order statistics. Independent component analysis (ICA), a generalization of PCA, is one such method. We used a version of ICA derived from the principle of optimal information transfer through sigmoidal neurons. ICA was performed on face images in the FERET database under two different architectures, one which treated the images as random variables and the pixels as outcomes, and a second which treated the pixels as random variables and the images as outcomes. The first architecture found spatially local basis images for the faces. The second architecture produced a factorial face code. Both ICA representations were superior to representations based on PCA for recognizing faces across days and changes in expression. A classifier that combined the two ICA representations gave the best performance. Index Terms—Eigenfaces, face recognition, independent component analysis (ICA), principal component analysis (PCA), unsupervised learning.
Removing Electroencephalographic Artifacts: Comparison between ICA and PCA
 1998
 Cited by 226 (22 self)
Pervasive electroencephalographic (EEG) artifacts associated with blinks, eye movements, muscle noise, cardiac signals, and line noise pose a major challenge for EEG interpretation and analysis. Here, we propose a generally applicable method for removing a wide variety of artifacts from EEG records based on an extended version of an Independent Component Analysis (ICA) algorithm [2, 12] for performing blind source separation on linear mixtures of independent source signals. Our results show that ICA can effectively separate and remove contamination from a wide variety of artifactual sources in EEG records, with results comparing favorably to those obtained using Principal Component Analysis. 1 INTRODUCTION Since the landmark development of electroencephalography (EEG) in 1928 by Berger, scalp EEG has been used as a clinical tool for the diagnosis and treatment of brain diseases, and as a noninvasive approach for research in the quantitative study of human neurophysiology. Ironic...
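The remove-and-reproject step that this kind of artifact rejection rests on can be sketched as follows. In practice the mixing matrix A and source matrix S would come from an ICA algorithm; here they are constructed by hand for a toy demonstration, and all names and values are illustrative:

```python
import numpy as np

def remove_artifact_components(A, S, artifact_idx):
    """Rebuild channel-space data from an ICA decomposition X ~= A @ S,
    with the sources listed in artifact_idx zeroed out."""
    keep = [i for i in range(S.shape[0]) if i not in set(artifact_idx)]
    return A[:, keep] @ S[keep, :]

# Toy demo: two channels, one "brain" source and one sparse "blink" artifact.
rng = np.random.default_rng(0)
S = np.vstack([np.sin(np.linspace(0, 20, 500)),      # brain-like source
               (rng.random(500) > 0.98) * 5.0])      # blink-like spikes
A = np.array([[1.0, 0.9],
              [0.5, 1.2]])                           # channel mixing
X = A @ S                                            # observed EEG channels
X_clean = remove_artifact_components(A, S, artifact_idx=[1])
```

Because the decomposition is linear, zeroing a source and reprojecting removes exactly that source's contribution from every channel while leaving the others untouched.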
A Unifying Information-theoretic Framework for Independent Component Analysis
 1999
 Cited by 104 (8 self)
We show that different theories recently proposed for Independent Component Analysis (ICA) lead to the same iterative learning algorithm for blind separation of mixed independent sources. We review those theories and suggest that information theory can be used to unify several lines of research. Pearlmutter and Parra (1996) and Cardoso (1997) showed that the infomax approach of Bell and Sejnowski (1995) and the maximum likelihood estimation approach are equivalent. We show that negentropy maximization also has equivalent properties, and therefore all three approaches yield the same learning rule for a fixed nonlinearity. Girolami and Fyfe (1997a) have shown that the nonlinear Principal Component Analysis (PCA) algorithm of Karhunen and Joutsensalo (1994) and Oja (1997) can also be viewed from information-theoretic principles since it minimizes the sum of squares of the fourth-order marginal cumulants and therefore approximately minimizes the mutual information (Comon, 1994). Lambert (19...
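The shared learning rule the abstract refers to can be sketched as a natural-gradient infomax update on whitened data. The super-Gaussian (tanh) form, step size, and iteration count below are illustrative choices for a two-source demo, not parameters taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000

# Two super-Gaussian (Laplacian) sources, linearly mixed.
S = rng.laplace(size=(2, n))
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
X = A @ S

# Whiten the mixtures: zero mean, identity covariance.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n)
Wh = E @ np.diag(d ** -0.5) @ E.T
Z = Wh @ X

# Natural-gradient infomax updates (super-Gaussian form of the rule):
#   dW = lr * (I - tanh(Y) Y^T - Y Y^T) W,  Y = W Z
W = np.eye(2)
lr = 0.1
for _ in range(1000):
    Y = W @ Z
    grad = (np.eye(2) - np.tanh(Y) @ Y.T / n - Y @ Y.T / n) @ W
    W = W + lr * grad

recovered = W @ Z  # separated sources, up to order, sign, and scale
```

At convergence the overall transform W @ Wh @ A is close to a scaled permutation, which is exactly the indeterminacy ICA theory predicts.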
Imaging brain dynamics using independent component analysis
 Proceedings of the IEEE
 Cited by 75 (25 self)
The analysis of electroencephalographic (EEG) and magnetoencephalographic (MEG) recordings is important both for basic brain research and for medical diagnosis and treatment. Independent component analysis (ICA) is an effective method for removing artifacts and separating sources of the brain signals from these recordings. A similar approach is proving useful for analyzing functional magnetic resonance brain imaging (fMRI) data. In this paper, we outline the assumptions underlying ICA and demonstrate its application to a variety of electrical and hemodynamic recordings from the human brain. Keywords—Blind source separation, EEG, fMRI, independent component analysis.
Frontal midline EEG dynamics during working memory
 NeuroImage, 2005
 Cited by 71 (16 self)
We show that during visual working memory, the electroencephalographic (EEG) process producing 5–7 Hz frontal midline theta (fmθ) activity exhibits multiple spectral modes involving at least three frequency bands and a wide range of amplitudes. The process accounting for the fmθ increase during working memory was separated from 71-channel data by clustering on time/frequency transforms of components returned by independent component analysis (ICA). Dipole models of fmθ component scalp maps were consistent with their generation in or near dorsal anterior cingulate cortex. From trial to trial, theta power of fmθ components varied widely but correlated moderately with theta power in other frontal and left temporal processes. The weak mean increase in frontal midline theta power with increasing memory load, produced entirely by the fmθ components, largely reflected progressively stronger theta activity in a relatively small proportion of trials. During presentations of letter series to be memorized or ignored, fmθ components also exhibited 12–15 Hz low-beta activity that was stronger during memorized than during ignored letter trials, independent of letter duration. The same components produced a brief 3-Hz burst 500 ms after onset of the probe letter following each letter sequence. A new decomposition method, log spectral ICA, applied to normalized log time/frequency transforms of fmθ component Memorize-letter trials, showed that their low-beta activity reflected harmonic energy in continuous, sharp-peaked theta wave trains as well as independent low-beta bursts. Possibly, the observed fmθ process variability may index dynamic adjustments in medial frontal cortex to trial-specific behavioral context and task demands.
ICA Using Spacings Estimates of Entropy
 Journal of Machine Learning Research, 2003
 Cited by 68 (3 self)
This paper presents a new algorithm for the independent components analysis (ICA) problem based on an efficient entropy estimator. Like many previous methods, this algorithm directly minimizes the measure of departure from independence according to the estimated Kullback-Leibler divergence between the joint distribution and the product of the marginal distributions. We pair this approach with efficient entropy estimators from the statistics literature. In particular, the entropy estimator we use is consistent and exhibits rapid convergence. The algorithm based on this estimator is simple, computationally efficient, intuitively appealing, and outperforms other well-known algorithms. In addition, the estimator's relative insensitivity to outliers translates into superior performance by our ICA algorithm on outlier tests. We present favorable comparisons to the Kernel ICA, FastICA, JADE, and extended Infomax algorithms in extensive simulations. We also provide public domain source code for our algorithms.
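The m-spacing idea behind such estimators can be illustrated with the classic Vasicek estimator, a simple relative of the estimator this abstract refers to; the window rule-of-thumb and edge handling here are illustrative choices, not the paper's:

```python
import numpy as np

def vasicek_entropy(sample, m=None):
    """Vasicek m-spacing estimate of differential entropy.

    Sorts the sample and averages the log of the scaled m-spacings
    x_(i+m) - x_(i-m); indices are clamped at the boundaries.
    """
    x = np.sort(np.asarray(sample, dtype=float))
    n = x.size
    if m is None:
        m = int(round(np.sqrt(n)))          # common rule-of-thumb window
    i = np.arange(n)
    upper = x[np.minimum(i + m, n - 1)]
    lower = x[np.maximum(i - m, 0)]
    return float(np.mean(np.log(n / (2.0 * m) * (upper - lower))))
```

Because wide spacings correspond to low density, the average log-spacing tracks the negative log-density, which is why this simple order-statistics computation approximates differential entropy.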
Micromixers – a review
 Journal of Micromechanics and Microengineering