Independent component analysis using an extended infomax algorithm for mixed sub-Gaussian and super-Gaussian sources (1999)

by T-W Lee
Venue: Neural Comput

Results 1 - 10 of 314

Independent component analysis: algorithms and applications

by A. Hyvärinen, E. Oja - NEURAL NETWORKS, 2000
Abstract - Cited by 851 (10 self)
Abstract not found

Citation Context

...e estimated correctly. They need not be estimated with any great precision: in fact it is enough to estimate whether they are sub- or supergaussian (Cardoso and Laheld, 1996; Hyvärinen and Oja, 1998; Lee et al., 1999). In many cases, in fact, we have enough prior knowledge on the independent components, and we don’t need to estimate their nature from the data. In any case, if the information on the nature of the ...
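
A minimal sketch (not taken from any of the cited papers) of the coarse check the quoted passage refers to: classifying a source estimate as sub- or super-Gaussian from the sign of its excess kurtosis. The Laplacian and uniform test signals are illustrative choices.

```python
import numpy as np

def source_sign(u):
    """Return +1 for a super-Gaussian estimate (positive excess kurtosis),
    -1 for a sub-Gaussian one (negative excess kurtosis)."""
    u = (u - u.mean()) / u.std()
    return 1 if np.mean(u ** 4) - 3.0 > 0 else -1

rng = np.random.default_rng(0)
print(source_sign(rng.laplace(size=50_000)))    # Laplacian source -> +1 (super-Gaussian)
print(source_sign(rng.uniform(-1, 1, 50_000)))  # uniform source  -> -1 (sub-Gaussian)
```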

Kernel independent component analysis

by Francis R. Bach - Journal of Machine Learning Research, 2002
Abstract - Cited by 464 (24 self)
We present a class of algorithms for independent component analysis (ICA) which use contrast functions based on canonical correlations in a reproducing kernel Hilbert space. On the one hand, we show that our contrast functions are related to mutual information and have desirable mathematical properties as measures of statistical dependence. On the other hand, building on recent developments in kernel methods, we show that these criteria can be computed efficiently. Minimizing these criteria leads to flexible and robust algorithms for ICA. We illustrate with simulations involving a wide variety of source distributions, showing that our algorithms outperform many of the presently known algorithms.
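
A minimal sketch, under stated assumptions, of the kind of contrast the abstract describes: a regularized kernel canonical correlation between two candidate source estimates, used as a dependence measure. The Gaussian kernel, the width `sigma`, and the regularizer `kappa` are illustrative choices, not the paper's settings.

```python
import numpy as np
from scipy.linalg import eigh

def centered_gram(x, sigma=1.0):
    """Centered Gaussian-kernel Gram matrix of a standardized 1-D sample."""
    x = (x - x.mean()) / x.std()
    K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * sigma ** 2))
    H = np.eye(len(x)) - np.ones((len(x), len(x))) / len(x)
    return H @ K @ H

def kcca_contrast(y1, y2, sigma=1.0, kappa=2e-2):
    """-0.5*log(1 - rho^2), where rho is the first regularized kernel canonical
    correlation between y1 and y2 (smaller when y1 and y2 are independent)."""
    n = len(y1)
    K1, K2 = centered_gram(y1, sigma), centered_gram(y2, sigma)
    reg = n * kappa / 2.0
    R1, R2 = K1 + reg * np.eye(n), K2 + reg * np.eye(n)
    A = np.block([[np.zeros((n, n)), K1 @ K2],
                  [K2 @ K1, np.zeros((n, n))]])
    B = np.block([[R1 @ R1, np.zeros((n, n))],
                  [np.zeros((n, n)), R2 @ R2]])
    rho = min(abs(eigh(A, B, eigvals_only=True)[-1]), 1 - 1e-12)
    return -0.5 * np.log(1 - rho ** 2)

rng = np.random.default_rng(1)
s = rng.laplace(size=(2, 300))                  # independent sources
x = np.array([[1.0, 0.6], [0.4, 1.0]]) @ s      # linear mixtures of the sources
print(kcca_contrast(s[0], s[1]))                # smaller: the sources are independent
print(kcca_contrast(x[0], x[1]))                # larger: the mixtures are dependent
```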

Face recognition by independent component analysis

by Marian Stewart Bartlett, Javier R. Movellan, Terrence J. Sejnowski - IEEE Transactions on Neural Networks, 2002
Abstract - Cited by 348 (5 self)
Abstract—A number of current face recognition algorithms use face representations found by unsupervised statistical methods. Typically these methods find a set of basis images and represent faces as a linear combination of those images. Principal component analysis (PCA) is a popular example of such methods. The basis images found by PCA depend only on pairwise relationships between pixels in the image database. In a task such as face recognition, in which important information may be contained in the high-order relationships among pixels, it seems reasonable to expect that better basis images may be found by methods sensitive to these high-order statistics. Independent component analysis (ICA), a generalization of PCA, is one such method. We used a version of ICA derived from the principle of optimal information transfer through sigmoidal neurons. ICA was performed on face images in the FERET database under two different architectures, one which treated the images as random variables and the pixels as outcomes, and a second which treated the pixels as random variables and the images as outcomes. The first architecture found spatially local basis images for the faces. The second architecture produced a factorial face code. Both ICA representations were superior to representations based on PCA for recognizing faces across days and changes in expression. A classifier that combined the two ICA representations gave the best performance. Index Terms—Eigenfaces, face recognition, independent component analysis (ICA), principal component analysis (PCA), unsupervised learning.
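
A minimal sketch of the two architectures described above, assuming random matrices in place of FERET face images and scikit-learn's FastICA as a stand-in for the infomax-based ICA the paper uses. The point is only the wiring: which dimension plays the role of the random variables and which the role of the outcomes.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)
n_images, n_pixels, n_components = 100, 32 * 32, 20
X = rng.laplace(size=(n_images, n_pixels))      # stand-in for aligned face images (one per row)

# Architecture I: images are random variables, pixels are outcomes.
# ICA separates the image rows into independent, spatially local basis images.
ica1 = FastICA(n_components=n_components, whiten="unit-variance", random_state=0)
basis_images = ica1.fit_transform(X.T).T        # shape (n_components, n_pixels)

# Architecture II: pixels are random variables, images are outcomes.
# ICA yields statistically independent coefficients per face: a factorial code.
ica2 = FastICA(n_components=n_components, whiten="unit-variance", random_state=0)
factorial_code = ica2.fit_transform(X)          # shape (n_images, n_components)

print(basis_images.shape, factorial_code.shape)  # (20, 1024) (100, 20)
```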

Citation Context

...d, ICA can be seen as doing something akin to nonorthogonal PCA and to cluster analysis; however, when the source models are sub-Gaussian, the relationship between these techniques is less clear. See [30] for a discussion of ICA in the context of sub-Gaussian sources. B. Two Architectures for Performing ICA on Images Let be a data matrix with rows and columns. We can think of each column of as outcome...

Removing Electroencephalographic Artifacts: Comparison between ICA and PCA

by Tzyy-Ping Jung, Colin Humphries, Te-won Lee, Scott Makeig, Martin J. Mckeown, Vicente Iragui, Terrence J. Sejnowski, 1998
Abstract - Cited by 240 (22 self)
Pervasive electroencephalographic (EEG) artifacts associated with blinks, eye-movements, muscle noise, cardiac signals, and line noise pose a major challenge for EEG interpretation and analysis. Here, we propose a generally applicable method for removing a wide variety of artifacts from EEG records based on an extended version of an Independent Component Analysis (ICA) algorithm [2, 12] for performing blind source separation on linear mixtures of independent source signals. Our results show that ICA can effectively separate and remove contamination from a wide variety of artifactual sources in EEG records with results comparing favorably to those obtained using Principal Component Analysis. 1 INTRODUCTION Since the landmark development of electroencephalography (EEG) in 1928 by Berger, scalp EEG has been used as a clinical tool for the diagnosis and treatment of brain diseases, and used as a non-invasive approach for research in the quantitative study of human neurophysiology. Ironic...
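
A minimal sketch of the artifact-removal flow the abstract describes, under stated assumptions: synthetic signals in place of real EEG and scikit-learn's FastICA as a stand-in for the extended infomax algorithm the authors use. Channel data are decomposed into components, components judged artifactual are zeroed, and the remainder is back-projected.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
t = np.linspace(0, 10, 5000)
brain = np.sin(2 * np.pi * 10 * t)                  # stand-in "brain" rhythm
blink = (rng.random(t.size) < 0.002) * 50.0         # spiky, blink-like artifact
line = np.sin(2 * np.pi * 60 * t)                   # line noise
mixing = rng.normal(size=(8, 3))                    # 8 hypothetical electrodes
eeg = (mixing @ np.vstack([brain, blink, line])).T  # shape (n_samples, n_channels)

ica = FastICA(n_components=3, whiten="unit-variance", random_state=0)
components = ica.fit_transform(eeg)                 # (n_samples, n_components)

# Flag artifact components, e.g. by excess kurtosis (blink activity is strongly
# super-Gaussian); the threshold is a crude, illustrative choice.
z = (components - components.mean(0)) / components.std(0)
artifact = (z ** 4).mean(0) - 3.0 > 5.0
components[:, artifact] = 0.0                       # zero out the flagged activations

eeg_clean = ica.inverse_transform(components)       # back-project the remaining components
print("removed components:", np.where(artifact)[0])
```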

Citation Context

...use line noise is sub-Gaussian, the original ICA algorithm (Bell & Sejnowski, 1995), without the extension to sub-Gaussian sources, did not coalesce the line noise in the data into a single component (Lee et al., 1999). ICA decomposition may be useful as well for observing fine details of the spatial structure of ongoing EEG activity in multiple brain areas or neural populations (Jung et al., 1997; Makeig, Jung, B...

A Unifying Information-theoretic Framework for Independent Component Analysis

by Te-won Lee, Mark Girolami, Anthony J. Bell, Terrence J. Sejnowski, 1999
Abstract - Cited by 109 (8 self)
We show that different theories recently proposed for Independent Component Analysis (ICA) lead to the same iterative learning algorithm for blind separation of mixed independent sources. We review those theories and suggest that information theory can be used to unify several lines of research. Pearlmutter and Parra (1996) and Cardoso (1997) showed that the infomax approach of Bell and Sejnowski (1995) and the maximum likelihood estimation approach are equivalent. We show that negentropy maximization also has equivalent properties and therefore all three approaches yield the same learning rule for a fixed nonlinearity. Girolami and Fyfe (1997a) have shown that the nonlinear Principal Component Analysis (PCA) algorithm of Karhunen and Joutsensalo (1994) and Oja (1997) can also be viewed from information-theoretic principles since it minimizes the sum of squares of the fourth-order marginal cumulants and therefore approximately minimizes the mutual information (Comon, 1994). Lambert (19...
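
Purely as an illustration of the shared learning rule the abstract refers to (a re-implementation sketch, not the authors' code), the extended form of the natural-gradient infomax update, delta_W proportional to (I - K tanh(u) u^T - u u^T) W with a diagonal switching matrix K, fits in a few lines of NumPy. The step size, iteration count, and toy sources are arbitrary choices.

```python
import numpy as np

def extended_infomax(x, n_iter=1000, lr=0.05):
    """x: whitened mixtures, shape (n_sources, n_samples); returns an unmixing matrix W."""
    n, m = x.shape
    W = np.eye(n)
    for _ in range(n_iter):
        u = W @ x
        tu = np.tanh(u)
        # Switch each unit between super-Gaussian (+1) and sub-Gaussian (-1) form
        # via the sign of E[sech^2(u)]E[u^2] - E[u tanh(u)].
        k = np.sign(np.mean(1 - tu ** 2, axis=1) * np.mean(u ** 2, axis=1)
                    - np.mean(u * tu, axis=1))
        W += lr * (np.eye(n) - (np.diag(k) @ tu @ u.T + u @ u.T) / m) @ W
    return W

rng = np.random.default_rng(4)
s = np.vstack([rng.laplace(size=20_000),                       # super-Gaussian source
               rng.uniform(-np.sqrt(3), np.sqrt(3), 20_000)])  # sub-Gaussian source
A = np.array([[1.0, 0.5], [0.3, 1.0]])                         # mixing matrix
x = A @ s
x -= x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))                               # whiten the mixtures first
sphere = E @ np.diag(d ** -0.5) @ E.T
W = extended_infomax(sphere @ x)
print(np.round(W @ sphere @ A, 2))   # approximately a scaled permutation matrix
```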

Face Image Analysis by Unsupervised Learning and Redundancy Reduction

by Marian Stewart Bartlett, 1998
Abstract - Cited by 94 (16 self)
Abstract not found

Imaging brain dynamics using independent component analysis

by Tzyy-ping Jung, Scott Makeig, Martin J. Mckeown, Anthony J. Bell, Te-won Lee, Terrence J. Sejnowski - Proceedings of the IEEE
Abstract - Cited by 77 (25 self)
The analysis of electroencephalographic (EEG) and magnetoencephalographic (MEG) recordings is important both for basic brain research and for medical diagnosis and treatment. Independent component analysis (ICA) is an effective method for removing artifacts and separating sources of the brain signals from these recordings. A similar approach is proving useful for analyzing functional magnetic resonance brain imaging (fMRI) data. In this paper, we outline the assumptions underlying ICA and demonstrate its application to a variety of electrical and hemodynamic recordings from the human brain. Keywords—Blind source separation, EEG, fMRI, independent component analysis.

Citation Context

...gs from the human brain. Keywords—Blind source separation, EEG, fMRI, independent component analysis. I. INTRODUCTION Independent component analysis (ICA) refers to a family of related algorithms [1]–[10] that exploit independence to perform blind source separation. In Section II, an ICA algorithm based on the Infomax principle [6] is briefly introduced. In Section III, ICA is applied to electroenceph...

ICA Using Spacings Estimates of Entropy

by Erik G. Learned-Miller, John W. Fisher III, Te-won Lee, Jean-Francois Cardoso, Erkki Oja, Shun-ichi Amari - Journal of Machine Learning Research, 2003
Abstract - Cited by 74 (3 self)
This paper presents a new algorithm for the independent components analysis (ICA) problem based on an efficient entropy estimator. Like many previous methods, this algorithm directly minimizes the measure of departure from independence according to the estimated Kullback-Leibler divergence between the joint distribution and the product of the marginal distributions. We pair this approach with efficient entropy estimators from the statistics literature. In particular, the entropy estimator we use is consistent and exhibits rapid convergence. The algorithm based on this estimator is simple, computationally efficient, intuitively appealing, and outperforms other well known algorithms. In addition, the estimator's relative insensitivity to outliers translates into superior performance by our ICA algorithm on outlier tests. We present favorable comparisons to the Kernel ICA, FAST-ICA, JADE, and extended Infomax algorithms in extensive simulations. We also provide public domain source code for our algorithms.
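
A minimal sketch of an m-spacing entropy estimate of the kind the abstract relies on; the exact normalization and the m = sqrt(N) spacing width below are one common variant, not necessarily the constants used in the paper.

```python
import numpy as np

def spacing_entropy(z, m=None):
    """m-spacing estimate of the differential entropy of a 1-D sample."""
    z = np.sort(np.asarray(z, dtype=float))
    n = z.size
    if m is None:
        m = int(round(np.sqrt(n)))                # common heuristic for the spacing width
    spacings = np.maximum(z[m:] - z[:-m], 1e-12)  # Z_(i+m) - Z_(i), guarded against ties
    return float(np.mean(np.log((n + 1) / m * spacings)))

rng = np.random.default_rng(5)
print(spacing_entropy(rng.normal(size=10_000)))    # ~ 0.5*log(2*pi*e) ≈ 1.419
print(spacing_entropy(rng.uniform(0, 1, 10_000)))  # ~ log(1) = 0
```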

Citation Context

...py, between the joint density of {Y_1, ..., Y_D} and the product of its marginals. The utility of (1) for purposes of the ICA problem has been well documented in the literature (c.f. Comon, 1994, Lee et al., 1999a). Briefly, we note that for mutually independent random variables Y_1, Y_2, ..., Y_D we have $J(Y) = \int p(y_1, y_2, \ldots, y_D)\,\log\frac{p(y_1, y_2, \ldots, y_D)}{p(y_1)\,p(y_2)\cdots p(y_D)}\,dy = \int p(y_1, y_2, \ldots$ ...

Frontal midline EEG dynamics during working memory

by Julie Onton, Arnaud Delorme, Scott Makeig - NeuroImage, 2005
Abstract - Cited by 74 (16 self)
We show that during visual working memory, the electroencephalographic (EEG) process producing 5–7 Hz frontal midline theta (fmθ) activity exhibits multiple spectral modes involving at least three frequency bands and a wide range of amplitudes. The process accounting for the fmθ increase during working memory was separated from 71-channel data by clustering on time/frequency transforms of components returned by independent component analysis (ICA). Dipole models of fmθ component scalp maps were consistent with their generation in or near dorsal anterior cingulate cortex. From trial to trial, theta power of fmθ components varied widely but correlated moderately with theta power in other frontal and left temporal processes. The weak mean increase in frontal midline theta power with increasing memory load, produced entirely by the fmθ components, largely reflected progressively stronger theta activity in a relatively small proportion of trials. During presentations of letter series to be memorized or ignored, fmθ components also exhibited 12–15 Hz low-beta activity that was stronger during memorized than during ignored letter trials, independent of letter duration. The same components produced a brief 3-Hz burst 500 ms after onset of the probe letter following each letter sequence. A new decomposition method, log spectral ICA, applied to normalized log time/frequency transforms of fmθ component Memorize-letter trials, showed that their low-beta activity reflected harmonic energy in continuous, sharp-peaked theta wave trains as well as independent low-beta bursts. Possibly, the observed fmθ process variability may index dynamic adjustments in medial frontal cortex to trial-specific behavioral context and task demands.

Citation Context

...fact components that could not be localized (12%). Presence of eye-blink or eye-movement artifacts was not a criterion for rejection. All remaining data epochs were submitted to extended infomax ICA (Lee et al., 1999) using runica (Makeig et al., 1997) from the EEGLAB toolbox. ICA (Bell and Sejnowski, 1995; Makeig et al., 1996) finds an 'unmixing' matrix (W) that linearly unmixes the original EEG channel data (x)...

Chromatic structure of natural scenes

by Thomas Wachtler, Te-won Lee, Terrence J. Sejnowski, 2001
Abstract - Cited by 50 (5 self)
We applied independent component analysis (ICA) to hyperspectral images in order to learn an efficient representation of color in natural scenes. In the spectra of single pixels, the algorithm found basis functions that had broadband spectra and basis functions that were similar to natural reflectance spectra. When applied to small image patches, the algorithm found some basis functions that were achromatic and others with overall chromatic variation along lines in color space, indicating color opponency. The directions of opponency were not strictly orthogonal. Comparison with principal-component analysis on the basis of statistical measures such as average mutual information, kurtosis, and entropy, shows that the ICA transformation results in much sparser coefficients and gives higher coding efficiency. Our findings suggest that nonorthogonal opponent encoding of photoreceptor signals leads to higher coding efficiency and that ICA may be used to reveal the underlying statistical properties of color information in natural scenes.
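
A minimal sketch of the PCA-versus-ICA comparison described above, assuming synthetic spectra in place of hyperspectral image data and scikit-learn's FastICA in place of the algorithm used in the paper: decompose pixel spectra both ways and compare the sparseness (excess kurtosis) of the resulting coefficients.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(6)
n_pixels, n_bands, n_sources = 5000, 31, 8
spectra = rng.laplace(size=(n_pixels, n_sources)) @ rng.normal(size=(n_sources, n_bands))

def mean_excess_kurtosis(C):
    C = (C - C.mean(0)) / C.std(0)
    return float(np.mean(C ** 4, axis=0).mean() - 3.0)

pca_coeff = PCA(n_components=n_sources).fit_transform(spectra)
ica_coeff = FastICA(n_components=n_sources, whiten="unit-variance",
                    random_state=0).fit_transform(spectra)

# ICA coefficients are expected to be sparser (higher kurtosis) than PCA's.
print("PCA mean excess kurtosis:", mean_excess_kurtosis(pca_coeff))
print("ICA mean excess kurtosis:", mean_excess_kurtosis(ica_coeff))
```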