Results 1–10 of 255
Kernel independent component analysis
 Journal of Machine Learning Research, 2002
Abstract

Cited by 464 (24 self)
We present a class of algorithms for independent component analysis (ICA) which use contrast functions based on canonical correlations in a reproducing kernel Hilbert space. On the one hand, we show that our contrast functions are related to mutual information and have desirable mathematical properties as measures of statistical dependence. On the other hand, building on recent developments in kernel methods, we show that these criteria can be computed efficiently. Minimizing these criteria leads to flexible and robust algorithms for ICA. We illustrate with simulations involving a wide variety of source distributions, showing that our algorithms outperform many of the presently known algorithms.
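The kind of kernel-based dependence measure the abstract describes can be illustrated with a toy regularized kernel canonical correlation between two scalar samples. This is a minimal sketch, not the paper's machinery: the Gaussian kernel, its width, the regularization constant, and the simplified eigenproblem are all assumptions of this example.

```python
import numpy as np

def centered_gram(x, sigma=1.0):
    """Gaussian-kernel Gram matrix of a 1-D sample, centered in feature space."""
    d2 = (x[:, None] - x[None, :]) ** 2
    K = np.exp(-d2 / (2 * sigma ** 2))
    n = len(x)
    H = np.eye(n) - 1.0 / n          # centering matrix I - (1/n) 11^T
    return H @ K @ H

def kcca_dependence(x, y, sigma=1.0, kappa=1e-2):
    """First regularized kernel canonical correlation between samples x and y.
    Near 0 for independent variables, near 1 under strong dependence.
    (Simplified eigenproblem for illustration, not the paper's algorithm.)"""
    n = len(x)
    Kx, Ky = centered_gram(x, sigma), centered_gram(y, sigma)
    Rx = Kx + n * kappa * np.eye(n)  # regularized Gram matrices
    Ry = Ky + n * kappa * np.eye(n)
    M = np.linalg.solve(Rx, Kx) @ np.linalg.solve(Ry, Ky)
    rho2 = np.max(np.linalg.eigvals(M).real)
    return float(np.sqrt(max(rho2, 0.0)))
```

Minimizing such a pairwise dependence measure over candidate demixing matrices is the kind of contrast-function minimization the paper develops.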
Blind Source Separation by Sparse Decomposition in a Signal Dictionary, 2000
Abstract

Cited by 274 (34 self)
Introduction. In blind source separation, an N-channel sensor signal x(t) arises from M unknown scalar source signals s_i(t), linearly mixed together by an unknown N × M matrix A and possibly corrupted by additive noise ξ(t):

x(t) = A s(t) + ξ(t)    (1.1)

We wish to estimate the mixing matrix A and the M-dimensional source signal s(t). Many natural signals can be sparsely represented in a proper signal dictionary:

s_i(t) = Σ_{k=1}^{K} C_{ik} φ_k(t)    (1.2)

The scalar functions φ_k
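The two equations above can be instantiated directly. The sketch below uses a cosine dictionary and small dimensions purely for illustration; the dictionary choice, sizes, and noise level are assumptions of this example, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, T, K = 3, 2, 512, 16            # sensors, sources, samples, dictionary atoms

# Eq. (1.2): sources synthesised from a few atoms of a cosine dictionary
t = np.arange(T)
Phi = np.cos(np.pi * np.outer(np.arange(K), t + 0.5) / T)   # K atoms phi_k(t)
C = np.zeros((M, K))
C[0, 2], C[1, 7] = 1.0, 0.8           # sparse coefficients C_ik: two active atoms
s = C @ Phi                           # s_i(t) = sum_k C_ik phi_k(t)

# Eq. (1.1): unknown linear mixing plus additive noise
A = rng.standard_normal((N, M))       # unknown N x M mixing matrix
x = A @ s + 0.01 * rng.standard_normal((N, T))
```

The sparsity of C (here only two nonzero coefficients) is exactly what the paper's decomposition exploits when estimating A and s(t).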
Convolutive speech bases and their application to supervised speech separation
 IEEE Transactions on Audio, Speech and Language Processing, 2007
Abstract

Cited by 94 (7 self)
In this paper we present a convolutive basis decomposition method and its application to separating simultaneous speakers from monophonic recordings. The model we propose is a convolutive version of the nonnegative matrix factorization algorithm. Due to the nonnegativity constraint, this type of coding is well suited to representing magnitude spectra intuitively and efficiently. We present results that reveal the nature of these basis functions and demonstrate their utility in separating monophonic mixtures of known speakers.
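The convolutive model approximates a magnitude spectrogram V as a sum of time-shifted basis contributions, V ≈ Σ_t W_t · shift_t(H). The sketch below fits that model with simplified multiplicative KL-style updates in the spirit of convolutive NMF; the update details (in particular the averaged H step) are an assumption of this sketch, not the authors' exact algorithm.

```python
import numpy as np

def shift(X, t):
    """Shift columns of X right by t (left if t < 0), zero-filling."""
    out = np.zeros_like(X)
    if t == 0:
        out[:] = X
    elif t > 0:
        out[:, t:] = X[:, :-t]
    else:
        out[:, :t] = X[:, -t:]
    return out

def conv_nmf(V, R=2, T=3, iters=150, seed=0):
    """Convolutive NMF sketch:  V ≈ sum_t W[t] @ shift(H, t),  V, W, H >= 0."""
    rng = np.random.default_rng(seed)
    F, N = V.shape
    W = rng.random((T, F, R)) + 0.1
    H = rng.random((R, N)) + 0.1
    eps = 1e-9
    ones = np.ones((F, N))
    for _ in range(iters):
        Lam = sum(W[t] @ shift(H, t) for t in range(T)) + eps
        for t in range(T):                          # multiplicative update of W[t]
            Ht = shift(H, t)
            W[t] *= ((V / Lam) @ Ht.T) / (ones @ Ht.T + eps)
        Lam = sum(W[t] @ shift(H, t) for t in range(T)) + eps
        num = sum(W[t].T @ shift(V / Lam, -t) for t in range(T))
        den = sum(W[t].T @ ones for t in range(T)) + eps
        H *= num / den                              # averaged update of H
    return W, H
```

In the supervised setting the abstract describes, one such basis W would be learned per speaker, and a mixture explained with the concatenated bases.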
Algorithms for numerical analysis in high dimensions
 SIAM J. Sci. Comput., 2005
Abstract

Cited by 90 (11 self)
Nearly every numerical analysis algorithm has computational complexity that scales exponentially in the underlying physical dimension. The separated representation, introduced previously, allows many operations to be performed with scaling that is formally linear in the dimension. In this paper we further develop this representation by: (i) discussing the variety of mechanisms that allow it to be surprisingly efficient; (ii) addressing the issue of conditioning; (iii) presenting algorithms for solving linear systems within this framework; and (iv) demonstrating methods for dealing with antisymmetric functions, as arise in the multiparticle Schrödinger equation in quantum mechanics. Numerical examples are given.
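One mechanism behind the linear-in-dimension scaling is easy to show concretely: if f is stored as a sum of r separable rank-one terms, an inner product over an n^d tensor grid costs O(r² d n) one-dimensional dot products instead of O(n^d) operations. The (r, d, n) storage layout below is an assumption of this sketch, not the paper's data structure.

```python
import numpy as np

def sep_inner(F, G):
    """Inner product of two separated representations over an n**d grid.
    F, G have shape (r, d, n): term l stores d one-dimensional factors, and
    f = sum_l prod_i F[l, i, :] evaluated on the tensor-product grid.
    <f, g> = sum_{l,m} prod_i <F[l,i], G[m,i]> -- no n**d grid ever formed."""
    rF, d, n = F.shape
    rG = G.shape[0]
    return sum(np.prod([F[l, i] @ G[m, i] for i in range(d)])
               for l in range(rF) for m in range(rG))
```

For d = 20 and n = 100 the full grid has 10^40 points, while the separated inner product touches only r² · d · n numbers, which is the "surprisingly efficient" behaviour the abstract refers to.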
ICA Using Spacings Estimates of Entropy
 Journal of Machine Learning Research, 2003
Abstract

Cited by 74 (3 self)
This paper presents a new algorithm for the independent components analysis (ICA) problem based on an efficient entropy estimator. Like many previous methods, this algorithm directly minimizes the measure of departure from independence according to the estimated Kullback-Leibler divergence between the joint distribution and the product of the marginal distributions. We pair this approach with efficient entropy estimators from the statistics literature. In particular, the entropy estimator we use is consistent and exhibits rapid convergence. The algorithm based on this estimator is simple, computationally efficient, intuitively appealing, and outperforms other well-known algorithms. In addition, the estimator's relative insensitivity to outliers translates into superior performance by our ICA algorithm on outlier tests. We present favorable comparisons to the Kernel ICA, FASTICA, JADE, and extended Infomax algorithms in extensive simulations. We also provide public domain source code for our algorithms.
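The kind of spacings estimator the abstract refers to can be sketched in a few lines. This is the classical m-spacing form due to Vasicek; the exact variant and tuning used in the paper may differ.

```python
import numpy as np

def mspacing_entropy(x, m=None):
    """m-spacing estimate of differential entropy:
    H ≈ mean_i log( (n/m) * (x_(i+m) - x_(i)) )  over the sorted sample."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    if m is None:
        m = max(1, int(round(np.sqrt(n))))    # a common m ~ sqrt(n) choice
    gaps = np.maximum(x[m:] - x[:-m], 1e-12)  # m-spacings, guarded against log 0
    return float(np.mean(np.log(n / m * gaps)))
```

Summing such marginal entropy estimates over the estimated components gives (up to the log-determinant of the demixing matrix) the Kullback-Leibler departure-from-independence measure mentioned in the abstract.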
Beyond independent components: trees and clusters
 Journal of Machine Learning Research, 2003
Abstract

Cited by 56 (0 self)
We present a generalization of independent component analysis (ICA), where instead of looking for a linear transform that makes the data components independent, we look for a transform that makes the data components well fit by a tree-structured graphical model. This tree-dependent component analysis (TCA) provides a tractable and flexible approach to weakening the assumption of independence in ICA. In particular, TCA allows the underlying graph to have multiple connected components, and thus the method is able to find “clusters” of components such that components are dependent within a cluster and independent between clusters. Finally, we make use of a notion of graphical models for time series due to Brillinger (1996) to extend these ideas to the temporal setting. In particular, we are able to fit models that incorporate tree-structured dependencies among multiple time series.
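For fixed components, the best tree-structured model is given by the classical Chow-Liu construction: a maximum-weight spanning tree under pairwise mutual information. The sketch below (histogram MI estimate, Prim's algorithm) illustrates only that inner step; TCA also optimizes over the linear transform, which is not shown, and the bin count is an assumption of this example.

```python
import numpy as np

def pairwise_mi(X, bins=8):
    """Histogram estimate of mutual information between all column pairs of X."""
    n, d = X.shape
    Q = np.empty((n, d), dtype=int)
    for j in range(d):                      # equal-count discretization per column
        edges = np.quantile(X[:, j], np.linspace(0, 1, bins + 1)[1:-1])
        Q[:, j] = np.searchsorted(edges, X[:, j])
    MI = np.zeros((d, d))
    for a in range(d):
        for b in range(a + 1, d):
            joint = np.histogram2d(Q[:, a], Q[:, b], bins=bins)[0] / n
            marg = np.outer(joint.sum(1), joint.sum(0))
            nz = joint > 0
            MI[a, b] = MI[b, a] = np.sum(joint[nz] * np.log(joint[nz] / marg[nz]))
    return MI

def chow_liu_edges(MI):
    """Maximum-weight spanning tree (Prim) over the MI weights -> edge list."""
    d = MI.shape[0]
    in_tree, edges = {0}, []
    while len(in_tree) < d:
        a, b = max(((a, b) for a in in_tree for b in range(d) if b not in in_tree),
                   key=lambda e: MI[e])
        edges.append((min(a, b), max(a, b)))
        in_tree.add(b)
    return edges
```

On data generated from a chain X0 → X1 → X2 with an independent X3, the recovered tree connects the chain and leaves X3 only weakly attached, which is the "clusters of components" behaviour described above.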
Principal Manifolds and Bayesian Subspaces for Visual Recognition
 International Conference on Computer Vision, 1999
Abstract

Cited by 51 (1 self)
We investigate the use of linear and nonlinear principal manifolds for learning low-dimensional representations for visual recognition. Three techniques, Principal Component Analysis (PCA), Independent Component Analysis (ICA), and Nonlinear PCA (NLPCA), are examined and tested in a visual recognition experiment using a large gallery of facial images from the “FERET” database. We compare the recognition performance of a nearest-neighbour matching rule with each principal manifold representation to that of a maximum a posteriori (MAP) matching rule using a Bayesian similarity measure derived from probabilistic subspaces, and demonstrate the superiority of the latter.
Chromatic structure of natural scenes, 2001
Abstract

Cited by 50 (5 self)
We applied independent component analysis (ICA) to hyperspectral images in order to learn an efficient representation of color in natural scenes. In the spectra of single pixels, the algorithm found basis functions that had broadband spectra and basis functions that were similar to natural reflectance spectra. When applied to small image patches, the algorithm found some basis functions that were achromatic and others with overall chromatic variation along lines in color space, indicating color opponency. The directions of opponency were not strictly orthogonal. Comparison with principal-component analysis on the basis of statistical measures such as average mutual information, kurtosis, and entropy shows that the ICA transformation results in much sparser coefficients and gives higher coding efficiency. Our findings suggest that non-orthogonal opponent encoding of photoreceptor signals leads to higher coding efficiency and that ICA may be used to reveal the underlying statistical properties of color information in natural scenes.
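Kurtosis, one of the statistical measures used in the comparison above, is straightforward to compute and directly quantifies how sparse a coefficient distribution is. A small illustration on synthetic samples, not the paper's data:

```python
import numpy as np

def excess_kurtosis(c):
    """Excess kurtosis of a coefficient sample: 0 for a Gaussian code,
    positive for sparse, heavy-tailed codes."""
    c = np.asarray(c, dtype=float)
    c = c - c.mean()
    return float(np.mean(c ** 4) / np.mean(c ** 2) ** 2 - 3.0)
```

ICA coefficients being sparser than PCA coefficients shows up as larger excess kurtosis; a Laplacian-distributed code, for instance, has excess kurtosis 3 versus 0 for a Gaussian one.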
Denoising Source Separation
Abstract

Cited by 49 (7 self)
A new algorithmic framework called denoising source separation (DSS) is introduced. The main benefit of this framework is that it allows for easy development of new source separation algorithms which are optimised for specific problems. In this framework, source separation algorithms are constructed around denoising procedures. The resulting algorithms can range from almost blind to highly specialised source separation algorithms. Both simple linear and more complex nonlinear or adaptive denoising schemes are considered. Some existing independent component analysis algorithms are reinterpreted within the DSS framework and new, robust blind source separation algorithms are suggested. Although DSS algorithms need not be explicitly based on objective functions, there is often an implicit objective function that is optimised. The exact relation between the denoising procedure and the objective function is derived and a useful approximation of the objective function is presented. In the experimental section, various DSS schemes are applied extensively to artificial data, to real magnetoencephalograms and to simulated CDMA mobile network signals. Finally, various extensions to the proposed DSS algorithms are considered. These include nonlinear observation mappings, hierarchical models and overcomplete, nonorthogonal feature spaces. With these extensions, DSS appears to have relevance to many existing models of neural information processing.
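The core DSS iteration is short enough to sketch: on sphered (whitened) data, alternate between projecting onto a direction w, denoising the resulting source estimate, and re-estimating w from the denoised signal. This is a minimal one-unit version with a toy moving-average denoiser; the framework's many specialised denoisers and deflation schemes are not shown, and the specific denoiser here is an assumption of this example.

```python
import numpy as np

def dss_one_unit(X, denoise, iters=50, seed=0):
    """One-unit DSS on sphered, zero-mean data X of shape (dim, T)."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(X.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(iters):
        s = w @ X                    # current source estimate
        w = X @ denoise(s)           # re-estimate the direction from denoised s
        w /= np.linalg.norm(w)       # renormalize (sphered data)
    return w, w @ X

def lowpass(s, k=10):
    """Toy denoiser: moving average, favouring slowly varying sources."""
    return np.convolve(s, np.ones(k) / k, mode="same")
```

With a linear denoiser like this one the iteration reduces to a power method on a filtered covariance; plugging in nonlinear or adaptive denoisers yields the "almost blind to highly specialised" spectrum of algorithms the abstract describes.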
Tensor decompositions, state of the art and applications
 Mathematics in Signal Processing V