Results 1–10 of 72
Independent Component Analysis
 Neural Computing Surveys
, 2001
"... A common problem encountered in such disciplines as statistics, data analysis, signal processing, and neural network research, is nding a suitable representation of multivariate data. For computational and conceptual simplicity, such a representation is often sought as a linear transformation of the ..."
Abstract

Cited by 1492 (93 self)
 Add to MetaCart
A common problem encountered in such disciplines as statistics, data analysis, signal processing, and neural network research is finding a suitable representation of multivariate data. For computational and conceptual simplicity, such a representation is often sought as a linear transformation of the original data. Well-known linear transformation methods include, for example, principal component analysis, factor analysis, and projection pursuit. A recently developed linear transformation method is independent component analysis (ICA), in which the desired representation is the one that minimizes the statistical dependence of the components of the representation. Such a representation seems to capture the essential structure of the data in many applications. In this paper, we survey the existing theory and methods for ICA.
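As an illustration of the idea the survey describes — finding a linear transform that minimizes statistical dependence — here is a minimal NumPy sketch of a kurtosis-based fixed-point ICA iteration (FastICA-style) on a synthetic two-source mixture. The sources, mixing matrix, and nonlinearity are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two non-Gaussian sources: uniform (sub-Gaussian) and Laplacian (super-Gaussian).
n = 20000
S = np.vstack([rng.uniform(-1, 1, n), rng.laplace(0, 1, n)])
A = np.array([[1.0, 0.6], [0.4, 1.0]])        # unknown mixing matrix
X = A @ S                                     # observed mixtures

# Whiten the mixtures: zero mean, identity covariance.
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(Xc))
Z = (E / np.sqrt(d)) @ E.T @ Xc

# Fixed-point iteration with the cubic nonlinearity g(u) = u^3 (kurtosis
# contrast), followed by symmetric decorrelation of the unmixing matrix W.
W = rng.standard_normal((2, 2))
for _ in range(100):
    Y = W @ Z
    W = (Y ** 3) @ Z.T / n - 3.0 * np.diag((Y ** 2).mean(axis=1)) @ W
    U, _, Vt = np.linalg.svd(W)
    W = U @ Vt                                # (W W^T)^(-1/2) W

# Each recovered component should match one source up to sign and permutation.
Y = W @ Z
C = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
print(C.max(axis=1))
```

Because the data are whitened first, the unmixing matrix can be kept orthogonal, which is what the symmetric decorrelation step enforces.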
A first application of independent component analysis to extracting structure from stock returns
 INTERNATIONAL JOURNAL OF NEURAL SYSTEMS
, 1997
"... This paper discusses the application of a modern signal processing technique known as independent component analysis (ICA) or blind source separation to multivariate financial time series such as a portfolio of stocks. The key idea of ICA is to linearly map the observed multivariate time series int ..."
Abstract

Cited by 57 (1 self)
 Add to MetaCart
This paper discusses the application of a modern signal processing technique known as independent component analysis (ICA), or blind source separation, to multivariate financial time series such as a portfolio of stocks. The key idea of ICA is to linearly map the observed multivariate time series into a new space of statistically independent components (ICs). This can be viewed as a factorization of the portfolio, since joint probabilities become simple products in the coordinate system of the ICs. We apply ICA to three years of daily returns of the 28 largest Japanese stocks and compare the results with those obtained using principal component analysis. The results indicate that the estimated ICs fall into two categories: (i) infrequent but large shocks (responsible for the major changes in the stock prices), and (ii) frequent smaller fluctuations (contributing little to the overall level of the stocks). We show that the overall stock price can be reconstructed surprisingly well by using a small number of thresholded weighted ICs. In contrast, when using shocks derived from principal components instead of independent components, the reconstructed price is less similar to the original one. Independent component analysis is a potentially powerful method of analyzing and understanding driving mechanisms in financial markets.
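The thresholding idea described above can be illustrated on synthetic data. The components below are simulated stand-ins for estimated ICs (not the paper's actual stock data): two "shock" components with rare large values and eight small-fluctuation components; thresholding retains only the shocks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "independent components" of daily returns: two shock components
# with rare large values, eight components of small everyday fluctuations.
n = 1000
shocks = np.zeros((2, n))
days = rng.choice(n, 10, replace=False)
shocks[0, days[:5]] = rng.normal(0, 5, 5)
shocks[1, days[5:]] = rng.normal(0, 5, 5)
small = rng.normal(0, 0.05, (8, n))
comps = np.vstack([shocks, small])

price = np.cumsum(comps.sum(axis=0))      # cumulative return plays the "price"

# Thresholding keeps only the large entries of each component.
thr = np.where(np.abs(comps) > 1.0, comps, 0.0)
recon = np.cumsum(thr.sum(axis=0))

# Only the shock components survive the threshold.
print(np.count_nonzero(thr[2:]), np.count_nonzero(thr[:2]))
print(np.corrcoef(price, recon)[0, 1])
```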
A bayesian approach for blind separation of sparse sources
 IEEE Transactions on Speech and Audio Processing
, 2005
"... We present a Bayesian approach for blind separation of linear instantaneous mixtures of sources having a sparse representation in a given basis. The distributions of the coefficients of the sources in the basis are modeled by a Student t distribution, which can be expressed as a Scale Mixture of Gau ..."
Abstract

Cited by 49 (9 self)
 Add to MetaCart
We present a Bayesian approach for blind separation of linear instantaneous mixtures of sources having a sparse representation in a given basis. The distributions of the coefficients of the sources in the basis are modeled by a Student t distribution, which can be expressed as a scale mixture of Gaussians, and a Gibbs sampler is derived to estimate the sources, the mixing matrix, the input noise variance, and the hyperparameters of the Student t distributions. The method allows for separation of underdetermined (more sources than sensors) noisy mixtures. Results are presented for audio signals using a Modified Discrete Cosine Transform basis and compared with a finite-mixture-of-Gaussians prior approach. These results show the improved sound quality obtained with the Student t prior and the better robustness of the Markov chain Monte Carlo approach to mixing matrices close to singularity.
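The scale-mixture representation underlying this model is easy to check numerically: a Student t variable can be generated as a Gaussian whose variance is drawn from an inverse-gamma distribution. The sketch below verifies this equivalence through moments; the parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
nu, n = 5.0, 200_000

# lambda ~ Inverse-Gamma(nu/2, nu/2) (rate parameterisation), and
# x | lambda ~ N(0, lambda)   =>   x ~ Student t with nu degrees of freedom.
lam = 1.0 / rng.gamma(shape=nu / 2, scale=2.0 / nu, size=n)
x_smog = rng.normal(0.0, np.sqrt(lam))

x_t = rng.standard_t(nu, size=n)          # direct Student t draws

# For nu > 2 both should have variance nu / (nu - 2).
print(x_smog.var(), x_t.var(), nu / (nu - 2))
```

This representation is what makes the Gibbs sampler tractable: conditioned on the scale variables, the model is Gaussian.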
Beyond independent components: trees and clusters
 Journal of Machine Learning Research
, 2003
"... We present a generalization of independent component analysis (ICA), where instead of looking for a linear transform that makes the data components independent, we look for a transform that makes the data components well fit by a treestructured graphical model. This treedependent component analysi ..."
Abstract

Cited by 42 (0 self)
 Add to MetaCart
We present a generalization of independent component analysis (ICA), where instead of looking for a linear transform that makes the data components independent, we look for a transform that makes the data components well fit by a tree-structured graphical model. This tree-dependent component analysis (TCA) provides a tractable and flexible approach to weakening the assumption of independence in ICA. In particular, TCA allows the underlying graph to have multiple connected components, and thus the method is able to find “clusters” of components such that components are dependent within a cluster and independent between clusters. Finally, we make use of a notion of graphical models for time series due to Brillinger (1996) to extend these ideas to the temporal setting. In particular, we are able to fit models that incorporate tree-structured dependencies among multiple time series.
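The tree-fitting step that TCA relies on can be sketched with the classical Chow–Liu construction: estimate pairwise mutual information and take a maximum-weight spanning tree. The snippet below does this for Gaussian data, where mutual information follows directly from correlations; it is a toy illustration of the tree-structure idea, not the TCA algorithm itself, which also learns the linear transform.

```python
import numpy as np

rng = np.random.default_rng(0)

# Chain-dependent variables: x0 -> x1 -> x2, with x3 independent of the rest.
n = 5000
x0 = rng.standard_normal(n)
x1 = 0.9 * x0 + 0.3 * rng.standard_normal(n)
x2 = 0.9 * x1 + 0.3 * rng.standard_normal(n)
x3 = rng.standard_normal(n)
X = np.vstack([x0, x1, x2, x3])

# Pairwise mutual information under a Gaussian assumption:
# I(xi; xj) = -0.5 * log(1 - rho_ij^2).
rho = np.corrcoef(X)
mi = -0.5 * np.log(1.0 - np.clip(rho ** 2, 0.0, 1.0 - 1e-12))
np.fill_diagonal(mi, 0.0)

# Chow-Liu: maximum-weight spanning tree on the MI graph (Prim's algorithm).
dim = X.shape[0]
in_tree = [0]
edges = []
while len(in_tree) < dim:
    i, j, _ = max(((i, j, mi[i, j]) for i in in_tree
                   for j in range(dim) if j not in in_tree),
                  key=lambda e: e[2])
    edges.append((i, j))
    in_tree.append(j)
print(sorted(tuple(sorted(e)) for e in edges))
```

The recovered tree should contain the true chain edges (0, 1) and (1, 2); the independent variable x3 attaches somewhere with near-zero weight.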
Survey of Sparse and Non-Sparse Methods in Source Separation
, 2005
"... Source separation arises in a variety of signal processing applications, ranging from speech processing to medical image analysis. The separation of a superposition of multiple signals is accomplished by taking into account the structure of the mixing process and by making assumptions about the sour ..."
Abstract

Cited by 35 (1 self)
 Add to MetaCart
Source separation arises in a variety of signal processing applications, ranging from speech processing to medical image analysis. The separation of a superposition of multiple signals is accomplished by taking into account the structure of the mixing process and by making assumptions about the sources. When the information about the mixing process and sources is limited, the problem is called ‘blind’. By assuming that the sources can be represented sparsely in a given basis, recent research has demonstrated that solutions to previously problematic blind source separation problems can be obtained. In some cases, solutions are possible to problems intractable by previous non-sparse methods. Indeed, sparse methods provide a powerful approach to the separation of linear mixtures of independent data. This paper surveys the recent arrival of sparse blind source separation methods and the previously existing non-sparse methods, providing insights and appropriate hooks into the literature along the way.
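The central geometric intuition of sparse methods can be demonstrated in a few lines: when sources are sparse, most observed samples have at most one active source, so the data scatter concentrates along the columns of the mixing matrix, even in the underdetermined case. The directions and sparsity level below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Three sparse sources but only two sensors: an underdetermined mixture.
n = 20000
active = rng.random((3, n)) < 0.05            # each source is rarely active
S = active * rng.standard_normal((3, n)) * 5.0
theta = np.array([0.3, 1.0, 2.0])             # mixing-column directions (radians)
A = np.vstack([np.cos(theta), np.sin(theta)]) # 2 x 3 mixing matrix
X = A @ S

# Where a single source is active, the data point lies exactly along one
# column of A, so a histogram of sample angles peaks at the column directions.
keep = np.linalg.norm(X, axis=0) > 1.0
ang = np.mod(np.arctan2(X[1, keep], X[0, keep]), np.pi)
hist, bin_edges = np.histogram(ang, bins=64, range=(0.0, np.pi))
centers = (bin_edges[:-1] + bin_edges[1:]) / 2
peaks = np.sort(centers[np.argsort(hist)[-3:]])
print(peaks)
```

Recovering the mixing directions is only half of an underdetermined problem; the sources themselves are then typically estimated by sparsity-exploiting assignment or optimization in the transform domain.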
Denoising Source Separation
"... A new algorithmic framework called denoising source separation (DSS) is introduced. The main benefit of this framework is that it allows for easy development of new source separation algorithms which are optimised for specific problems. In this framework, source separation algorithms are constuct ..."
Abstract

Cited by 30 (6 self)
 Add to MetaCart
A new algorithmic framework called denoising source separation (DSS) is introduced. The main benefit of this framework is that it allows for easy development of new source separation algorithms which are optimised for specific problems. In this framework, source separation algorithms are constructed around denoising procedures. The resulting algorithms can range from almost blind to highly specialised source separation algorithms. Both simple linear and more complex nonlinear or adaptive denoising schemes are considered. Some existing independent component analysis algorithms are reinterpreted within the DSS framework, and new, robust blind source separation algorithms are suggested. Although DSS algorithms need not be explicitly based on objective functions, there is often an implicit objective function that is optimised. The exact relation between the denoising procedure and the objective function is derived, and a useful approximation of the objective function is presented. In the experimental section, various DSS schemes are applied extensively to artificial data, to real magnetoencephalograms and to simulated CDMA mobile network signals. Finally, various extensions to the proposed DSS algorithms are considered. These include nonlinear observation mappings, hierarchical models and overcomplete, nonorthogonal feature spaces. With these extensions, DSS appears to have relevance to many existing models of neural information processing.
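The core DSS iteration — project, denoise, re-estimate the projection, renormalise — can be sketched as a power-method-like loop on sphered data. The example below uses a simple moving-average low-pass filter as the denoiser to pull a slow source out of noise; the denoiser and signal model are illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# One smooth source among white noise, mixed into four sensors.
n = 4000
t = np.arange(n)
S = np.vstack([np.sin(2 * np.pi * t / 200), rng.standard_normal((3, n))])
A = rng.standard_normal((4, 4))
X = A @ S

# Sphere (whiten) the mixtures.
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(Xc))
Z = (E / np.sqrt(d)) @ E.T @ Xc

def denoise(sig, width=50):
    # Moving-average low-pass filter acting as the denoiser.
    k = np.ones(width) / width
    return np.convolve(sig, k, mode="same")

# DSS iteration: project, denoise, re-estimate the weights, renormalise.
w = rng.standard_normal(4)
for _ in range(50):
    s = w @ Z
    w = Z @ denoise(s)
    w /= np.linalg.norm(w)

s = w @ Z
print(abs(np.corrcoef(s, S[0])[0, 1]))
```

With a linear denoiser this loop is exactly a power iteration, so the "implicit objective function" the abstract mentions is visible here as the dominant eigenvalue problem the iteration solves.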
Advances in nonlinear blind source separation
 In Proc. of the 4th Int. Symp. on Independent Component Analysis and Blind Signal Separation (ICA2003
, 2003
"... Abstract — In this paper, we briefly review recent advances in blind source separation (BSS) for nonlinear mixing models. After a general introduction to the nonlinear BSS and ICA (independent Component Analysis) problems, we discuss in more detail uniqueness issues, presenting some new results. A f ..."
Abstract

Cited by 30 (2 self)
 Add to MetaCart
In this paper, we briefly review recent advances in blind source separation (BSS) for nonlinear mixing models. After a general introduction to the nonlinear BSS and ICA (Independent Component Analysis) problems, we discuss in more detail uniqueness issues, presenting some new results. A fundamental difficulty in the nonlinear BSS problem, and even more so in the nonlinear ICA problem, is that they are non-unique without extra constraints, which are often implemented by using a suitable regularization. Post-nonlinear mixtures are an important special case, where a nonlinearity is applied to linear mixtures. For such mixtures, the ambiguities are essentially the same as for the linear ICA or BSS problems. In the later part of this paper, various separation techniques proposed for post-nonlinear mixtures and general nonlinear mixtures are reviewed.

I. THE NONLINEAR ICA AND BSS PROBLEMS

Consider N samples of the observed data vector x, modeled by
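The post-nonlinear (PNL) model mentioned above can be written as x = f(A s) with f acting componentwise. The toy snippet below constructs such a mixture and shows that inverting a known invertible f reduces the problem to a linear one, which is the intuition behind PNL mixtures retaining only the linear ICA ambiguities.

```python
import numpy as np

rng = np.random.default_rng(0)

# Post-nonlinear model: x = f(A s), with f = tanh applied componentwise.
n = 1000
S = rng.uniform(-1, 1, (2, n))                # sources
A = np.array([[1.0, 0.5], [0.5, 1.0]])        # linear mixing stage
X = np.tanh(A @ S)                            # observed PNL mixture

# Inverting the (here known) nonlinearity recovers the linear mixture,
# so the remaining ambiguities are those of ordinary linear ICA/BSS.
X_lin = np.arctanh(X)
print(np.allclose(X_lin, A @ S))
```

In practice f is unknown and must itself be estimated, which is where the separation techniques reviewed in the paper come in.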
Extraction of Specific Signals with Temporal Structure
, 2001
"... this article, we propose a simple batch algorithm that guarantees blind extraction of any source signal s i that satisfies for the specific time delay # i the following relations: E[s i (k)s i (k E[s i (k)s j (k i j. (2.1) Because we want to extract only a desired source signal, we c ..."
Abstract

Cited by 29 (7 self)
 Add to MetaCart
In this article, we propose a simple batch algorithm that guarantees blind extraction of any source signal s_i that satisfies, for the specific time delay τ_i, the following relations:

E[s_i(k) s_i(k − τ_i)] ≠ 0,   E[s_i(k) s_j(k − τ_i)] = 0,   ∀ j ≠ i.   (2.1)

Because we want to extract only a desired source signal, we can use a simple processing unit described as y(k) = wᵀ x(k), where y(k) is the output signal (which estimates the specific source signal s_i), k is the sampling number, and w is the weight vector. For the purpose of developing the algorithm, let us first define the following error:

ε(k) = y(k) − b y(k − p),   (2.2)

where b is a coefficient of a simple FIR filter with a single delay z^(−p T_s), where T_s is the sampling period (assumed to be one).
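A closely related classical way to exploit a delayed-correlation condition like (2.1) is an eigendecomposition of the time-delayed covariance matrix (an AMUSE-style procedure), sketched below on a synthetic mixture. This is an illustration of extraction via a known time delay, not the paper's exact batch algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Desired source: periodic with period 25; interferer: white noise.
n = 10000
k = np.arange(n)
S = np.vstack([np.sign(np.sin(2 * np.pi * k / 25)), rng.standard_normal(n)])
A = np.array([[1.0, 0.8], [0.5, 1.0]])
X = A @ S

# Whiten the mixtures.
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(Xc))
Z = (E / np.sqrt(d)) @ E.T @ Xc

# Time-delayed covariance at the source's known lag p = 25, symmetrised.
p = 25
C = Z[:, p:] @ Z[:, :-p].T / (n - p)
C = (C + C.T) / 2

# The eigenvector with the largest eigenvalue extracts the source that is
# strongly correlated with its own delayed copy (the periodic one).
vals, vecs = np.linalg.eigh(C)
w = vecs[:, np.argmax(vals)]
y = w @ Z
print(abs(np.corrcoef(y, S[0])[0, 1]))
```

The white-noise interferer has near-zero autocorrelation at lag 25, so only the periodic source survives as a large eigenvalue — the same selectivity that condition (2.1) expresses.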
A Fast Algorithm for Joint Diagonalization with Nonorthogonal Transformations and its Application to Blind Source Separation
 JOURNAL OF MACHINE LEARNING RESEARCH
, 2004
"... A new efficient algorithm is presented for joint diagonalization of several matrices. The algorithm is based on the Frobeniusnorm formulation of the joint diagonalization problem, and addresses diagonalization with a general, nonorthogonal transformation. The iterative scheme of the algorithm i ..."
Abstract

Cited by 25 (3 self)
 Add to MetaCart
A new efficient algorithm is presented for joint diagonalization of several matrices. The algorithm is based on the Frobenius-norm formulation of the joint diagonalization problem, and addresses diagonalization with a general, non-orthogonal transformation. The iterative scheme of the algorithm is based on a multiplicative update which ensures the invertibility of the diagonalizer. The algorithm's efficiency stems from the special approximation of the cost function resulting in a sparse, block-diagonal Hessian to be used in the computation of the quasi-Newton update step. Extensive numerical simulations illustrate the performance of the algorithm and provide a comparison to other leading diagonalization methods. The results of such comparison demonstrate that the proposed algorithm is a viable alternative to existing state-of-the-art joint diagonalization algorithms.
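The Frobenius-norm formulation referred to above sums the squared off-diagonal entries of B C_k Bᵀ over the target matrices C_k. The snippet below implements that cost and checks that it vanishes at the true diagonalizer B = A⁻¹; it illustrates the objective only, not the paper's multiplicative quasi-Newton update.

```python
import numpy as np

rng = np.random.default_rng(0)

def off_cost(B, Cs):
    """Frobenius-norm joint-diagonalization cost: total squared
    off-diagonal mass of B C B^T over all target matrices."""
    total = 0.0
    for C in Cs:
        M = B @ C @ B.T
        total += np.sum(M ** 2) - np.sum(np.diag(M) ** 2)
    return total

# Matrices that are exactly jointly diagonalizable by B = A^{-1}.
A = rng.standard_normal((4, 4))
Cs = [A @ np.diag(rng.uniform(1, 2, 4)) @ A.T for _ in range(5)]

B_true = np.linalg.inv(A)
B_rand = rng.standard_normal((4, 4))
print(off_cost(B_true, Cs), off_cost(B_rand, Cs))
```

In blind source separation the C_k are typically estimated covariance-like matrices, so they are only approximately jointly diagonalizable, and the cost is minimized rather than driven to zero.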
Free deconvolution for signal processing applications
 IEEE Trans. Inform. Theory
"... Abstract—Situations in many fields of research, such as digital communications, nuclear physics and mathematical finance, can be modelled with random matrices. When the matrices get large, free probability theory is an invaluable tool for describing the asymptotic behaviour of many systems. It will ..."
Abstract

Cited by 23 (14 self)
 Add to MetaCart
Situations in many fields of research, such as digital communications, nuclear physics and mathematical finance, can be modelled with random matrices. When the matrices get large, free probability theory is an invaluable tool for describing the asymptotic behaviour of many systems. It will be explained how free probability can be used to estimate covariance matrices. Multiplicative free deconvolution is shown to be a method which can aid in expressing limit eigenvalue distributions for sample covariance matrices, and to simplify estimators for eigenvalue distributions of covariance matrices.

Index Terms—Free Probability Theory, Random Matrices, deconvolution, limiting eigenvalue distribution, G-analysis.
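A concrete instance of the limiting behaviour discussed here: the eigenvalues of the sample covariance matrix of white Gaussian data follow the Marchenko–Pastur law, whose first two moments are 1 and 1 + c with aspect ratio c = p/n. The check below compares empirical moments to these limits; the dimensions are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample covariance of white Gaussian data: p variables, n observations.
p, n = 200, 800
c = p / n
X = rng.standard_normal((p, n))
W = X @ X.T / n
ev = np.linalg.eigvalsh(W)

# Asymptotic (Marchenko-Pastur) moments: m1 = 1, m2 = 1 + c.
print(ev.mean(), (ev ** 2).mean(), 1 + c)
```

Free deconvolution works with exactly such moment (or transform) relations to undo the effect of finite sampling on the eigenvalue distribution.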