Results 1-10 of 41
Supervised nonlinear spectral unmixing using a postnonlinear mixing model for hyperspectral images
 Univ
, 2011
"... Abstract—This paper presents a nonlinear mixing model for hyperspectral image unmixing. The proposed model assumes that the pixel reflectances are nonlinear functions of pure spectral components contaminated by an additive white Gaussian noise. These nonlinear functions are approximated using polyno ..."
Abstract

Cited by 35 (19 self)
 Add to MetaCart
(Show Context)
Abstract—This paper presents a nonlinear mixing model for hyperspectral image unmixing. The proposed model assumes that the pixel reflectances are nonlinear functions of pure spectral components contaminated by additive white Gaussian noise. These nonlinear functions are approximated using polynomial functions, leading to a polynomial postnonlinear mixing model. A Bayesian algorithm and optimization methods are proposed to estimate the parameters involved in the model. The performance of the unmixing strategies is evaluated by simulations conducted on synthetic and real data. Index Terms—Hyperspectral imagery, postnonlinear model, spectral unmixing (SU).
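The polynomial post-nonlinear model summarised above can be sketched as a generative simulation. The dimensions, the nonlinearity coefficient `b`, and the noise level below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: L spectral bands, R endmembers.
L, R = 50, 3
M = rng.uniform(0.0, 1.0, size=(L, R))   # endmember spectra (assumed known)
a = rng.dirichlet(np.ones(R))            # abundances: nonnegative, sum to one

b = 0.3                                  # nonlinearity parameter (assumed value)
sigma = 0.01                             # noise standard deviation (assumed value)

lin = M @ a                              # linear mixture
# Polynomial post-nonlinearity applied band-wise, plus additive white Gaussian noise.
y = lin + b * lin * lin + sigma * rng.standard_normal(L)
```

Setting `b = 0` recovers the ordinary linear mixing model, which is why the same framework supports nonlinearity detection.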
Blind Separation of Postnonlinear Mixtures using Linearizing Transformations and Temporal Decorrelation
 JOURNAL OF MACHINE LEARNING RESEARCH
, 2003
"... We propose two methods that reduce the postnonlinear blind source separation problem (PNLBSS) to a linear BSS problem. The first method is based on the concept of maximal correlation: we apply the alternating conditional expectation (ACE) algorithma powerful technique from nonparametric stati ..."
Abstract

Cited by 26 (2 self)
 Add to MetaCart
We propose two methods that reduce the postnonlinear blind source separation problem (PNLBSS) to a linear BSS problem. The first method is based on the concept of maximal correlation: we apply the alternating conditional expectation (ACE) algorithm, a powerful technique from nonparametric statistics, to approximately invert the componentwise nonlinear functions. The second method is a Gaussianizing transformation, which is motivated by the fact that linearly mixed signals before nonlinear transformation are approximately Gaussian distributed. This heuristic but simple and efficient procedure works as well as the ACE method. Using the framework provided by ACE, convergence can be proven. The optimal transformations obtained by ACE coincide with the sought-after inverse functions of the nonlinearities. After equalizing the nonlinearities, temporal decorrelation separation (TDSEP) allows us to recover the source signals. Numerical simulations testing "ACE-TD" and "Gauss-TD" on realistic examples are performed with excellent results.
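The Gaussianizing step can be sketched as a rank-based transform: each observed component is mapped through its empirical CDF and then through the inverse standard-normal CDF. The tanh distortion and sample size below are illustrative assumptions, and the subsequent linear BSS stage (e.g. TDSEP) is omitted:

```python
import numpy as np
from statistics import NormalDist

def gaussianize(x):
    """Rank-based Gaussianization of one observed component.

    Replaces each sample by the standard-normal quantile of its empirical
    CDF value, approximately inverting an unknown monotone (invertible)
    post-nonlinearity."""
    n = len(x)
    ranks = np.argsort(np.argsort(x))        # ranks 0..n-1
    u = (ranks + 0.5) / n                    # empirical CDF values in (0, 1)
    nd = NormalDist()
    return np.array([nd.inv_cdf(p) for p in u])

# Demo: a tanh post-nonlinearity applied to an (approximately Gaussian)
# linearly mixed component.
rng = np.random.default_rng(1)
z = rng.standard_normal(2000)                # linearly mixed signal
x = np.tanh(z)                               # distorted observation
g = gaussianize(x)                           # ≈ z up to monotone rescaling
```

After Gaussianizing each channel this way, the problem is linear and any second-order BSS method can be applied.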
Nonlinear Independent Factor Analysis by Hierarchical Models
 in Proc. 4th Int. Symp. on Independent Component Analysis and Blind Signal Separation (ICA2003)
, 2003
"... The building blocks introduced earlier by us in [1] are used for constructing a hierarchical nonlinear model for nonlinear factor analysis. We call the resulting method hierarchical nonlinear factor analysis (HNFA). The variational Bayesian learning algorithm used in this method has a linear computa ..."
Abstract

Cited by 25 (13 self)
 Add to MetaCart
(Show Context)
The building blocks introduced earlier by us in [1] are used for constructing a hierarchical nonlinear model for nonlinear factor analysis. We call the resulting method hierarchical nonlinear factor analysis (HNFA). The variational Bayesian learning algorithm used in this method has linear computational complexity, and it is able to infer the structure of the model in addition to estimating the unknown parameters. We show how nonlinear mixtures can be separated by first estimating a nonlinear subspace using HNFA and then rotating the subspace using linear independent component analysis. Experimental results show that the cost function minimised during learning predicts the quality of the estimated subspace well.
Nonlinear Unmixing of Hyperspectral Data Based on a Linear-Mixture/Nonlinear-Fluctuation Model
"... Abstract—Spectral unmixing is an important issue to analyze remotely sensed hyperspectral data. Although the linear mixture model has obvious practical advantages, there are many situations in which it may not be appropriate and could be advantageously replaced by a nonlinear one. In this paper, we ..."
Abstract

Cited by 22 (7 self)
 Add to MetaCart
(Show Context)
Abstract—Spectral unmixing is an important problem in the analysis of remotely sensed hyperspectral data. Although the linear mixture model has obvious practical advantages, there are many situations in which it may not be appropriate and could be advantageously replaced by a nonlinear one. In this paper, we formulate a new kernel-based paradigm that relies on the assumption that the mixing mechanism can be described by a linear mixture of endmember spectra, with additive nonlinear fluctuations defined in a reproducing kernel Hilbert space. This family of models has a clear interpretation and allows complex interactions between endmembers to be taken into account. Extensive experimental results, with both synthetic and real images, illustrate the generality and effectiveness of this scheme compared with state-of-the-art methods. Index Terms—Hyperspectral imaging, multi-kernel learning, nonlinear spectral unmixing, support vector regression.
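A minimal sketch of the linear-mixture/nonlinear-fluctuation decomposition: each pixel is a linear mixture plus an additive nonlinear term. Here the fluctuation is a simple bilinear interaction between two endmembers, one illustrative member of the model family rather than the paper's RKHS formulation; all dimensions and coefficients are assumed:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical dimensions: L spectral bands, R endmembers.
L, R = 40, 3
M = rng.uniform(0.0, 1.0, size=(L, R))   # endmember spectra as columns
a = rng.dirichlet(np.ones(R))            # abundance vector

lin = M @ a                              # linear mixture part
# Additive nonlinear fluctuation; a bilinear interaction between the first
# two endmembers stands in for the RKHS fluctuation of the paper.
fluct = 0.2 * M[:, 0] * M[:, 1]
y = lin + fluct + 0.005 * rng.standard_normal(L)
```

In the paper the fluctuation term is instead estimated nonparametrically, via support vector regression in a reproducing kernel Hilbert space.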
Nonlinearity Detection in Hyperspectral Images Using a Polynomial Post-Nonlinear Mixing Model
, 2013
"... OATAO is an open access repository that collects the work of Toulouse researchers and makes it freely available over the web where possible. ..."
Abstract

Cited by 14 (5 self)
 Add to MetaCart
OATAO is an open access repository that collects the work of Toulouse researchers and makes it freely available over the web where possible.
What Is the Relation Between Slow Feature Analysis and Independent Component Analysis?
, 2006
"... We present an analytical comparison between linear slow feature analysis and secondorder independent component analysis, and show that in the case of one time delay, the two approaches are equivalent. We also consider the case of several time delays and discuss two possible extensions of slow featu ..."
Abstract

Cited by 13 (2 self)
 Add to MetaCart
We present an analytical comparison between linear slow feature analysis and second-order independent component analysis, and show that in the case of one time delay the two approaches are equivalent. We also consider the case of several time delays and discuss two possible extensions of slow feature analysis.
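The one-delay equivalence stated above can be checked numerically: below, linear SFA (a generalised eigenproblem on the derivative covariance) and AMUSE-style second-order ICA (eigendecomposition of the symmetrised lag-1 covariance after whitening) are run on the same toy mixture. The signals, mixing matrix, and sample count are illustrative assumptions:

```python
import numpy as np

# Two sources with distinct temporal structure, mixed linearly.
t = np.arange(5000)
s = np.vstack([np.sin(0.02 * t), np.sign(np.sin(0.05 * t + 1.0))])
A = np.array([[1.0, 0.6], [0.4, 1.0]])
x = A @ s
x = x - x.mean(axis=1, keepdims=True)

# Linear SFA: minimise Var(w^T dx) subject to Var(w^T x) = 1, i.e. the
# generalised eigenproblem  C_dot w = lambda C w, solved by whitening.
C = (x @ x.T) / x.shape[1]
dx = np.diff(x, axis=1)                  # unit-delay "derivative"
C_dot = (dx @ dx.T) / dx.shape[1]
d, E = np.linalg.eigh(C)
W = E / np.sqrt(d)                       # whitening: W^T C W = I
lam, V = np.linalg.eigh(W.T @ C_dot @ W)
W_sfa = W @ V                            # SFA directions (slowest first)

# Second-order ICA with one time delay (AMUSE): same whitening, then an
# eigendecomposition of the symmetrised lag-1 covariance.
C1 = (x[:, 1:] @ x[:, :-1].T) / (x.shape[1] - 1)
C1s = 0.5 * (C1 + C1.T)
mu, U = np.linalg.eigh(W.T @ C1s @ W)
W_ica = W @ U

# With a unit delay, C_dot = 2*C - 2*C1s up to boundary terms, so the two
# eigendecompositions share eigenvectors: the two methods coincide.
```

Up to sign and ordering, the columns of `W_sfa` and `W_ica` span the same unmixing directions, which is the equivalence the abstract refers to.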
Independent slow feature analysis and nonlinear blind source separation
 in Proc. Int. Workshop on Independent Component Analysis and Blind Source Separation (ICA), 2004
"... The final version of this article has been published in ..."
Abstract

Cited by 11 (6 self)
 Add to MetaCart
(Show Context)
The final version of this article has been published in
Using kernel PCA for initialisation of variational Bayesian nonlinear blind source separation method
 Proc. of the Fifth Int. Conf. on Independent Component Analysis and Blind Signal Separation (ICA 2004), volume 3195 of Lecture Notes in Computer Science
, 2004
"... Abstract. The variational Bayesian nonlinear blind source separation method introduced by Lappalainen and Honkela in 2000 is initialised with linear principal component analysis (PCA). Because of the multilayer perceptron (MLP) network used to model the nonlinearity, the method is susceptible to loc ..."
Abstract

Cited by 7 (2 self)
 Add to MetaCart
(Show Context)
Abstract. The variational Bayesian nonlinear blind source separation method introduced by Lappalainen and Honkela in 2000 is initialised with linear principal component analysis (PCA). Because of the multilayer perceptron (MLP) network used to model the nonlinearity, the method is susceptible to local minima and therefore sensitive to the initialisation used. As the method is used for nonlinear separation, the linear initialisation may in some cases lead it astray. In this paper we study the use of kernel PCA (KPCA) in the initialisation. KPCA is a rather straightforward generalisation of linear PCA and it is much faster to compute than the variational Bayesian method. The experiments show that it can produce significantly better initialisations than linear PCA. Additionally, the model comparison methods provided by the variational Bayesian framework can be easily applied to compare different kernels.
An Information Theoretic Approach to a Novel Nonlinear Independent Component Analysis Paradigm
 In press, Elsevier Signal Processing, Special Issue on Information Theoretic
, 2005
"... component analysis paradigm ..."
(Show Context)
Postnonlinear independent component analysis by variational Bayesian learning
 In Proc. 5th Int. Conf. on Independent Component Analysis and Blind Signal Separation (ICA 2004
, 2004
"... Abstract. Postnonlinear (PNL) independent component analysis (ICA) is a generalisation of ICA where the observations are assumed to have been generated from independent sources by linear mixing followed by componentwise scalar nonlinearities. Most previous PNL ICA algorithms require the postnonli ..."
Abstract

Cited by 5 (2 self)
 Add to MetaCart
(Show Context)
Abstract. Postnonlinear (PNL) independent component analysis (ICA) is a generalisation of ICA where the observations are assumed to have been generated from independent sources by linear mixing followed by componentwise scalar nonlinearities. Most previous PNL ICA algorithms require the postnonlinearities to be invertible functions. In this paper, we present a variational Bayesian approach to PNL ICA that also works for non-invertible postnonlinearities. The method is based on a generative model with multilayer perceptron (MLP) networks to model the postnonlinearities. Preliminary results with a difficult artificial example are encouraging.
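The generative model in the abstract, linear mixing followed by componentwise scalar nonlinearities, can be sketched as follows. The Laplacian sources, the mixing matrix, and the choice of tanh and squaring nonlinearities (the latter non-invertible, the case the variational Bayesian method targets) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)

n, T = 2, 1000
S = rng.laplace(size=(n, T))                 # independent non-Gaussian sources
A = np.array([[1.0, 0.5], [0.3, 1.0]])       # mixing matrix (assumed, for illustration)

Z = A @ S                                    # linear mixing stage
# Componentwise post-nonlinearities: tanh is invertible on its range,
# while squaring discards the sign and is therefore non-invertible.
X = np.vstack([np.tanh(Z[0]), Z[1] ** 2])
```

Methods that invert the post-nonlinearities channel by channel cannot handle the squared channel, whereas a generative variational Bayesian model can still fit it.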