Results 1-10 of 74
Joint Bayesian Endmember Extraction and Linear Unmixing for Hyperspectral Imagery
Cited by 75 (37 self)
Abstract—This paper studies a fully Bayesian algorithm for endmember extraction and abundance estimation for hyperspectral imagery. Each pixel of the hyperspectral image is decomposed as a linear combination of pure endmember spectra following the linear mixing model. The estimation of the unknown endmember spectra is conducted in a unified manner by generating the posterior distribution of abundances and endmember parameters under a hierarchical Bayesian model. This model assumes conjugate prior distributions for these parameters, accounts for nonnegativity and full-additivity constraints, and exploits the fact that the endmember proportions lie on a lower-dimensional simplex. A Gibbs sampler is proposed to overcome the complexity of evaluating the resulting posterior distribution. This sampler generates samples distributed according to the posterior distribution and estimates the unknown parameters using these generated samples. The accuracy of the joint Bayesian estimator is illustrated by simulations conducted on synthetic and real AVIRIS images. Index Terms—Bayesian inference, endmember extraction, hyperspectral imagery, linear spectral unmixing, MCMC methods.
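The linear mixing model with the simplex constraint on abundances, as described in this abstract, can be sketched in a few lines. The dimensions, the flat Dirichlet prior, and the noise level below are illustrative assumptions, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: L spectral bands, R endmembers, N pixels.
L, R, N = 50, 3, 100

# Endmember matrix M (L x R): columns are pure spectra (random placeholders).
M = rng.uniform(0.1, 1.0, size=(L, R))

# Abundances drawn from a flat Dirichlet prior, so each column is nonnegative
# and sums to one -- the simplex constraint the paper's model exploits.
A = rng.dirichlet(np.ones(R), size=N).T          # R x N

# Linear mixing model: Y = M A + noise.
noise = rng.normal(0.0, 0.01, size=(L, N))
Y = M @ A + noise

# Full-additivity check on the simulated abundances.
col_sums = A.sum(axis=0)
```

Sampling abundances from a Dirichlet distribution is one simple way to respect the nonnegativity and full-additivity constraints; the paper instead infers abundances and endmembers jointly with a Gibbs sampler.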
Structural break estimation for nonstationary time series models
J. Amer. Statist. Assoc., 2005
Cited by 54 (1 self)
In this work we consider the problem of modeling a class of nonstationary time series signals using piecewise autoregressive (AR) processes. The number and locations of the piecewise autoregressive segments, as well as the orders of the respective AR processes, are assumed to be unknown. The minimum description length principle is applied to find the “best” combination of the number of segments, the lengths of the segments, and the orders of the piecewise AR processes. A genetic algorithm is implemented to solve this difficult optimization problem. We term the resulting procedure AutoPARM. Numerical results from both simulation experiments and real data analysis show that AutoPARM enjoys excellent empirical properties. Consistency of AutoPARM for break point estimation can also be shown. KEY WORDS: Nonstationarity, change points, minimum description length principle, genetic algorithm.
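A toy version of the MDL criterion for piecewise AR models can be written down directly. The code-length terms below are a simplified sketch of a two-part code, not AutoPARM's exact derivation, and no genetic search is performed; two candidate segmentations are simply compared:

```python
import numpy as np

def ar_rss(x, p):
    """Residual sum of squares of a least-squares AR(p) fit to segment x."""
    n = len(x)
    if n <= p + 1:
        return np.inf
    X = np.column_stack([x[p - k - 1:n - k - 1] for k in range(p)])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ coef
    return float(r @ r)

def mdl_cost(x, breaks, orders):
    """Simplified two-part MDL code length for a segmentation of x.

    `breaks` holds segment start indices (the first must be 0) and
    `orders` the AR order of each segment.
    """
    edges = list(breaks) + [len(x)]
    cost = np.log2(len(breaks))                         # number of segments
    for s, e, p in zip(edges[:-1], edges[1:], orders):
        n = e - s
        cost += np.log2(n) + (p + 2) / 2 * np.log2(n)   # length + parameters
        cost += n / 2 * np.log2(ar_rss(x[s:e], p) / n)  # Gaussian fit term
    return cost

def sim_ar1(phi, n, rng):
    """Simulate an AR(1) process with unit-variance Gaussian innovations."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

rng = np.random.default_rng(1)
x = np.concatenate([sim_ar1(0.9, 200, rng), sim_ar1(-0.9, 200, rng)])
cost_two = mdl_cost(x, [0, 200], [1, 1])   # correct break at t = 200
cost_one = mdl_cost(x, [0], [1])           # no break
```

On this simulated signal the two-segment description codes the data more cheaply than a single AR(1) model; automating this comparison over all candidate segmentations is exactly what AutoPARM's genetic search does.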
Semisupervised linear spectral unmixing using a hierarchical Bayesian model for hyperspectral imagery, IRIT/ENSEEIHT/TeSA, 2007
Cited by 49 (28 self)
Abstract—This paper proposes a hierarchical Bayesian model that can be used for semisupervised hyperspectral image unmixing. The model assumes that the pixel reflectances result from linear combinations of pure component spectra contaminated by an additive Gaussian noise. The abundance parameters appearing in this model satisfy positivity and additivity constraints. These constraints are naturally expressed in a Bayesian context by using appropriate abundance prior distributions. The posterior distributions of the unknown model parameters are then derived. A Gibbs sampler allows one to draw samples distributed according to the posteriors of interest and to estimate the unknown abundances. An extension of the algorithm is finally studied for mixtures with unknown numbers of spectral components belonging to a known library. The performance of the different unmixing strategies is evaluated via simulations conducted on synthetic and real data. Index Terms—Gibbs sampler, hierarchical Bayesian analysis, hyperspectral images, linear spectral unmixing, Markov chain Monte Carlo (MCMC) methods, reversible jumps.
Modeling changing dependency structure in multivariate time series
In International Conference on Machine Learning, 2007
Cited by 46 (0 self)
We show how to apply the efficient Bayesian changepoint detection techniques of Fearnhead in the multivariate setting. We model the joint density of vector-valued observations using undirected Gaussian graphical models, whose structure we estimate. We show how we can exactly compute the MAP segmentation, as well as how to draw perfect samples from the posterior over segmentations, simultaneously accounting for uncertainty about the number and location of changepoints, as well as uncertainty about the covariance structure. We illustrate the technique by applying it to financial data and to bee tracking data.
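The idea of computing an exact MAP segmentation can be illustrated with a much simpler scalar analogue. The dynamic program below uses a unit-variance Gaussian segment cost and a fixed per-changepoint penalty; the paper instead runs Fearnhead's recursions over full posteriors and Gaussian graphical models:

```python
import numpy as np

def seg_cost(x):
    """Gaussian negative log-likelihood of a segment, up to a constant,
    with known unit variance and the segment mean plugged in."""
    return 0.5 * float(((x - x.mean()) ** 2).sum())

def map_segmentation(x, penalty, min_len=2):
    """Optimal-partition dynamic program: minimise the total segment cost
    plus a fixed penalty per changepoint. Returns changepoint locations."""
    n = len(x)
    best = np.full(n + 1, np.inf)
    best[0] = 0.0
    argbest = np.zeros(n + 1, dtype=int)
    for t in range(min_len, n + 1):
        for s in range(0, t - min_len + 1):
            c = best[s] + seg_cost(x[s:t]) + (penalty if s > 0 else 0.0)
            if c < best[t]:
                best[t], argbest[t] = c, s
    # Backtrack the optimal segment starts.
    cps, t = [], n
    while t > 0:
        s = int(argbest[t])
        if s > 0:
            cps.append(s)
        t = s
    return sorted(cps)

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0.0, 1.0, 50), rng.normal(5.0, 1.0, 50)])
cps = map_segmentation(x, penalty=10.0)
```

Because the recursion considers every possible last segment, the returned changepoints are exactly optimal for this cost, with none of the convergence diagnostics an MCMC approach would need.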
Joint segmentation of piecewise constant autoregressive processes by using a hierarchical model and a Bayesian sampling approach
IEEE Transactions on Signal Processing, 2007
Cited by 36 (21 self)
We propose a joint segmentation algorithm for piecewise constant AR processes recorded by several independent sensors. The algorithm is based on a hierarchical Bayesian model. Appropriate priors allow the introduction of correlations between the change locations of the observed signals. Numerical problems inherent to Bayesian inference are solved by a Gibbs sampling strategy. The proposed joint segmentation methodology yields promising results compared with a signal-by-signal segmentation.
Supervised nonlinear spectral unmixing using a post-nonlinear mixing model for hyperspectral images, Univ, 2011
Cited by 35 (20 self)
Abstract—This paper presents a nonlinear mixing model for hyperspectral image unmixing. The proposed model assumes that the pixel reflectances are nonlinear functions of pure spectral components contaminated by an additive white Gaussian noise. These nonlinear functions are approximated using polynomial functions, leading to a polynomial post-nonlinear mixing model. A Bayesian algorithm and optimization methods are proposed to estimate the parameters involved in the model. The performance of the unmixing strategies is evaluated by simulations conducted on synthetic and real data. Index Terms—Hyperspectral imagery, post-nonlinear model, spectral unmixing (SU).
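A generative sketch of a polynomial post-nonlinear mixture helps make the model concrete. A band-wise second-order polynomial is applied to the linear mixture before noise is added; all dimensions and coefficients below are arbitrary illustrative choices, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(3)
L, R = 30, 3                                  # illustrative bands, endmembers

M = rng.uniform(0.1, 0.9, size=(L, R))        # endmember spectra (columns)
a = rng.dirichlet(np.ones(R))                 # abundances: positive, sum to one
b = 0.4                                       # hypothetical nonlinearity coefficient

# Post-nonlinear mixing: a second-order polynomial g(s) = s + b*s^2 is
# applied band-wise to the linear mixture, then Gaussian noise is added.
lin = M @ a
y = lin + b * lin**2 + rng.normal(0.0, 0.005, size=L)
```

Setting b to zero recovers the ordinary linear mixing model, which is why this class of models is a natural nonlinear extension for unmixing.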
Exact Bayesian curve fitting and signal segmentation
IEEE Trans. Signal Process., 2005
Cited by 30 (7 self)
Abstract—We consider regression models where the underlying functional relationship between the response and the explanatory variable is modeled as independent linear regressions on disjoint segments. We present an algorithm for perfect simulation from the posterior distribution of such a model, even allowing for an unknown number of segments and an unknown model order for the linear regressions within each segment. The algorithm is simple, can scale well to large data sets, and avoids the problem of diagnosing convergence that is present with Markov chain Monte Carlo (MCMC) approaches to this problem. We demonstrate our algorithm on standard denoising problems, on a piecewise constant AR model, and on a speech segmentation problem. Index Terms—Changepoints, denoising, forward-backward algorithm, linear regression, model uncertainty, perfect simulation.
Hierarchical Bayesian Sparse Image Reconstruction With Application to MRFM
Cited by 29 (11 self)
Abstract—This paper presents a hierarchical Bayesian model to reconstruct sparse images when the observations are obtained from linear transformations and corrupted by an additive white Gaussian noise. Our hierarchical Bayes model is well suited to such naturally sparse image applications, as it seamlessly accounts for properties such as sparsity and positivity of the image via appropriate Bayes priors. We propose a prior that is based on a weighted mixture of a positive exponential distribution and a mass at zero. The prior has hyperparameters that are tuned automatically by marginalization over the hierarchical Bayesian model. To overcome the complexity of the posterior distribution, a Gibbs sampling strategy is proposed. The Gibbs samples can be used to estimate the image to be recovered, e.g., by maximizing the estimated posterior distribution. In our fully Bayesian approach, the posteriors of all the parameters are available. Thus, our algorithm provides more information than other previously proposed sparse reconstruction methods that only give a point estimate. The performance of the proposed hierarchical Bayesian sparse reconstruction method is illustrated on synthetic data and real data collected from a tobacco virus sample using a prototype MRFM instrument. Index Terms—Bayesian inference, deconvolution, Markov chain Monte Carlo (MCMC) methods, magnetic resonance force microscopy.
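The sparsity prior described above, a weighted mixture of a mass at zero and a positive exponential distribution, is easy to sample from. The hyperparameter values below are arbitrary placeholders, not the ones the paper tunes by marginalization:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical hyperparameters: w is the prior probability that a pixel is
# nonzero, lam the rate of the positive exponential on nonzero intensities.
w, lam, n = 0.1, 2.0, 10_000

# Mixture of a mass at zero and a positive exponential: each pixel is active
# with probability w, in which case its intensity is exponential(rate=lam).
active = rng.random(n) < w
x = np.where(active, rng.exponential(1.0 / lam, size=n), 0.0)

sparsity = float((x == 0).mean())   # fraction of exactly-zero pixels
```

By construction every draw is nonnegative and a fraction of roughly 1 - w of the pixels is exactly zero, which is how the prior encodes both positivity and sparsity.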
Bayesian estimation of linear mixtures using the normal compositional model
IEEE Trans. Image Processing, 2010
Cited by 26 (17 self)
Abstract—This paper studies a new Bayesian unmixing algorithm for hyperspectral images. Each pixel of the image is modeled as a linear combination of so-called endmembers. These endmembers are supposed to be random in order to model uncertainty about their knowledge. More precisely, we model endmembers as Gaussian vectors whose means have been determined using an endmember extraction algorithm such as the well-known N-finder (N-FINDR) or Vertex Component Analysis (VCA) algorithms. This paper proposes to estimate the mixture coefficients (referred to as abundances) using a Bayesian algorithm. Suitable priors are assigned to the abundances in order to satisfy positivity and additivity constraints, whereas conjugate priors are chosen for the remaining parameters. A hybrid Gibbs sampler is then constructed to generate abundance and variance samples distributed according to the joint posterior of the abundances and noise variances. The performance of the proposed methodology is evaluated by comparison with other unmixing algorithms on synthetic and real images. Index Terms—Bayesian inference, hyperspectral images, Monte Carlo methods, normal compositional model, spectral unmixing.
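Under the normal compositional model, a pixel can be simulated by perturbing endmember means and mixing with abundances on the simplex. The means, the common variance, and the Dirichlet draw below are illustrative assumptions; in practice the means would come from an extraction algorithm such as N-FINDR or VCA:

```python
import numpy as np

rng = np.random.default_rng(5)
L, R = 30, 3                                # illustrative bands, endmembers

# Endmember means (placeholders for what an extraction algorithm would return).
mu = rng.uniform(0.1, 0.9, size=(L, R))
s2 = 1e-3                                   # assumed common endmember variance

# Normal compositional model: each endmember is a Gaussian vector around its
# mean, and the pixel is the abundance-weighted sum of those random vectors.
a = rng.dirichlet(np.ones(R))               # positivity + additivity
E = mu + rng.normal(0.0, np.sqrt(s2), size=(L, R))
y = E @ a
```

Treating the endmembers themselves as random, rather than only the noise, is what lets this model express uncertainty about the extracted spectra.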