Results 1–10 of 59
Joint Bayesian Endmember Extraction and Linear Unmixing for Hyperspectral Imagery
"... Abstract—This paper studies a fully Bayesian algorithm for endmember extraction and abundance estimation for hyperspectral imagery. Each pixel of the hyperspectral image is decomposed as a linear combination of pure endmember spectra following the linear mixing model. The estimation of the unknown e ..."
Abstract

Cited by 39 (27 self)
 Add to MetaCart
Abstract—This paper studies a fully Bayesian algorithm for endmember extraction and abundance estimation for hyperspectral imagery. Each pixel of the hyperspectral image is decomposed as a linear combination of pure endmember spectra following the linear mixing model. The estimation of the unknown endmember spectra is conducted in a unified manner by generating the posterior distribution of abundances and endmember parameters under a hierarchical Bayesian model. This model assumes conjugate prior distributions for these parameters, accounts for nonnegativity and full-additivity constraints, and exploits the fact that the endmember proportions lie on a lower-dimensional simplex. A Gibbs sampler is proposed to overcome the complexity of evaluating the resulting posterior distribution. This sampler generates samples distributed according to the posterior distribution and estimates the unknown parameters using these generated samples. The accuracy of the joint Bayesian estimator is illustrated by simulations conducted on synthetic and real AVIRIS images. Index Terms—Bayesian inference, endmember extraction, hyperspectral imagery, linear spectral unmixing, MCMC methods.
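As a minimal sketch of the linear mixing model this abstract builds on, the code below generates one synthetic pixel as a noisy convex combination of endmember spectra. The endmember values, band count, noise level, and the `sample_abundances` helper are illustrative assumptions for the example, not taken from the paper.

```python
import random

def sample_abundances(R, rng):
    """Draw a uniform point on the (R-1)-simplex via normalized exponentials."""
    e = [rng.expovariate(1.0) for _ in range(R)]
    s = sum(e)
    return [x / s for x in e]

def mix_pixel(endmembers, abundances, sigma, rng):
    """Linear mixing model: y = M a + n, with n ~ N(0, sigma^2 I) per band."""
    L = len(endmembers[0])                    # number of spectral bands
    return [sum(a * m[l] for a, m in zip(abundances, endmembers))
            + rng.gauss(0.0, sigma) for l in range(L)]

rng = random.Random(0)
M = [[0.1, 0.8, 0.3], [0.6, 0.2, 0.5]]        # two toy endmembers, three bands
a = sample_abundances(2, rng)
y = mix_pixel(M, a, sigma=0.01, rng=rng)
# the abundances satisfy the nonnegativity and full-additivity constraints
assert abs(sum(a) - 1.0) < 1e-12 and all(x >= 0 for x in a)
```

The simplex constraint checked at the end is exactly what the paper's priors enforce; the Gibbs sampler itself is beyond this sketch.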
Modeling changing dependency structure in multivariate time series
 In International Conference on Machine Learning
, 2007
"... We show how to apply the efficient Bayesian changepoint detection techniques of Fearnhead in the multivariate setting. We model the joint density of vectorvalued observations using undirected Gaussian graphical models, whose structure we estimate. We show how we can exactly compute the MAP segmenta ..."
Abstract

Cited by 32 (0 self)
 Add to MetaCart
We show how to apply the efficient Bayesian changepoint detection techniques of Fearnhead in the multivariate setting. We model the joint density of vector-valued observations using undirected Gaussian graphical models, whose structure we estimate. We show how we can exactly compute the MAP segmentation, as well as how to draw perfect samples from the posterior over segmentations, simultaneously accounting for uncertainty about the number and location of changepoints, as well as uncertainty about the covariance structure. We illustrate the technique by applying it to financial data and to bee tracking data.
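The exact MAP computation mentioned here rests on dynamic programming over the position of the last changepoint. A much-simplified scalar analogue (piecewise constant mean with a fixed per-segment penalty, standing in for the paper's Gaussian graphical models and Fearnhead recursions) can be sketched as:

```python
def segment(xs, penalty):
    """Optimal segmentation of xs under SSE cost + a per-segment penalty."""
    n = len(xs)
    s = [0.0]; s2 = [0.0]                     # prefix sums for O(1) segment SSE
    for x in xs:
        s.append(s[-1] + x); s2.append(s2[-1] + x * x)
    def sse(i, j):                            # cost of fitting xs[i:j] by its mean
        m = j - i
        mean = (s[j] - s[i]) / m
        return (s2[j] - s2[i]) - m * mean * mean
    best = [0.0] * (n + 1)                    # best[j]: optimal cost of xs[:j]
    back = [0] * (n + 1)                      # back[j]: start of last segment
    for j in range(1, n + 1):
        best[j], back[j] = min((best[i] + sse(i, j) + penalty, i)
                               for i in range(j))
    cuts, j = [], n                           # backtrack the segment starts
    while j > 0:
        cuts.append(back[j]); j = back[j]
    return sorted(cuts)

data = [0.0, 0.1, -0.1, 5.0, 5.2, 4.9]        # one obvious change at index 3
print(segment(data, penalty=1.0))             # -> [0, 3]
```

The same recursion underlies exact posterior sampling over segmentations; here it only returns the single best split.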
Semi-supervised linear spectral unmixing using a hierarchical Bayesian model for hyperspectral imagery. IRIT/ENSEEIHT/TeSA
, 2007
"... Abstract—This paper proposes a hierarchical Bayesian model that can be used for semisupervised hyperspectral image unmixing. The model assumes that the pixel reflectances result from linear combinations of pure component spectra contaminated by an additive Gaussian noise. The abundance parameters a ..."
Abstract

Cited by 31 (21 self)
 Add to MetaCart
Abstract—This paper proposes a hierarchical Bayesian model that can be used for semi-supervised hyperspectral image unmixing. The model assumes that the pixel reflectances result from linear combinations of pure component spectra contaminated by an additive Gaussian noise. The abundance parameters appearing in this model satisfy positivity and additivity constraints. These constraints are naturally expressed in a Bayesian context by using appropriate abundance prior distributions. The posterior distributions of the unknown model parameters are then derived. A Gibbs sampler allows one to draw samples distributed according to the posteriors of interest and to estimate the unknown abundances. An extension of the algorithm is finally studied for mixtures with unknown numbers of spectral components belonging to a known library. The performance of the different unmixing strategies is evaluated via simulations conducted on synthetic and real data. Index Terms—Gibbs sampler, hierarchical Bayesian analysis, hyperspectral images, linear spectral unmixing, Markov chain Monte Carlo (MCMC) methods, reversible jumps.
Structural break estimation for nonstationary time series models
 J. Amer. Statist. Assoc
, 2005
"... In this work we consider the problem of modeling a class of nonstationary time series signals using piecewise autoregressive (AR) processes. The number and locations of the piecewise autoregressive segments, as well as the orders of the respective AR processes, are assumed to be unknown. The minimum ..."
Abstract

Cited by 30 (0 self)
 Add to MetaCart
In this work we consider the problem of modeling a class of nonstationary time series signals using piecewise autoregressive (AR) processes. The number and locations of the piecewise autoregressive segments, as well as the orders of the respective AR processes, are assumed to be unknown. The minimum description length principle is applied to find the “best” combination of the number of the segments, the lengths of the segments, and the orders of the piecewise AR processes. A genetic algorithm is implemented to solve this difficult optimization problem. We term the resulting procedure AutoPARM. Numerical results from both simulation experiments and real data analysis show that AutoPARM enjoys excellent empirical properties. Consistency of AutoPARM for break point estimation can also be shown. KEY WORDS: Nonstationarity, change points, minimum description length principle, genetic algorithm
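The minimum description length criterion at the heart of AutoPARM can be illustrated with a deliberately simplified toy: scoring candidate segmentations of a piecewise-constant signal by a per-segment Gaussian fit cost plus log-n terms for the breaks and parameters. The code-length terms below are illustrative stand-ins; the real procedure fits AR models per segment and searches with a genetic algorithm.

```python
import math

def mdl(xs, breaks):
    """Toy MDL score: fit cost per segment + encoding cost of the breaks."""
    n = len(xs)
    bounds = [0] + breaks + [n]
    total = math.log(n) * len(breaks)         # cost of encoding breakpoints
    for a, b in zip(bounds, bounds[1:]):
        seg = xs[a:b]; m = len(seg)
        mean = sum(seg) / m
        rss = sum((x - mean) ** 2 for x in seg)
        # 1e-12 guards against a zero residual in a perfectly flat segment
        total += 0.5 * m * math.log(rss / m + 1e-12) + 0.5 * math.log(m)
    return total

xs = [0.0, 0.2, -0.1, 4.0, 4.1, 3.9, 4.2, 4.0]   # level shift at index 3
cands = [[], [2], [3], [5]]                       # candidate break sets
best = min(cands, key=lambda b: mdl(xs, b))
print(best)                                       # -> [3]
```

Enumerating candidates only works for tiny problems; the genetic search in the paper exists precisely because the joint space of break counts, locations, and AR orders is too large to enumerate.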
Joint segmentation of piecewise constant autoregressive processes by using a hierarchical model and a Bayesian sampling approach
 IEEE Transactions on Signal Processing
, 2007
"... We propose a joint segmentation algorithm for piecewise constant AR processes recorded by several independent sensors. The algorithm is based on a hierarchical Bayesian model. Appropriate priors allow to introduce correlations between the change locations of the observed signals. Numerical problems ..."
Abstract

Cited by 28 (16 self)
 Add to MetaCart
We propose a joint segmentation algorithm for piecewise constant AR processes recorded by several independent sensors. The algorithm is based on a hierarchical Bayesian model. Appropriate priors allow one to introduce correlations between the change locations of the observed signals. Numerical problems inherent to Bayesian inference are solved by a Gibbs sampling strategy. The proposed joint segmentation methodology provides interesting results compared to a signal-by-signal segmentation.
Exact Bayesian curve fitting and signal segmentation
 IEEE Trans. Signal Process
, 2005
"... Abstract—We consider regression models where the underlying functional relationship between the response and the explanatory variable is modeled as independent linear regressions on disjoint segments. We present an algorithm for perfect simulation from the posterior distribution of such a model, eve ..."
Abstract

Cited by 24 (5 self)
 Add to MetaCart
Abstract—We consider regression models where the underlying functional relationship between the response and the explanatory variable is modeled as independent linear regressions on disjoint segments. We present an algorithm for perfect simulation from the posterior distribution of such a model, even allowing for an unknown number of segments and an unknown model order for the linear regressions within each segment. The algorithm is simple, can scale well to large data sets, and avoids the problem of diagnosing convergence that is present with Markov chain Monte Carlo (MCMC) approaches to this problem. We demonstrate our algorithm on standard denoising problems, on a piecewise constant AR model, and on a speech segmentation problem. Index Terms—Changepoints, denoising, forward-backward algorithm, linear regression, model uncertainty, perfect simulation.
Supervised nonlinear spectral unmixing using a post-nonlinear mixing model for hyperspectral images. Univ
, 2011
"... Abstract—This paper presents a nonlinear mixing model for hyperspectral image unmixing. The proposed model assumes that the pixel reflectances are nonlinear functions of pure spectral components contaminated by an additive white Gaussian noise. These nonlinear functions are approximated using polyno ..."
Abstract

Cited by 17 (13 self)
 Add to MetaCart
Abstract—This paper presents a nonlinear mixing model for hyperspectral image unmixing. The proposed model assumes that the pixel reflectances are nonlinear functions of pure spectral components contaminated by an additive white Gaussian noise. These nonlinear functions are approximated using polynomial functions leading to a polynomial post-nonlinear mixing model. A Bayesian algorithm and optimization methods are proposed to estimate the parameters involved in the model. The performance of the unmixing strategies is evaluated by simulations conducted on synthetic and real data. Index Terms—Hyperspectral imagery, post-nonlinear model, spectral unmixing (SU).
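A sketch of the post-nonlinear idea this abstract describes: the linear mix M a is passed through a polynomial nonlinearity before noise is added. The quadratic form g(x) = x + b·x², the coefficient b, and all numbers below are illustrative assumptions, not the paper's exact parameterization.

```python
import random

def ppnm_pixel(endmembers, abundances, b, sigma, rng):
    """Post-nonlinear mixing: apply g(x) = x + b*x^2 to each band of M a."""
    L = len(endmembers[0])                    # number of spectral bands
    lin = [sum(a * m[l] for a, m in zip(abundances, endmembers))
           for l in range(L)]                 # plain linear mix M a
    return [x + b * x * x + rng.gauss(0.0, sigma) for x in lin]

rng = random.Random(0)
M = [[0.2, 0.6], [0.7, 0.3]]                  # two toy endmembers, two bands
a = [0.4, 0.6]
y = ppnm_pixel(M, a, b=0.5, sigma=0.0, rng=rng)
# with sigma = 0, each band equals g(linear mix) exactly
lin = [0.4 * 0.2 + 0.6 * 0.7, 0.4 * 0.6 + 0.6 * 0.3]
assert all(abs(yi - (x + 0.5 * x * x)) < 1e-12 for yi, x in zip(y, lin))
```

Setting b = 0 recovers the linear mixing model, which is why the polynomial model is a strict generalization of it.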
Joint segmentation of multivariate astronomical time series: Bayesian sampling with a hierarchical model
 IEEE Trans. Signal Process
"... Abstract—Astronomy and other sciences often face the problem of detecting and characterizing structure in two or more related time series. This paper approaches such problems using Bayesian priors to represent relationships between signals with various degrees of certainty, and not just rigid constr ..."
Abstract

Cited by 16 (9 self)
 Add to MetaCart
Abstract—Astronomy and other sciences often face the problem of detecting and characterizing structure in two or more related time series. This paper approaches such problems using Bayesian priors to represent relationships between signals with various degrees of certainty, and not just rigid constraints. The segmentation is conducted by using a hierarchical Bayesian approach to a piecewise constant Poisson rate model. A Gibbs sampling strategy allows joint estimation of the unknown parameters and hyperparameters. Results obtained with synthetic and real photon counting data illustrate the performance of the proposed algorithm. Index Terms—Gibbs sampling, hierarchical Bayesian analysis, Markov chain Monte Carlo, photon counting data, segmentation.
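The piecewise constant Poisson rate model used here can be simulated directly: counts in each bin are Poisson with a rate that jumps at changepoints. The rates, bin counts, and moment estimates below are illustrative; the paper infers the segmentation itself with Gibbs sampling, which this sketch does not attempt.

```python
import math
import random

def poisson(rate, rng):
    """Knuth's method for Poisson draws, adequate for the small rates here."""
    L, k, p = math.exp(-rate), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(3)
rates = [2.0] * 50 + [8.0] * 50               # one changepoint at bin 50
counts = [poisson(r, rng) for r in rates]
# given a known segmentation, the segment-wise mean is the MLE of each rate
est1 = sum(counts[:50]) / 50
est2 = sum(counts[50:]) / 50
print(round(est1, 1), round(est2, 1))         # close to 2.0 and 8.0
```

In the paper the changepoint locations are unknown and shared information across the related series enters through the hierarchical priors.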
Hierarchical Bayesian Sparse Image Reconstruction With Application to MRFM
"... Abstract—This paper presents a hierarchical Bayesian model to reconstruct sparse images when the observations are obtained from linear transformations and corrupted by an additive white Gaussian noise. Our hierarchical Bayes model is well suited to such naturally sparse image applications as it seam ..."
Abstract

Cited by 15 (6 self)
 Add to MetaCart
Abstract—This paper presents a hierarchical Bayesian model to reconstruct sparse images when the observations are obtained from linear transformations and corrupted by an additive white Gaussian noise. Our hierarchical Bayes model is well suited to such naturally sparse image applications as it seamlessly accounts for properties such as sparsity and positivity of the image via appropriate Bayes priors. We propose a prior that is based on a weighted mixture of a positive exponential distribution and a mass at zero. The prior has hyperparameters that are tuned automatically by marginalization over the hierarchical Bayesian model. To overcome the complexity of the posterior distribution, a Gibbs sampling strategy is proposed. The Gibbs samples can be used to estimate the image to be recovered, e.g., by maximizing the estimated posterior distribution. In our fully Bayesian approach, the posteriors of all the parameters are available. Thus, our algorithm provides more information than other previously proposed sparse reconstruction methods that only give a point estimate. The performance of the proposed hierarchical Bayesian sparse reconstruction method is illustrated on synthetic data and real data collected from a tobacco virus sample using a prototype MRFM instrument. Index Terms—Bayesian inference, deconvolution, Markov chain Monte Carlo (MCMC) methods, magnetic resonance force microscopy
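The weighted "mass at zero plus positive exponential" prior this abstract describes can be sampled directly. The weight w and rate below are illustrative numbers; in the paper such hyperparameters are tuned automatically by marginalization over the hierarchical model, which this sketch does not attempt.

```python
import random

def sample_sparse_image(n_pixels, w, rate, rng):
    """Each pixel: exact zero with probability w, else Exp(rate) (positive)."""
    return [0.0 if rng.random() < w else rng.expovariate(rate)
            for _ in range(n_pixels)]

rng = random.Random(1)
img = sample_sparse_image(1000, w=0.9, rate=2.0, rng=rng)
frac = sum(1 for x in img if x == 0.0) / len(img)
assert all(x >= 0 for x in img)               # positivity is built into the prior
print(frac)                                   # close to the prior weight w = 0.9
```

The exact zeros are the point: unlike a purely continuous prior, this mixture puts genuine probability mass on empty pixels, which is what makes the reconstructions sparse.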