Results 1–10 of 53
The Quickhull algorithm for convex hulls
 ACM TRANSACTIONS ON MATHEMATICAL SOFTWARE
, 1996
Abstract

Cited by 456 (0 self)
The convex hull of a set of points is the smallest convex set that contains the points. This article presents a practical convex hull algorithm that combines the two-dimensional Quickhull Algorithm with the general-dimension Beneath-Beyond Algorithm. It is similar to the randomized, incremental algorithms for convex hull and Delaunay triangulation. We provide empirical evidence that the algorithm runs faster when the input contains non-extreme points and that it uses less memory. Computational geometry algorithms have traditionally assumed that input sets are well behaved. When an algorithm is implemented with floating-point arithmetic, this assumption can lead to serious errors. We briefly describe a solution to this problem when computing the convex hull in two, three, or four dimensions. The output is a set of “thick” facets that contain all possible exact convex hulls of the input. A variation is effective in five or more dimensions.
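The algorithm described here is implemented in the Qhull library, which SciPy's `spatial` module wraps; a minimal sketch showing that interior (non-extreme) points are discarded from the hull:

```python
import numpy as np
from scipy.spatial import ConvexHull  # backed by Qhull

# four corners of a square plus one interior point
pts = np.array([[0, 0], [2, 0], [2, 2], [0, 2], [1, 1]])
hull = ConvexHull(pts)

print(sorted(hull.vertices))  # indices of the extreme points only: [0, 1, 2, 3]
print(hull.volume)            # in 2-D, "volume" is the enclosed area: 4.0
```

The interior point `(1, 1)` never appears among `hull.vertices`, matching the paper's observation that inputs with many non-extreme points are handled cheaply.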
Vertex component analysis: A fast algorithm to unmix hyperspectral data
 IEEE Transactions on Geoscience and Remote Sensing
, 2005
Abstract

Cited by 84 (10 self)
Abstract—Given a set of mixed spectral (multispectral or hyperspectral) vectors, linear spectral mixture analysis, or linear unmixing, aims at estimating the number of reference substances, also called endmembers, their spectral signatures, and their abundance fractions. This paper presents a new method for unsupervised endmember extraction from hyperspectral data, termed vertex component analysis (VCA). The algorithm exploits two facts: 1) the endmembers are the vertices of a simplex and 2) the affine transformation of a simplex is also a simplex. In a series of experiments using simulated and real data, the VCA algorithm competes with state-of-the-art methods, with a computational complexity between one and two orders of magnitude lower than the best available method. Index Terms—Linear unmixing, simplex, spectral mixture model, unmixing hyperspectral data, unsupervised endmember extraction, vertex component analysis (VCA).
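A toy sketch of the projection idea VCA builds on (not the published algorithm; endmembers, abundances, and the probing direction below are all synthetic): when pure pixels exist, repeatedly projecting the data onto a direction orthogonal to the endmembers found so far and taking the pixel with the extreme projection recovers the simplex vertices.

```python
import numpy as np

rng = np.random.default_rng(1)
L, p, n = 5, 3, 200                        # bands, endmembers, pixels
M = np.eye(L)[:, :p]                       # toy endmember signatures (columns)
A = rng.dirichlet(np.ones(p), n).T         # abundances: >= 0, columns sum to one
A[:, :p] = np.eye(p)                       # plant one pure pixel per endmember
X = M @ A                                  # observed pixels (linear mixing model)

f = rng.standard_normal(L)                 # random probing direction
idx, E = [], np.zeros((L, 0))
for _ in range(p):
    if E.shape[1]:                         # component of f orthogonal to span(E)
        w = f - E @ np.linalg.pinv(E) @ f
    else:
        w = f
    idx.append(int(np.argmax(np.abs(w @ X))))
    E = X[:, idx]                          # extreme pixels found so far

print(sorted(idx))   # recovers the planted pure pixels: [0, 1, 2]
```

Each mixed pixel is a convex combination of the endmembers, so its projection can never exceed that of a pure pixel, which is why the argmax lands on the planted vertices.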
Sparse Elimination and Applications in Kinematics
, 1994
Abstract

Cited by 49 (11 self)
This thesis proposes efficient algorithmic solutions to problems in computational algebra and computational algebraic geometry. Moreover, it considers their application to different areas where algebraic systems describe kinematic and geometric constraints. Given an arbitrary system of nonlinear multivariate polynomial equations, its resultant serves in eliminating variables and reduces root finding to a linear eigenproblem. Our contribution is to describe the first efficient and general algorithms for computing the sparse resultant. The sparse resultant generalizes the classical homogeneous resultant and exploits the structure of the given polynomials. Its size depends only on the geometry of the input Newton polytopes. The first algorithm uses a subdivision of the Minkowski sum and produces matrix...
Does independent component analysis play a role in unmixing hyperspectral data?
 IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING
, 2005
Abstract

Cited by 37 (10 self)
Independent component analysis (ICA) has recently been proposed as a tool to unmix hyperspectral data. ICA is founded on two assumptions: 1) the observed spectrum vector is a linear mixture of the constituent spectra (endmember spectra) weighted by the corresponding abundance fractions (sources); 2) sources are statistically independent. Independent factor analysis (IFA) extends ICA to linear mixtures of independent sources immersed in noise. Concerning hyperspectral data, the first assumption is valid whenever the multiple scattering among the distinct constituent substances (endmembers) is negligible, and the surface is partitioned according to the fractional abundances. The second assumption, however, is violated, since the sum of abundance fractions associated with each pixel is constant due to physical constraints in the data acquisition process. Thus, sources cannot be statistically independent, compromising the performance of ICA/IFA algorithms in hyperspectral unmixing. This paper studies the impact of hyperspectral source statistical dependence on ICA and IFA performances. We conclude that the accuracy of these methods tends to improve with the increase of the signature variability, of the number of endmembers, and of the signal-to-noise ratio. In any case, there are always endmembers incorrectly unmixed. We arrive at this conclusion by minimizing the mutual information of simulated and real hyperspectral mixtures. The computation of mutual information is based on fitting mixtures of Gaussians to the observed data. A method to sort ICA and IFA estimates in terms of the likelihood of being correctly unmixed is proposed.
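The sum-to-one argument can be checked numerically; a sketch with synthetic abundances drawn from a symmetric Dirichlet (an assumption for illustration, not the paper's source model):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.dirichlet(np.ones(3), 100_000)   # per-pixel abundances, rows sum to one
assert np.allclose(A.sum(axis=1), 1.0)   # the physical additivity constraint

# independence would give zero correlation; the constraint forces it negative
c = float(np.corrcoef(A[:, 0], A[:, 1])[0, 1])
print(c < 0)   # True (theory: -1/(p-1) = -0.5 for p = 3, symmetric Dirichlet)
```

Fixing one fraction shrinks the range available to the others, which is exactly the dependence that degrades ICA/IFA here.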
Joint Bayesian Endmember Extraction and Linear Unmixing for Hyperspectral Imagery
Abstract

Cited by 37 (26 self)
Abstract—This paper studies a fully Bayesian algorithm for endmember extraction and abundance estimation for hyperspectral imagery. Each pixel of the hyperspectral image is decomposed as a linear combination of pure endmember spectra following the linear mixing model. The estimation of the unknown endmember spectra is conducted in a unified manner by generating the posterior distribution of abundances and endmember parameters under a hierarchical Bayesian model. This model assumes conjugate prior distributions for these parameters, accounts for non-negativity and full-additivity constraints, and exploits the fact that the endmember proportions lie on a lower-dimensional simplex. A Gibbs sampler is proposed to overcome the complexity of evaluating the resulting posterior distribution. This sampler generates samples distributed according to the posterior distribution and estimates the unknown parameters using these generated samples. The accuracy of the joint Bayesian estimator is illustrated by simulations conducted on synthetic and real AVIRIS images. Index Terms—Bayesian inference, endmember extraction, hyperspectral imagery, linear spectral unmixing, MCMC methods.
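The paper's hierarchical sampler is model-specific, but the Gibbs principle it relies on — alternately drawing each unknown from its conditional distribution given the rest — can be sketched on a toy bivariate Gaussian with correlation rho:

```python
import numpy as np

rng = np.random.default_rng(0)
rho, n = 0.8, 50_000
x = y = 0.0
xs, ys = np.empty(n), np.empty(n)
for i in range(n):
    x = rng.normal(rho * y, np.sqrt(1 - rho ** 2))   # draw x | y
    y = rng.normal(rho * x, np.sqrt(1 - rho ** 2))   # draw y | x
    xs[i], ys[i] = x, y

c = float(np.corrcoef(xs, ys)[0, 1])
print(round(c, 1))   # ≈ 0.8: the draws reproduce the target correlation
```

As in the paper, the chain's draws are then used directly as posterior samples for estimating the unknowns.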
Hyperspectral subspace identification
 IEEE Trans. Geosci. Remote Sens
, 2008
Abstract

Cited by 35 (16 self)
Abstract—Signal subspace identification is a crucial first step in many hyperspectral processing algorithms such as target detection, change detection, classification, and unmixing. The identification of this subspace enables a correct dimensionality reduction, yielding gains in algorithm performance and complexity and in data storage. This paper introduces a new minimum-mean-square-error-based approach to infer the signal subspace in hyperspectral imagery. The method, which is termed hyperspectral signal identification by minimum error, is eigendecomposition-based, unsupervised, and fully automatic (i.e., it does not depend on any tuning parameters). It first estimates the signal and noise correlation matrices and then selects the subset of eigenvalues that best represents the signal subspace in the least-squared-error sense. State-of-the-art performance of the proposed method is illustrated by using simulated and real hyperspectral images. Index Terms—Dimensionality reduction, hyperspectral imagery, hyperspectral signal subspace identification by minimum error (HySime), hyperspectral unmixing, linear mixture, minimum mean square error (MSE), subspace identification.
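Not the HySime criterion itself, but a simplified sketch of the underlying idea: compare the eigenvalues of the data correlation matrix against the noise floor (known here by construction, whereas HySime estimates it) and keep the subspace where signal dominates.

```python
import numpy as np

rng = np.random.default_rng(0)
L, k, n, sigma = 20, 4, 5000, 0.01
S = rng.standard_normal((L, k)) @ rng.standard_normal((k, n))  # rank-k signal
X = S + sigma * rng.standard_normal((L, n))                    # add white noise

eig = np.linalg.eigvalsh(X @ X.T / n)[::-1]   # correlation eigenvalues, descending
k_hat = int(np.sum(eig > 2 * sigma ** 2))     # count those above the noise floor
print(k_hat)   # 4: the true signal-subspace dimension
```

The remaining L − k eigenvalues hover at the noise variance, so thresholding separates them cleanly when the signal-to-noise ratio is high.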
Nonlinear unmixing of hyperspectral images using a generalized bilinear model
 IEEE Trans. Geosci. and Remote Sensing
Abstract

Cited by 26 (21 self)
Nonlinear models have recently shown interesting properties for spectral unmixing. This paper considers a generalized bilinear model recently introduced for unmixing hyperspectral images. Different algorithms are studied to estimate the parameters of this bilinear model. The positivity and sum-to-one constraints for the abundances are ensured by the proposed algorithms. The performance of the resulting unmixing strategy is evaluated via simulations conducted on synthetic and real data. Index Terms — hyperspectral imagery, spectral unmixing, bilinear model, Bayesian inference, MCMC methods, gradient descent algorithm, least-squares algorithm.
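The generalized bilinear model is usually written as the linear mixture plus pairwise Hadamard interaction terms weighted by coefficients in [0, 1]; a sketch of the forward model (the signatures, abundances, and gamma values below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
L, p = 8, 3
M = rng.random((L, p))                 # endmember signatures (hypothetical)
a = np.array([0.5, 0.3, 0.2])          # abundances: a >= 0, sum to one
gamma = {(0, 1): 0.4, (0, 2): 0.1, (1, 2): 0.7}   # interaction coefficients

x = M @ a                              # linear part
for (i, j), g in gamma.items():
    x += g * a[i] * a[j] * M[:, i] * M[:, j]      # bilinear interaction term

assert abs(a.sum() - 1.0) < 1e-12      # sum-to-one constraint holds
print(x.shape)   # (8,)
```

Setting every gamma to zero recovers the plain linear mixing model, which is why the estimation algorithms in the paper can enforce the same positivity and sum-to-one constraints on the abundances.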
Hyperspectral Image Processing for Automatic Target Detection Applications
, 2003
Abstract

Cited by 23 (0 self)
This article presents an overview of the theoretical and practical issues associated with the development, analysis, and application of detection algorithms to exploit hyperspectral imaging data. We focus on techniques that exploit spectral information exclusively to make decisions regarding the type of each pixel—target or non-target—on a pixel-by-pixel basis in an image. First we describe the fundamental structure of the hyperspectral data and explain how these data influence the signal models used for the development and theoretical analysis of detection algorithms. Next we discuss the approach used to derive detection algorithms, the performance metrics necessary for the evaluation of these algorithms, and a taxonomy that presents the various algorithms in a systematic manner. We derive the basic algorithms in each family, explain how they work, and provide results for their theoretical performance. We conclude with empirical results that use hyperspectral imaging data from the HYDICE and Hyperion sensors to illustrate the operation and performance of various detectors.
MINIMUM VOLUME SIMPLEX ANALYSIS: A FAST ALGORITHM TO UNMIX HYPERSPECTRAL DATA
Abstract

Cited by 23 (7 self)
This paper presents a new minimum-volume-class method for hyperspectral unmixing, termed minimum volume simplex analysis (MVSA). The underlying mixing model is linear; i.e., the mixed hyperspectral vectors are modeled by a linear mixture of the endmember signatures weighted by the corresponding abundance fractions. MVSA approaches hyperspectral unmixing by fitting a minimum volume simplex to the hyperspectral data, constraining the abundance fractions to belong to the probability simplex. The resulting optimization problem is solved by implementing a sequence of quadratically constrained subproblems. In a final step, the hard constraint on the abundance fractions is replaced with a hinge-type loss function to account for outliers and noise. We illustrate the state-of-the-art performance of the MVSA algorithm in unmixing simulated data sets. We are mainly concerned with the realistic scenario in which the pure pixel assumption (i.e., there exists at least one pure pixel per endmember) is not fulfilled. In these conditions, MVSA yields much better performance than pure-pixel-based algorithms. Index Terms — Hyperspectral unmixing, minimum volume simplex, source separation.
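The quantity minimum-volume methods like MVSA shrink is the volume of the simplex spanned by the candidate endmembers, which reduces to a determinant of vertex differences. A minimal sketch in 2-D (three endmembers):

```python
import math
import numpy as np

def simplex_volume(V):
    """Volume of the simplex whose vertices are the columns of V (d x (d+1))."""
    d = V.shape[0]
    return abs(np.linalg.det(V[:, 1:] - V[:, :1])) / math.factorial(d)

tight = np.array([[0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])   # unit triangle enclosing the data
loose = 2 * tight                     # a larger triangle also encloses it

print(simplex_volume(tight), simplex_volume(loose))   # 0.5 2.0
```

Both simplices can contain the same data cloud, but the minimum-volume criterion prefers the tight one, which is what pins the vertices to the endmembers when no pure pixels are present.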
ICE: A statistical approach to identifying endmembers in hyperspectral images
 IEEE Trans. Geosci. Remote Sensing
Abstract

Cited by 21 (0 self)
Abstract—Several of the more important endmember-finding algorithms for hyperspectral data are discussed and some of their shortcomings highlighted. A new algorithm—iterated constrained endmembers (ICE)—which attempts to address these shortcomings is introduced. An example of its use is given. There is also a discussion of the advantages and disadvantages of normalizing spectra before the application of ICE or other endmember-finding algorithms. Index Terms—Convex geometry, endmember, hyperspectral, normalization, simplex.