Results 1–3 of 3
Bayesian Compressive Sensing
, 2007
Abstract

Cited by 327 (24 self)
The data of interest are assumed to be represented as N-dimensional real vectors, and these vectors are compressible in some linear basis B, implying that the signal can be reconstructed accurately using only a small number M ≪ N of basis-function coefficients associated with B. Compressive sensing is a framework whereby one does not measure one of the aforementioned N-dimensional signals directly, but rather a set of related measurements, with the new measurements a linear combination of the original underlying N-dimensional signal. The number of required compressive-sensing measurements is typically much smaller than N, offering the potential to simplify the sensing system. Let f denote the unknown underlying N-dimensional signal, and g a vector of compressive-sensing measurements; then one may approximate f accurately by utilizing knowledge of the (underdetermined) linear relationship between f and g, in addition to knowledge of the fact that f is compressible in B. In this paper we employ a Bayesian formalism for estimating the underlying signal f based on compressive-sensing measurements g. The proposed framework has the following properties: (i) in addition to estimating the underlying signal f, “error bars” are also estimated, these giving a measure of confidence in the inverted signal; (ii) using knowledge of the error bars, a principled means is provided for determining when a sufficient
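The measurement model described above, g = Af with M ≪ N and f sparse, can be illustrated with a short sketch. This is not the paper's Bayesian inversion; it uses generic Orthogonal Matching Pursuit recovery, with illustrative sizes N = 128, M = 40, K = 5 and the canonical basis standing in for B, all chosen here for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 128, 40, 5  # signal length, number of measurements, sparsity

# Sparse signal f (B = identity basis, for simplicity of illustration).
f = np.zeros(N)
support = rng.choice(N, K, replace=False)
f[support] = rng.normal(size=K)

# Random Gaussian sensing matrix: each of the M measurements is a
# linear combination of the underlying N-dimensional signal.
A = rng.normal(size=(M, N)) / np.sqrt(M)
g = A @ f  # compressive measurements, M << N

def omp(A, g, k):
    """Orthogonal Matching Pursuit: greedily select k columns of A most
    correlated with the residual, refitting by least squares each step."""
    residual, idx = g.copy(), []
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, idx], g, rcond=None)
        residual = g - A[:, idx] @ coef
    x = np.zeros(A.shape[1])
    x[idx] = coef
    return x

f_hat = omp(A, g, K)
err = np.linalg.norm(f - f_hat) / np.linalg.norm(f)
```

With M substantially larger than K, the underdetermined system is recoverable and the relative error is small; the Bayesian formalism of the paper would additionally return error bars on f_hat.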
Fast Bayesian Inference in Dirichlet Process Mixture Models
, 2008
Abstract

Cited by 19 (2 self)
There has been increasing interest in applying Bayesian nonparametric methods in large samples and high dimensions. As Markov chain Monte Carlo (MCMC) algorithms are often infeasible, there is a pressing need for much faster algorithms. This article proposes a fast approach for inference in Dirichlet process mixture (DPM) models. Viewing the partitioning of subjects into clusters as a model selection problem, we propose a sequential greedy search algorithm for selecting the partition. Then, when conjugate priors are chosen, the resulting posterior, conditional on the selected partition, is available in closed form. This approach allows testing of parametric models versus nonparametric alternatives based on Bayes factors. We evaluate the approach using simulation studies and compare it with four other fast nonparametric methods in the literature. We apply the proposed approach to three datasets including one from a large epidemiologic study. Matlab codes for the simulation and data analyses using the proposed approach are available online in the supplemental materials.
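The sequential greedy idea can be sketched in a few lines. This is a simplified single-pass version, not the article's algorithm: it assumes a univariate Normal likelihood with known variance and a conjugate Normal prior on cluster means (hyperparameters alpha, mu0, kappa0, sigma are illustrative), and greedily assigns each point to the existing cluster, or a new one, that maximizes the Chinese-restaurant-process weight times the closed-form posterior predictive density.

```python
import math

def norm_logpdf(x, m, v):
    """Log density of N(m, v) at x."""
    return -0.5 * (math.log(2 * math.pi * v) + (x - m) ** 2 / v)

def greedy_dpm_partition(x, alpha=1.0, mu0=0.0, kappa0=1.0, sigma=1.0):
    """Single-pass greedy partition under a conjugate Normal DPM sketch.
    Each point joins the cluster (or opens a new one) with the highest
    CRP weight times posterior-predictive density -- all in closed form."""
    clusters = []  # per cluster: [count n, running sum s]
    labels = []
    for xi in x:
        best, best_score = None, -math.inf
        for c, (n, s) in enumerate(clusters):
            kn = kappa0 + n
            mean = (kappa0 * mu0 + s) / kn          # posterior predictive mean
            var = sigma ** 2 * (1.0 + 1.0 / kn)     # posterior predictive variance
            score = math.log(n) + norm_logpdf(xi, mean, var)
            if score > best_score:
                best, best_score = c, score
        # Option of opening a new cluster, weighted by the concentration alpha.
        score = math.log(alpha) + norm_logpdf(xi, mu0, sigma ** 2 * (1.0 + 1.0 / kappa0))
        if score > best_score:
            best = len(clusters)
        if best == len(clusters):
            clusters.append([1, xi])
        else:
            clusters[best][0] += 1
            clusters[best][1] += xi
        labels.append(best)
    return labels

labels = greedy_dpm_partition([0.0, 0.1, -0.1, 10.0, 10.1, 9.9])
```

Because the partition is fixed after the single pass, everything downstream (posterior summaries, Bayes factors) is available in closed form under the conjugate prior, which is the source of the speedup over MCMC.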
Spatial quantile multiple regression using the asymmetric Laplace process
 Bayesian Analysis
, 2012
Abstract

Cited by 6 (2 self)
We consider quantile multiple regression through conditional quantile models, i.e. each quantile is modeled separately. We work in the context of spatially referenced data and extend the asymmetric Laplace model for quantile regression to a spatial process, the asymmetric Laplace process (ALP), for quantile regression with spatially dependent errors. By taking advantage of a convenient conditionally Gaussian representation of the asymmetric Laplace distribution, we are able to straightforwardly incorporate spatial dependence in this process. We develop the properties of this process under several specifications, each of which induces different smoothness and covariance behavior at the extreme quantiles. We demonstrate the advantages that may be gained by incorporating spatial dependence into this conditional quantile model by applying it to a data set of log selling prices of homes in Baton Rouge, LA, given characteristics of each house. We also introduce the asymmetric Laplace predictive process (ALPP), which accommodates large data sets, and apply it to a data set of birth weights given maternal covariates for several thousand births in North Carolina in 2000. By modeling the spatial structure in the data, we are able to show, using a check loss function, improved performance on each of the data sets for each of the quantiles at which the model was fit.
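The "conditionally Gaussian representation" mentioned above refers to the standard normal–exponential mixture of the asymmetric Laplace distribution: Y = μ + σ(θW + τ√W · Z) with W ~ Exp(1) and Z ~ N(0, 1), where θ = (1 − 2p)/(p(1 − p)) and τ² = 2/(p(1 − p)). A minimal sketch (not the spatial model of the paper) samples from this mixture and checks the defining property that the p-th quantile sits at the location μ:

```python
import numpy as np

def sample_al(p, mu=0.0, scale=1.0, size=100_000, rng=None):
    """Sample an asymmetric Laplace variable via its conditionally
    Gaussian (normal-exponential mixture) representation:
    Y = mu + scale * (theta*W + tau*sqrt(W)*Z), W ~ Exp(1), Z ~ N(0,1)."""
    rng = rng or np.random.default_rng(0)
    theta = (1 - 2 * p) / (p * (1 - p))
    tau = np.sqrt(2.0 / (p * (1 - p)))
    W = rng.exponential(size=size)
    Z = rng.normal(size=size)
    return mu + scale * (theta * W + tau * np.sqrt(W) * Z)

y = sample_al(p=0.25)
frac_below = np.mean(y <= 0.0)  # empirically close to p = 0.25
```

Conditional on W the variable is Gaussian, which is what lets spatial dependence be introduced through the Gaussian part; the check loss ρ_p(u) = u(p − 1{u < 0}) used for evaluation in the paper is the loss whose minimizer is exactly this p-th quantile.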