Results 1–10 of 19
Bayesian Density Estimation and Inference Using Mixtures
Journal of the American Statistical Association, 1994
Abstract

Cited by 398 (17 self)
We describe and illustrate Bayesian inference in models for density estimation using mixtures of Dirichlet processes. These models provide natural settings for density estimation, and are exemplified by special cases where data are modelled as a sample from mixtures of normal distributions. Efficient simulation methods are used to approximate various prior, posterior and predictive distributions. This allows for direct inference on a variety of practical issues, including problems of local versus global smoothing, uncertainty about density estimates, assessment of modality, and inference on the number of components. Also, convergence results are established for a general class of normal mixture models. Keywords: Kernel estimation; Mixtures of Dirichlet processes; Multimodality; Normal mixtures; Posterior sampling; Smoothing parameter estimation * Michael D. Escobar is Assistant Professor, Department of Statistics and Department of Preventive Medicine and Biostatistics, University ...
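The truncated stick-breaking (Sethuraman) construction gives a quick way to simulate from the kind of Dirichlet-process mixture of normals this abstract describes. The hyperparameters below (the concentration `alpha`, the N(0, 3²) base measure, the kernel scale `sigma`) are illustrative choices, not values from the paper:

```python
import numpy as np

def dp_normal_mixture_sample(n, alpha=1.0, mu0=0.0, tau=3.0,
                             sigma=0.5, levels=200, seed=0):
    """Draw n points from a DP mixture of normals via a truncated
    stick-breaking representation (all hyperparameters illustrative)."""
    rng = np.random.default_rng(seed)
    # Stick-breaking weights: w_k = v_k * prod_{j<k} (1 - v_j), v_k ~ Beta(1, alpha)
    v = rng.beta(1.0, alpha, size=levels)
    w = v * np.cumprod(np.concatenate(([1.0], 1.0 - v[:-1])))
    w /= w.sum()                     # renormalize after truncation
    # Atom locations drawn from the base measure G0 = N(mu0, tau^2)
    atoms = rng.normal(mu0, tau, size=levels)
    # Each observation picks an atom, then adds normal kernel noise
    z = rng.choice(levels, size=n, p=w)
    return rng.normal(atoms[z], sigma)

x = dp_normal_mixture_sample(500)
```

Larger `alpha` spreads mass over more sticks, producing samples with more (smaller) clusters; this is the qualitative link between the prior precision and the local-versus-global smoothing issue the abstract raises.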
More Aspects of Polya Tree Distributions for Statistical Modelling
Ann. Statist., 1994
Abstract

Cited by 57 (1 self)
The definition and elementary properties of Polya tree distributions are reviewed. Two theorems are presented showing that Polya trees can be constructed to concentrate arbitrarily closely about any desired pdf, and that Polya tree priors can put positive mass in every relative entropy neighborhood of every positive density with finite entropy, thereby satisfying a consistency condition. Such theorems are false for Dirichlet processes. Models are constructed combining partially specified Polya trees with other information like monotonicity or unimodality. It is shown how to compute bounds on posterior expectations over the class of all priors with the given specifications. A numerical example is given. A theorem of Diaconis and Freedman about Dirichlet processes is generalized to Polya trees, allowing Polya trees to be the models for errors in regression problems. Finally, empirical Bayes models using Dirichlet processes are generalized to Polya trees. An example from Berry and Chris...
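A single Polya tree realization on [0,1] can be sampled by recursively splitting dyadic intervals, drawing each node's branching probability once from a Beta distribution. The sketch below uses the standard Beta(c·m², c·m²) choice at level m, which yields absolutely continuous realizations; the precision `c` and the depth are illustrative:

```python
import numpy as np

def polya_tree_sample(n, depth=10, c=1.0, seed=0):
    """Draw n points from one realization of a canonical dyadic Polya
    tree on [0,1] with Beta(c*m^2, c*m^2) branch probabilities (sketch)."""
    rng = np.random.default_rng(seed)
    branch = {}                      # (level, node) -> P(go left), drawn once per node
    samples = np.empty(n)
    for i in range(n):
        lo, hi, node = 0.0, 1.0, 0
        for m in range(1, depth + 1):
            key = (m, node)
            if key not in branch:    # fix the branch probability lazily, once
                branch[key] = rng.beta(c * m * m, c * m * m)
            mid = 0.5 * (lo + hi)
            if rng.random() < branch[key]:
                hi, node = mid, 2 * node         # descend into left child
            else:
                lo, node = mid, 2 * node + 1     # descend into right child
        samples[i] = rng.uniform(lo, hi)         # spread uniformly within the leaf
    return samples

s = polya_tree_sample(300)
```

Caching each node's Beta draw is what makes this a sample from *one* random density rather than from the (uniform) marginal; re-drawing per observation would lose the tree structure entirely.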
Generalized weighted Chinese restaurant processes for species sampling mixture models
Statistica Sinica, 2003
Abstract

Cited by 53 (8 self)
Abstract: The class of species sampling mixture models is introduced as an extension of semiparametric models based on the Dirichlet process to models based on the general class of species sampling priors, or equivalently the class of all exchangeable urn distributions. Using Fubini calculus in conjunction with Pitman (1995, 1996), we derive characterizations of the posterior distribution in terms of a posterior partition distribution that extend the results of Lo (1984) for the Dirichlet process. These results provide a better understanding of models and have both theoretical and practical applications. To facilitate the use of our models we generalize the work in Brunner, Chan, James and Lo (2001) by extending their weighted Chinese restaurant (WCR) Monte Carlo procedure, an i.i.d. sequential importance sampling (SIS) procedure for approximating posterior mean functionals based on the Dirichlet process, to the case of approximation of mean functionals and additionally their posterior laws in species sampling mixture models. We also discuss collapsed Gibbs sampling, Pólya urn Gibbs sampling and a Pólya urn SIS scheme. Our framework allows for numerous applications, including multiplicative counting process models subject to weighted gamma processes, as well as nonparametric and semiparametric hierarchical models based on the Dirichlet process, its two-parameter extension, the Pitman-Yor process and finite dimensional Dirichlet priors. Key words and phrases: Dirichlet process, exchangeable partition, finite dimensional Dirichlet prior, two-parameter Poisson-Dirichlet process, prediction rule, random probability measure, species sampling sequence.
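The prediction rule underlying these urn schemes is easy to simulate. A minimal sketch of the two-parameter Chinese restaurant process, where discount `d = 0` recovers the Dirichlet-process CRP and `0 < d < 1` gives the Pitman-Yor prediction rule (parameter values here are illustrative):

```python
import numpy as np

def crp_partition(n, alpha=1.0, d=0.0, seed=0):
    """Sample a random partition of n items from the two-parameter
    Chinese restaurant process (sketch; d=0 is the DP case)."""
    rng = np.random.default_rng(seed)
    counts = []                         # customers seated at each table
    labels = np.empty(n, dtype=int)
    for i in range(n):
        k = len(counts)
        # Prediction rule: existing table j w.p. (n_j - d)/(i + alpha),
        # a new table w.p. (alpha + k*d)/(i + alpha)
        probs = np.array([c - d for c in counts] + [alpha + k * d]) / (i + alpha)
        j = rng.choice(k + 1, p=probs)
        if j == k:
            counts.append(1)            # open a new table
        else:
            counts[j] += 1
        labels[i] = j
    return labels, counts

labels, counts = crp_partition(50, alpha=2.0, d=0.3)
```

The probabilities sum to one by construction (Σ(nⱼ − d) + α + kd = i + α), which is the exchangeable-partition property the abstract's posterior characterizations rest on.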
Dirichlet Prior Sieves in Finite Normal Mixtures
Statistica Sinica, 2002
Abstract

Cited by 40 (1 self)
Abstract: The use of a finite dimensional Dirichlet prior in the finite normal mixture model has the effect of acting like a Bayesian method of sieves. Posterior consistency is directly related to the dimension of the sieve and the choice of the Dirichlet parameters in the prior. We find that naive use of the popular uniform Dirichlet prior leads to an inconsistent posterior. However, a simple adjustment to the parameters in the prior induces a random probability measure that approximates the Dirichlet process and yields a posterior that is strongly consistent for the density and weakly consistent for the unknown mixing distribution. The dimension of the resulting sieve can be selected easily in practice and a simple and efficient Gibbs sampler can be used to sample the posterior of the mixing distribution. Key words and phrases: Bose-Einstein distribution, Dirichlet process, identification, method of sieves, random probability measure, relative entropy, weak convergence.
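The DP-approximating adjustment is simply a symmetric Dirichlet with total mass held fixed: with weights (w₁,…,w_N) ~ Dirichlet(α/N,…,α/N) and atoms from a base measure G₀, the random measure Σ w_k δ_{θ_k} converges to DP(α, G₀) as N grows, unlike the "uniform" Dirichlet(1,…,1) choice. A draw from this prior, with an illustrative G₀ = N(0,1):

```python
import numpy as np

def finite_dirichlet_prior_draw(N, alpha=1.0, seed=0):
    """Draw a random probability measure sum_k w_k * delta_{theta_k}
    from the finite-dimensional Dirichlet prior with the symmetric
    alpha/N parameters (sketch; G0 = N(0,1) is illustrative)."""
    rng = np.random.default_rng(seed)
    w = rng.dirichlet(np.full(N, alpha / N))   # DP-approximating weights
    theta = rng.normal(0.0, 1.0, size=N)       # atoms from the base measure
    return w, theta

w, theta = finite_dirichlet_prior_draw(1000)
```

With α/N small, most weights are nearly zero and mass concentrates on a few atoms, mimicking the almost-surely discrete, sparse draws of the Dirichlet process.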
Approximate Dirichlet Process Computing in Finite Normal Mixtures: Smoothing and Prior Information
Journal of Computational and Graphical Statistics, 2000
Bayesian Semiparametric Inference for the Accelerated Failure Time Model
1997
Abstract

Cited by 14 (0 self)
Bayesian semiparametric inference is considered for a log-linear model. This model consists of a parametric component for the regression coefficients and a nonparametric component for the unknown error distribution. Bayesian analysis is studied for the case of a parametric prior on the regression coefficients and a mixture-of-Dirichlet-processes prior on the unknown error distribution. A Markov chain Monte Carlo (MCMC) method is developed to compute the features of the posterior distribution. A model selection method for obtaining a more parsimonious set of predictors is studied. The method adds indicator variables to the regression equation. The set of indicator variables represents all the possible subsets to be considered. An MCMC method is developed to search stochastically for the best subset. These procedures are applied to two examples, one with censored data. Key words and phrases: Censored data; Log-linear model; Markov chain Monte Carlo algorithm; Metropolis algori...
Bayesian mixture modeling for spatial Poisson process intensities, with applications to extreme value analysis
Dept., 2005
Abstract

Cited by 13 (3 self)
Abstract: We propose a method for the analysis of a spatial point pattern, which is assumed to arise as a set of observations from a spatial nonhomogeneous Poisson process. The spatial point pattern is observed in a bounded region, which, for most applications, is taken to be a rectangle in the space where the process is defined. The method is based on modeling a density function, defined on this bounded region, that is directly related to the intensity function of the Poisson process. We develop a flexible nonparametric mixture model for this density using a bivariate Beta distribution for the mixture kernel and a Dirichlet process prior for the mixing distribution. Using posterior simulation methods, we obtain full inference for the intensity function and any other functional of the process that might be of interest. We discuss applications to problems where inference for clustering in the spatial point pattern is of interest. Moreover, we consider applications of the methodology to extreme value analysis problems. We illustrate the modeling approach with three previously published data sets. Two of the data sets are from forestry and consist of locations of trees. The third data set consists of extremes from the Dow Jones index over a period of 1303 days.
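To picture what such a mixture intensity looks like, one can draw point locations on the unit square from a finite mixture of Beta kernels. The sketch below uses a product of independent Betas per component as a simplified stand-in for the bivariate Beta kernel of the paper; the weights and shape parameters are hypothetical:

```python
import numpy as np

def sample_spatial_intensity_points(weights, a, b, n=200, seed=0):
    """Draw n point locations on the unit square from a finite mixture
    of product-Beta kernels (illustrative stand-in for the DP mixture
    intensity model; weights, a, b are hypothetical parameters with
    shapes (K,), (K, 2), (K, 2))."""
    rng = np.random.default_rng(seed)
    z = rng.choice(len(weights), size=n, p=weights)   # component labels
    x = rng.beta(a[z, 0], b[z, 0])                    # horizontal coordinate
    y = rng.beta(a[z, 1], b[z, 1])                    # vertical coordinate
    return np.column_stack([x, y])

pts = sample_spatial_intensity_points(
    np.array([0.6, 0.4]),                   # mixture weights
    np.array([[2.0, 5.0], [8.0, 2.0]]),     # Beta shape a per component/axis
    np.array([[5.0, 2.0], [2.0, 8.0]]),     # Beta shape b per component/axis
)
```

Because Beta kernels have bounded support, the mixture density automatically lives on the observation rectangle; scaling it by the expected total count gives a nonhomogeneous Poisson intensity.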
On Nonparametric Bayesian Inference for the Distribution of a Random Sample
1995
Abstract

Cited by 9 (5 self)
The nonparametric Bayesian approach for inference regarding the unknown distribution of a random sample customarily assumes that this distribution is random and arises through Dirichlet process mixing. Previous work within this setting has focused on the mean of the posterior distribution of this random distribution which is the predictive distribution of a future observation given the sample. Our interest here is in learning about other features of this posterior distribution as well as about posteriors associated with functionals of the distribution of the data. We indicate how to do this in the case of linear functionals. An illustration, with a sample from a Gamma distribution, utilizes Dirichlet process mixtures of normals to recover this distribution and its features. Key Words: Dirichlet process, linear functional, posterior distribution, predictive distribution, sampling-based inference. AMS Subject Classifications: 62C10, 62G07. 1 Alan E. Gelfand is Professor and Saurabh Mukh...
Approaches for Semiparametric Bayesian Regression
 Computational Approach for Full Nonparametric Bayesian Inference under Dirichlet Process Mixture Models," Journal of Computational and Graphical Statistics
, 1997
Abstract

Cited by 4 (2 self)
Developing regression relationships is a primary inferential activity. We consider such relationships in the context of hierarchical models incorporating linear structure at each stage. Modern statistical work encourages less presumptive, i.e., nonparametric specifications for at least a portion of the modeling. That is, we seek to enrich the class of standard parametric hierarchical models by wandering nonparametrically near (in some sense) the standard class but retaining the linear structure. This enterprise falls within what is referred to as semiparametric modeling. We focus here on nonparametric modeling of monotone functions associated with the model. Such monotone functions arise, for example, as the stochastic mechanism itself using the cumulative distribution function, as the link function in a generalized linear model, as the cumulative hazard function in survival analysis models, and as the calibration function in errors-in-variables models. Nonparametric approaches for mod...
Bayesian Nonparametric Mixture Modeling
1993
Abstract

Cited by 2 (0 self)
This dissertation explores a Bayesian nonparametric approach to mixture modeling and the use of the Gibbs sampling scheme to approximate posterior estimates. The predictive distribution is modeled as a mixture of normal distributions by using a Dirichlet process prior for the unknown means and variances. The definition and some properties of mixtures of Dirichlet processes are reviewed. Analytically evaluating the predictive distribution is very tedious and difficult in this case. An approximation based on Monte Carlo integration is pro...