Results 1-10 of 36
Gibbs Sampling Methods for Stick-Breaking Priors
Abstract

Cited by 384 (18 self)
... In this paper we present two general types of Gibbs samplers that can be used to fit posteriors of Bayesian hierarchical models based on stick-breaking priors. The first type of Gibbs sampler, referred to as a Pólya urn Gibbs sampler, is a generalized version of a widely used Gibbs sampling method currently employed for Dirichlet process computing. This method applies to stick-breaking priors with a known Pólya urn characterization; that is, priors with an explicit and simple prediction rule. Our second method, the blocked Gibbs sampler, is based on an entirely different approach that works by directly sampling values from the posterior of the random measure. The blocked Gibbs sampler can be viewed as a more general approach, as it works without requiring an explicit prediction rule. We find that the blocked Gibbs sampler avoids some of the limitations seen with the Pólya urn approach and should be simpler for nonexperts to use.
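The stick-breaking representation behind such priors can be sketched briefly. The following is a minimal illustration (not the paper's own algorithm) of the truncated Sethuraman construction that the blocked Gibbs sampler exploits: for a Dirichlet process, V_k ~ Beta(1, θ) and the k-th weight is V_k times the stick left after the first k-1 breaks; the function name and truncation level are illustrative choices.

```python
import numpy as np

def stick_breaking_weights(concentration, num_atoms, rng):
    """Weights of a truncated stick-breaking construction for a
    Dirichlet process: V_k ~ Beta(1, concentration), and
    p_k = V_k * prod_{j<k} (1 - V_j).  Truncating at `num_atoms` gives
    the finite-dimensional approximation used by blocked-style samplers."""
    v = rng.beta(1.0, concentration, size=num_atoms)
    v[-1] = 1.0  # close the last stick so the weights sum to one
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    return v * remaining

rng = np.random.default_rng(0)
weights = stick_breaking_weights(concentration=2.0, num_atoms=50, rng=rng)
```

Pairing these weights with i.i.d. atoms from a base measure yields a draw from the (truncated) random measure.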
The two-parameter Poisson-Dirichlet distribution derived from a stable subordinator.
, 1995
Abstract

Cited by 366 (33 self)
The two-parameter Poisson-Dirichlet distribution, denoted PD(α, θ), is a distribution on the set of decreasing positive sequences with sum 1. The usual Poisson-Dirichlet distribution with a single parameter θ, introduced by Kingman, is PD(0, θ). Known properties of PD(0, θ), including the Markov chain description due to Vershik, Shmidt and Ignatov, are generalized to the two-parameter case. The size-biased random permutation of PD(α, θ) is a simple residual allocation model proposed by Engen in the context of species diversity, and rediscovered by Perman and the authors in the study of excursions of Brownian motion and Bessel processes. For 0 < α < 1, PD(α, 0) is the asymptotic distribution of ranked lengths of excursions of a Markov chain away from a state whose recurrence time distribution is in the domain of attraction of a stable law of index α. Formulae in this case trace back to work of Darling, Lamperti and Wendel in the 1950's and 60's. The distribution of ranked lengths of e...
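The residual allocation model mentioned above (the size-biased permutation of PD(α, θ), often called GEM(α, θ)) is easy to simulate: the n-th stick-breaking fraction is Beta(1 - α, θ + nα). The sketch below is illustrative only; the function name and truncation level are our own choices, not the paper's notation.

```python
import numpy as np

def gem_weights(alpha, theta, num_sticks, rng):
    """First `num_sticks` weights of the GEM(alpha, theta)
    residual-allocation model: W_n ~ Beta(1 - alpha, theta + n*alpha),
    and the n-th weight is W_n times the stick left after n-1 breaks.
    Requires 0 <= alpha < 1 and theta > -alpha."""
    n = np.arange(1, num_sticks + 1)
    w = rng.beta(1.0 - alpha, theta + n * alpha)
    residual = np.concatenate(([1.0], np.cumprod(1.0 - w[:-1])))
    return w * residual

rng = np.random.default_rng(1)
p = gem_weights(alpha=0.5, theta=1.0, num_sticks=1000, rng=rng)
# Ranking p in decreasing order approximates a draw from PD(0.5, 1.0).
```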
Poisson process partition calculus with an application to Bayesian . . .
, 2005
Abstract

Cited by 56 (14 self)
This article develops, and describes how to use, results concerning disintegrations of Poisson random measures. These results are fashioned as simple tools that can be tailor-made to address inferential questions arising in a wide range of Bayesian nonparametric and spatial statistical models. The Poisson disintegration method is based on the formal statement of two results concerning a Laplace functional change of measure and a Poisson Palm/Fubini calculus in terms of random partitions of the integers {1,...,n}. The techniques are analogous to, but much more general than, techniques for the Dirichlet process and weighted gamma process developed in [Ann. Statist. 12
Modeling individual differences using Dirichlet processes
, 2006
Abstract

Cited by 49 (22 self)
We introduce a Bayesian framework for modeling individual differences, in which subjects are assumed to belong to one of a potentially infinite number of groups. In this model, the groups observed in any particular data set are not viewed as a fixed set that fully explains the variation between individuals, but rather as representatives of a latent, arbitrarily rich structure. As more people are seen, and more details about the individual differences are revealed, the number of inferred groups is allowed to grow. We use the Dirichlet process—a distribution widely used in nonparametric Bayesian statistics—to define a prior for the model, allowing us to learn flexible parameter distributions without overfitting the data, or requiring the complex computations typically required for determining the dimensionality of a model. As an initial demonstration of the approach, we present three applications that analyze the individual differences in category learning, choice of publication outlets, and web-browsing behavior.
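The "potentially infinite number of groups" prior described here is the Dirichlet process's prediction rule, the Chinese restaurant process. A minimal sketch of that rule (our own illustrative code, not the paper's implementation): each new subject joins an existing group with probability proportional to its size, or starts a new group with probability proportional to the concentration parameter.

```python
import numpy as np

def crp_assignments(num_subjects, concentration, rng):
    """Sequential group assignments under the Chinese restaurant process,
    the prediction rule of the Dirichlet process.  Subject n joins group k
    with probability counts[k] / (n + concentration), or a brand-new group
    with probability concentration / (n + concentration)."""
    counts = []   # current group sizes
    labels = []   # assignment of each subject
    for _ in range(num_subjects):
        probs = np.array(counts + [concentration], dtype=float)
        probs /= probs.sum()
        k = rng.choice(len(probs), p=probs)
        if k == len(counts):
            counts.append(1)   # open a new group
        else:
            counts[k] += 1
        labels.append(k)
    return labels

rng = np.random.default_rng(2)
labels = crp_assignments(num_subjects=100, concentration=1.0, rng=rng)
# The number of distinct groups grows roughly like concentration * log(n).
```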
Arcsine laws and interval partitions derived from a stable subordinator
 Proc. London Math. Soc
, 1992
Abstract

Cited by 48 (24 self)
Lévy discovered that the fraction of time a standard one-dimensional Brownian motion B spends positive before time t has arcsine distribution, both for t a fixed time, when B_t ≠ 0 almost surely, and for t an inverse local time, when B_t = 0 almost surely. This identity in distribution is extended from the fraction of time spent positive to a large collection of functionals derived from the lengths and signs of excursions of B away from 0. Similar identities in distribution are associated with any process whose zero set is the range of a stable subordinator, for instance a Bessel process of dimension d for 0 < d < 2.
Approximate Dirichlet Process Computing in Finite Normal Mixtures: Smoothing and Prior Information
 Journal of Computational and Graphical Statistics
, 2000
Computational Methods for Multiplicative Intensity Models using Weighted Gamma Processes: Proportional Hazards, Marked Point Processes and Panel Count Data
, 2004
Abstract

Cited by 29 (5 self)
We develop computational procedures for a class of Bayesian nonparametric and semiparametric multiplicative intensity models incorporating kernel mixtures of spatial weighted gamma measures. A key feature of our approach is that explicit expressions for posterior distributions of these models share many common structural features with the posterior distributions of Bayesian hierarchical models using the Dirichlet process. Using this fact, along with an approximation for the weighted gamma process, we show that with some care, one can adapt efficient algorithms used for the Dirichlet process to this setting. We discuss blocked Gibbs sampling procedures and Pólya urn Gibbs samplers. We illustrate our methods with applications to proportional hazard models, Poisson spatial regression models, recurrent events, and panel count data.
Poisson-Kingman Partitions
 Lecture Notes-Monograph Series
, 2002
Abstract

Cited by 27 (3 self)
This paper presents some general formulas for random partitions of a finite set derived by Kingman's model of random sampling from an interval partition generated by subintervals whose lengths are the points of a Poisson point process. These lengths can also be interpreted as the jumps of a subordinator, that is, an increasing process with stationary independent increments. Examples include the two-parameter family of Poisson-Dirichlet models derived from the Poisson process of jumps of a stable subordinator. Applications are made to the random partition generated by the lengths of excursions of a Brownian motion or Brownian bridge conditioned on its local time at zero.
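Kingman's sampling model described above is straightforward to simulate once a set of jump lengths is in hand: normalize the lengths into an interval partition, sample points from it, and read off the induced partition of the sample. The sketch below is illustrative; for simplicity it uses i.i.d. gamma-distributed stand-in jump lengths rather than the jumps of an actual subordinator.

```python
import numpy as np

def kingman_partition(sample_size, jumps, rng):
    """Kingman's model: normalize jump lengths into an interval partition,
    sample `sample_size` points from it, and return the block sizes of the
    induced random partition, ranked in decreasing order."""
    p = np.asarray(jumps, dtype=float)
    p = p / p.sum()
    draws = rng.choice(len(p), size=sample_size, p=p)
    _, sizes = np.unique(draws, return_counts=True)
    return sorted(sizes, reverse=True)

rng = np.random.default_rng(3)
# Illustrative stand-in jump lengths (i.i.d. gamma, not a true subordinator).
jumps = rng.gamma(shape=0.5, scale=1.0, size=200)
sizes = kingman_partition(sample_size=50, jumps=jumps, rng=rng)
```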
Some Further Developments for Stick-Breaking Priors: Finite and Infinite Clustering and Classification
 Sankhya Series A
, 2003
Abstract

Cited by 18 (0 self)
this paper will be to develop new surrounding theory for the hierarchical model (7) and show how these results may be used to develop computational algorithms for computing posterior quantities. Our theoretical contributions include developing key properties for the class of extended stick-breaking measures, which includes establishing a conjugacy property of their random weights to i.i.d. sampling, and a characterization of the posterior for the extended stick-breaking prior under i.i.d. sampling. See Section 3. These properties then lead us in Section 4 to a general characterization for the posterior of (7). In Section 5 we outline a collapsed Gibbs sampling algorithm and an i.i.d. SIS (sequential importance sampling) algorithm that can be used for inference in (7). One important implication is our ability to fit the posterior of (6) subject to infinite-dimensional stick-breaking measures. The paper begins with a brief discussion of stick-breaking priors in Section 2.
Combinatorial clustering and the beta negative binomial process. arXiv preprint arXiv:1111.1802
, 2013
Abstract

Cited by 16 (4 self)
We develop a Bayesian nonparametric approach to a general family of latent class problems in which individuals can belong simultaneously to multiple classes and where each class can be exhibited multiple times by an individual. We introduce a combinatorial stochastic process known as the negative binomial process (NBP) as an infinite-dimensional prior appropriate for such problems. We show that the NBP is conjugate to the beta process, and we characterize the posterior distribution under the beta-negative binomial process (BNBP) and hierarchical models based on the BNBP (the HBNBP). We study the asymptotic properties of the BNBP and develop a three-parameter extension of the BNBP that exhibits power-law behavior. We derive MCMC algorithms for posterior inference under the HBNBP, and we present experiments using these algorithms in the domains of image segmentation, object recognition, and document analysis.