Results 1–10 of 13
Generalized weighted Chinese restaurant processes for species sampling mixture models
 Statistica Sinica
, 2003
Abstract

Cited by 53 (8 self)
Abstract: The class of species sampling mixture models is introduced as an extension of semiparametric models based on the Dirichlet process to models based on the general class of species sampling priors, or equivalently the class of all exchangeable urn distributions. Using Fubini calculus in conjunction with Pitman (1995, 1996), we derive characterizations of the posterior distribution in terms of a posterior partition distribution that extend the results of Lo (1984) for the Dirichlet process. These results provide a better understanding of models and have both theoretical and practical applications. To facilitate the use of our models we generalize the work in Brunner, Chan, James and Lo (2001) by extending their weighted Chinese restaurant (WCR) Monte Carlo procedure, an i.i.d. sequential importance sampling (SIS) procedure for approximating posterior mean functionals based on the Dirichlet process, to the case of approximation of mean functionals and additionally their posterior laws in species sampling mixture models. We also discuss collapsed Gibbs sampling, Pólya urn Gibbs sampling and a Pólya urn SIS scheme. Our framework allows for numerous applications, including multiplicative counting process models subject to weighted gamma processes, as well as nonparametric and semiparametric hierarchical models based on the Dirichlet process, its two-parameter extension, the Pitman–Yor process, and finite dimensional Dirichlet priors.
Key words and phrases: Dirichlet process, exchangeable partition, finite dimensional Dirichlet prior, two-parameter Poisson–Dirichlet process, prediction rule, random probability measure, species sampling sequence.
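As a concrete illustration of the prediction rules these priors induce (a generic sketch of the two-parameter Poisson–Dirichlet / Pitman–Yor Chinese restaurant rule, not the paper's WCR/SIS algorithm; function name and signature are hypothetical):

```python
import random

def pitman_yor_partition(n, d=0.5, theta=1.0, rng=None):
    """Simulate a random partition of {1,...,n} under the two-parameter
    (Pitman-Yor) Chinese restaurant prediction rule.

    Having seated i customers at k tables with occupancies n_1,...,n_k,
    customer i+1 joins table j with probability (n_j - d)/(i + theta)
    and opens a new table with probability (theta + k*d)/(i + theta).
    Setting d = 0 recovers the Dirichlet process (Polya urn) rule.
    """
    rng = rng or random.Random()
    counts = []   # n_j: occupancy of each table (cluster)
    labels = []   # table label assigned to each customer
    for i in range(n):
        k = len(counts)
        u = rng.random() * (i + theta)  # total mass = sum(n_j - d) + theta + k*d
        acc = 0.0
        for j, nj in enumerate(counts):
            acc += nj - d
            if u < acc:
                counts[j] += 1
                labels.append(j)
                break
        else:
            counts.append(1)   # open a new table
            labels.append(k)
    return labels, counts
```

A draw such as `pitman_yor_partition(50, d=0.5, theta=1.0)` returns one exchangeable partition; averaging functionals over many such i.i.d. draws is the basic ingredient that sequential importance sampling schemes of this kind reweight.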
Poisson process partition calculus with an application to Bayesian . . .
, 2005
Abstract

Cited by 32 (10 self)
This article develops, and describes how to use, results concerning disintegrations of Poisson random measures. These results are fashioned as simple tools that can be tailor-made to address inferential questions arising in a wide range of Bayesian nonparametric and spatial statistical models. The Poisson disintegration method is based on the formal statement of two results concerning a Laplace functional change of measure and a Poisson Palm/Fubini calculus in terms of random partitions of the integers {1,...,n}. The techniques are analogous to, but much more general than, techniques for the Dirichlet process and weighted gamma process developed in [Ann. Statist. 12
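For orientation, the two classical identities underlying such a Poisson calculus can be stated in textbook form (these are the standard Laplace functional and Mecke/Palm formulas for a Poisson random measure $N$ with mean measure $\nu$, not the paper's specific disintegration results):

$$ \mathbb{E}\big[e^{-N(f)}\big] = \exp\Big(-\int \big(1 - e^{-f(x)}\big)\,\nu(dx)\Big), \qquad \mathbb{E}\Big[\int f(x, N)\, N(dx)\Big] = \int \mathbb{E}\big[f(x, N + \delta_x)\big]\,\nu(dx). $$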
Bayesian Model Selection in Finite Mixtures by Marginal Density Decompositions
 JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
, 2001
Some Further Developments for Stick-Breaking Priors: Finite and Infinite Clustering and Classification
 Sankhya Series A
, 2003
Abstract

Cited by 15 (0 self)
this paper will be to develop new surrounding theory for the hierarchical model (7) and show how these may be used to develop computational algorithms for computing posterior quantities. Our theoretical contributions include developing key properties for the class of extended stick-breaking measures, which includes establishing a conjugacy property of their random weights to i.i.d. sampling, and a characterization of the posterior for the extended stick-breaking prior under i.i.d. sampling. See Section 3. These properties then lead us in Section 4 to a general characterization for the posterior of (7). In Section 5 we outline a collapsed Gibbs sampling algorithm and an i.i.d. SIS (sequential importance sampling) algorithm that can be used for inference in (7). One important implication is our ability to fit the posterior of (6) subject to infinite dimensional stick-breaking measures. The paper begins with a brief discussion of stick-breaking priors in Section 2
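The stick-breaking construction behind these priors can be sketched in a few lines (a generic truncated sampler assuming Dirichlet-process weights $V_k \sim \mathrm{Beta}(1,\theta)$; the extended stick-breaking class studied in the paper allows more general $\mathrm{Beta}(a_k, b_k)$ draws):

```python
import random

def stick_breaking_weights(theta, K, rng=None):
    """Truncated stick-breaking weights for a Dirichlet process prior.

    Draw V_k ~ Beta(1, theta) and set w_k = V_k * prod_{l<k} (1 - V_l);
    the final weight absorbs the leftover stick so the K weights sum to 1.
    """
    rng = rng or random.Random()
    weights, remaining = [], 1.0
    for _ in range(K - 1):
        v = rng.betavariate(1.0, theta)
        weights.append(v * remaining)
        remaining *= 1.0 - v
    weights.append(remaining)  # truncation: last weight = remaining stick
    return weights
```

Pairing each weight with an i.i.d. atom from a base measure yields a draw of the random probability measure itself, which is what makes posterior sampling for models like (6)–(7) tractable under truncation.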
Independent and Identically Distributed Monte Carlo Algorithms for Semiparametric Linear Mixed Models
 JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
, 2002
Bayesian model-based clustering procedures
 Journal of Computational and Graphical Statistics
, 2006
Abstract

Cited by 9 (0 self)
This paper establishes a general framework for Bayesian model-based clustering, in which subset labels are exchangeable, and items are also exchangeable, possibly up to covariate effects. It is rich enough to encompass a variety of existing procedures, including some recently discussed methodologies involving stochastic search or hierarchical clustering, but more importantly allows the formulation of clustering procedures that are optimal with respect to a specified loss function. Our focus is on loss functions based on pairwise coincidences, that is, whether pairs of items are clustered into the same subset or not. Optimisation of the posterior expected loss function can be formulated as a binary integer programming problem, which can be readily solved, for example by the simplex method, when clustering a modest number of items, but quickly becomes impractical as problem scale increases. To combat this, a new heuristic item-swapping algorithm is introduced. This performs well in our numerical experiments, on both simulated and real data examples. The paper includes a comparison of the statistical performance of the (approximate) optimal clustering with earlier methods that are model-based but ad hoc in their detailed definition.
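The pairwise-coincidence loss described above can be sketched directly (a minimal Binder-type loss, assuming MCMC partition draws are available as label lists; function names are hypothetical, and this is the loss evaluation only, not the paper's item-swapping optimiser):

```python
from itertools import combinations

def coincidence_probs(draws):
    """Posterior pairwise coincidence probabilities p_ij = Pr(items i and j
    fall in the same cluster), estimated from MCMC partition draws, where
    each draw is a list of cluster labels, one per item."""
    n, m = len(draws[0]), len(draws)
    return {(i, j): sum(d[i] == d[j] for d in draws) / m
            for i, j in combinations(range(n), 2)}

def binder_loss(labels, p, a=1.0, b=1.0):
    """Posterior expected pairwise-coincidence loss of a candidate clustering:
    weight a penalises pairs put together that are probably apart, and
    weight b penalises pairs split that are probably together."""
    loss = 0.0
    for (i, j), pij in p.items():
        if labels[i] == labels[j]:
            loss += a * (1.0 - pij)
        else:
            loss += b * pij
    return loss
```

Minimising this loss over candidate label vectors is exactly the binary-programming problem the abstract refers to; a heuristic search would repeatedly move single items between subsets while the loss decreases.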
A predictive view of Bayesian clustering
 J. Statist. Planning and Inference
, 2006
Abstract

Cited by 8 (0 self)
This work considers probability models for partitions of a set of n elements using a predictive approach, i.e., models that are specified in terms of the conditional probability of either joining an already existing cluster or forming a new one. The inherent structure can be motivated by resorting to hierarchical models of either parametric or nonparametric nature. Parametric examples include the product partition models (PPMs) and the model-based approach of Dasgupta and Raftery (1998), while nonparametric alternatives include the Dirichlet Process, and more generally, the Species Sampling Models (SSMs). Under exchangeability, PPMs and SSMs induce the same type of partition structure. The methods are discussed in the context of outlier detection in normal linear regression models and of (univariate) density estimation.
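In the Dirichlet process special case, the predictive probabilities described above reduce to the familiar Chinese restaurant rule (a standard special case, stated here for orientation): with clusters of sizes $n_1, \dots, n_k$ among the first $n$ elements and total mass $\theta$,

$$ P(\text{join cluster } j) = \frac{n_j}{\theta + n}, \qquad P(\text{form a new cluster}) = \frac{\theta}{\theta + n}. $$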
A Bayes method for a monotone hazard rate via S-paths
 Ann. Statist
, 2006
Abstract

Cited by 6 (1 self)
A class of random hazard rates, that is defined as a mixture of an indicator kernel convoluted with a completely random measure, is of interest. We provide an explicit characterization of the posterior distribution of this mixture hazard rate model via a finite mixture of S-paths. A closed and tractable Bayes estimator for the hazard rate is derived to be a finite sum over S-paths. The path characterization or the estimator is proved to be a Rao-Blackwellization of an existing partition characterization or partition-sum estimator. This accentuates the importance of S-paths in Bayesian modeling of monotone hazard rates. An efficient Markov chain Monte Carlo (MCMC) method is proposed to approximate this class of estimates. It is shown that the S-path characterization also exists in modeling with covariates by a proportional hazard model, and the proposed algorithm again applies. Numerical results of the method are given to demonstrate its practicality and effectiveness.
A Class of Generalized Hyperbolic Continuous Time Integrated Stochastic Volatility Likelihood Models
, 2005
Abstract

Cited by 1 (0 self)
This paper discusses and analyzes a class of likelihood models which are based on two distributional innovations in financial models for stock returns. That is, the notion that the marginal distribution of aggregate returns of log-stock prices is well approximated by generalized hyperbolic distributions, and that volatility clustering can be handled by specifying the integrated volatility as a random process such as that proposed in a recent series of papers by Barndorff-Nielsen and Shephard (BNS). Indeed, the use of just the integrated Ornstein–Uhlenbeck (INT-OU) models of BNS serves to handle both features mentioned above. The BNS models produce likelihoods for aggregate returns which can be viewed as a subclass of latent regression models where one has n conditionally independent Normal random variables whose mean and variance are representable as linear functionals of a common unobserved Poisson random measure. James (2005b) recently obtains an exact analysis for such models yielding expressions of the likelihood in terms of quite tractable Fourier-cosine integrals. Here, our idea is to analyze a class of likelihoods, which can be used for similar purposes, but where the latent regression models are based on n conditionally independent models with distributions belonging to a subclass of the generalized hyperbolic distributions and whose corresponding parameters are representable as linear functionals of a common unobserved Poisson random measure. Our models are perhaps most
Testing Failure Data for Evidence of Aging
Abstract
This paper presents an application of a Bayesian nonparametric method to hypothesis testing between exponential and increasing failure rate (IFR) distributions. The weighted gamma process is selected as a prior on the space of nondecreasing failure rates. Monte Carlo simulations of the weighted Chinese restaurant process provide an approximation to the posterior probability of each hypothesis.