Results 1–10 of 48
Gibbs Sampling Methods for Stick-Breaking Priors
Abstract

Cited by 304 (17 self)
... In this paper we present two general types of Gibbs samplers that can be used to fit posteriors of Bayesian hierarchical models based on stick-breaking priors. The first type of Gibbs sampler, referred to as a Pólya urn Gibbs sampler, is a generalized version of a widely used Gibbs sampling method currently employed for Dirichlet process computing. This method applies to stick-breaking priors with a known Pólya urn characterization; that is, priors with an explicit and simple prediction rule. Our second method, the blocked Gibbs sampler, is based on an entirely different approach that works by directly sampling values from the posterior of the random measure. The blocked Gibbs sampler can be viewed as a more general approach, as it works without requiring an explicit prediction rule. We find that the blocked Gibbs sampler avoids some of the limitations seen with the Pólya urn approach and should be simpler for nonexperts to use.
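The truncation idea behind blocked Gibbs sampling can be illustrated with a short sketch of the stick-breaking construction itself (illustrative code, not the paper's implementation; the function name and the truncation level K are our own choices): beta-distributed stick fractions are turned into mixture weights, with the last fraction set to one so the truncated weights sum to one.

```python
import numpy as np

def stick_breaking_weights(alpha, K, rng=None):
    """Truncated stick-breaking weights for a Dirichlet process:
    v_k ~ Beta(1, alpha), pi_k = v_k * prod_{j<k} (1 - v_j).
    Setting v_K = 1 closes the stick so the K weights sum to one."""
    rng = np.random.default_rng(rng)
    v = rng.beta(1.0, alpha, size=K)
    v[-1] = 1.0  # close the stick at truncation level K
    pieces = np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    return v * pieces

pi = stick_breaking_weights(alpha=2.0, K=50, rng=0)
```

A blocked Gibbs sampler would alternately resample such weights, the component atoms, and the allocation variables given each other.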
Series representations of Lévy processes from the perspective of point processes
Abstract

Cited by 65 (9 self)
Several methods of generating series representations of a Lévy process are presented under a unified approach and a new rejection method is introduced in this context. The connection of such representations with the Lévy–Itô integral representation is precisely established. Four series representations of a gamma process are given as illustrations of these methods.

1. From Lévy–Itô to series representations. Introduction. Let $\{X(t) : t \in [0,1]\}$ be a Lévy process in $\mathbb{R}^d$ with characteristic function given by
$$E \exp(i\langle u, X(t)\rangle) = \exp\Big(t\Big[i\langle u, a\rangle + \int_{\mathbb{R}^d_0} \big(e^{i\langle u, x\rangle} - 1 - i\langle u, x\rangle\, I(|x| \le 1)\big)\, Q(dx)\Big]\Big) \qquad (1.1)$$
where $a \in \mathbb{R}^d$ and $Q$ is a Lévy measure on $\mathbb{R}^d_0$ ($\mathbb{R}^d_0 := \mathbb{R}^d \setminus \{0\}$). Assume that the paths of $X$ are right-continuous and have left-hand limits (abbreviated as rcll). By the Lévy–Itô integral representation, a.s. for each $t \ge 0$,
$$X(t) = ta + \int_{|x| \le 1} x\, [N([0,t], dx) - tQ(dx)] + \int_{|x| > 1} x\, N([0,t], dx) \qquad (1.2)$$
where $N$ is the process of jumps of $X$: $N(A) = \sum_{\{t :\, \Delta X(t) \ne 0\}} \mathbf{1}\{(t, \Delta X(t))\ldots$
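As a concrete illustration (our own sketch under stated assumptions, not one of the paper's four representations verbatim), a widely cited series representation attributed to Bondesson for a gamma subordinator with Lévy density $a x^{-1} e^{-x/b}$ on $[0,1]$ sums exponentially damped jumps scattered at uniform times; the function name and the truncation level are ours.

```python
import numpy as np

def gamma_process_bondesson(t_grid, shape_a, scale_b, n_terms=2000, rng=None):
    """Sketch of a Bondesson-type series for a gamma subordinator on [0, 1]:
        X(t) = sum_j  scale_b * exp(-Gamma_j / shape_a) * E_j * 1{U_j <= t},
    where Gamma_j are unit-rate Poisson arrival times, E_j ~ Exp(1),
    U_j ~ Uniform(0, 1).  The infinite series is truncated at n_terms jumps."""
    rng = np.random.default_rng(rng)
    gam = np.cumsum(rng.exponential(size=n_terms))  # Poisson arrival times
    jumps = scale_b * np.exp(-gam / shape_a) * rng.exponential(size=n_terms)
    locs = rng.uniform(size=n_terms)
    return np.array([jumps[locs <= t].sum() for t in t_grid])

t = np.linspace(0, 1, 11)
x = gamma_process_bondesson(t, shape_a=5.0, scale_b=1.0, rng=1)
```

The damping makes later terms negligible, so a moderate truncation already captures almost all of the mass; the path is nondecreasing by construction, as a subordinator should be.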
Poisson/gamma random field models for spatial statistics
 BIOMETRIKA
, 1998
Abstract

Cited by 61 (14 self)
Doubly stochastic Bayesian hierarchical models are introduced to account for uncertainty and spatial variation in the underlying intensity measure for point process models. Inhomogeneous gamma process random fields and, more generally, Markov random fields with infinitely divisible distributions are used to construct positively autocorrelated intensity measures for spatial Poisson point processes; these in turn are used to model the number and location of individual events. A data augmentation scheme and Markov chain Monte Carlo numerical methods are employed to generate samples from Bayesian posterior and predictive distributions. The methods are developed in both continuous and discrete settings, and are applied to a problem in forest ecology.
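Once a random intensity measure is in hand, events can be simulated from the resulting Poisson point process by standard thinning (Lewis–Shedler). The sketch below is our own illustration with a hypothetical intensity function standing in for the paper's gamma random field.

```python
import numpy as np

def thin_poisson(intensity, lam_max, window=1.0, rng=None):
    """Lewis-Shedler thinning: simulate an inhomogeneous Poisson process
    on [0, window] whose intensity function is bounded above by lam_max.
    Candidates come from a homogeneous process at rate lam_max; each is
    kept with probability intensity(s) / lam_max."""
    rng = np.random.default_rng(rng)
    n = rng.poisson(lam_max * window)            # candidate count
    cand = rng.uniform(0.0, window, size=n)      # candidate locations
    keep = rng.uniform(size=n) < intensity(cand) / lam_max
    return np.sort(cand[keep])

# hypothetical decaying intensity, bounded by lam_max = 10
pts = thin_poisson(lambda s: 10.0 * np.exp(-s), lam_max=10.0, rng=2)
```

In the doubly stochastic setting, the intensity itself would first be drawn from the gamma random field prior, then the point pattern generated conditionally as above.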
Poisson process partition calculus with an application to Bayesian . . .
, 2005
Abstract

Cited by 52 (13 self)
This article develops, and describes how to use, results concerning disintegrations of Poisson random measures. These results are fashioned as simple tools that can be tailor-made to address inferential questions arising in a wide range of Bayesian nonparametric and spatial statistical models. The Poisson disintegration method is based on the formal statement of two results concerning a Laplace functional change of measure and a Poisson Palm/Fubini calculus in terms of random partitions of the integers {1,...,n}. The techniques are analogous to, but much more general than, techniques for the Dirichlet process and weighted gamma process developed in [Ann. Statist. 12
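For context, the Laplace functional change of measure mentioned here builds on a standard identity for a Poisson random measure $N$ with mean measure $\nu$:

```latex
\mathbb{E}\!\left[\exp\!\Big(-\int f \, dN\Big)\right]
  = \exp\!\left(-\int \big(1 - e^{-f(x)}\big)\, \nu(dx)\right),
\qquad f \ge 0 \text{ measurable.}
```

Tilting $\nu$ by a function and matching Laplace functionals is the basic mechanism by which posterior disintegrations of this kind are derived.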
Nonparametric adaptive estimation for pure jump Lévy processes
, 2008
Abstract

Cited by 26 (7 self)
This paper is concerned with nonparametric estimation of the Lévy density of a pure jump Lévy process. The sample path is observed at n discrete instants with fixed sampling interval. We construct a collection of estimators obtained by deconvolution methods and deduced from appropriate estimators of the characteristic function and its first derivative. We obtain a bound for the $L^2$-risk under general assumptions on the model. Then we propose a penalty function that allows us to build an adaptive estimator. The risk bound for the adaptive estimator is obtained under additional assumptions on the Lévy density. Examples of models fitting in our framework are described and rates of convergence of the estimator are discussed.
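The two empirical building blocks named in the abstract, the characteristic function of an increment and its first derivative, are straightforward sample averages. The sketch below is our own illustration (function name and the gamma-increment test data are assumptions, not the paper's setup); the subsequent deconvolution and penalization steps are not shown.

```python
import numpy as np

def empirical_cf_and_deriv(increments, u):
    """For increments Z_k of a discretely observed pure jump Levy process,
    estimate psi(u) = E[e^{iuZ}] and psi'(u) = E[iZ e^{iuZ}] by empirical
    means over the sample, evaluated on a grid of frequencies u."""
    z = np.asarray(increments)[:, None]
    e = np.exp(1j * u[None, :] * z)
    psi_hat = e.mean(axis=0)          # empirical characteristic function
    dpsi_hat = (1j * z * e).mean(axis=0)  # its empirical first derivative
    return psi_hat, dpsi_hat

rng = np.random.default_rng(3)
z = rng.gamma(shape=0.5, scale=1.0, size=5000)  # stand-in: gamma increments
u = np.linspace(-5, 5, 11)
psi, dpsi = empirical_cf_and_deriv(z, u)
```

Deconvolution estimators of the Lévy density are then assembled from ratios of these two quantities followed by Fourier inversion with a frequency cutoff, which is where the adaptive penalty enters.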
Simulation methods for Lévy-driven CARMA stochastic volatility models
 Journal of Business and Economic Statistics
, 2006
Abstract

Cited by 18 (1 self)
We develop simulation schemes for the new classes of non-Gaussian pure jump Lévy processes for stochastic volatility. We write the price and volatility processes as integrals against a vector Lévy process, which then makes series approximation methods directly applicable. These methods entail simulation of the Lévy increments and formation of weighted sums of the increments; they do not require a closed-form expression for a tail mass function nor specification of a copula function. We also present a new, and apparently quite flexible, bivariate mixture of gammas model for the driving Lévy process. Within this setup, it is quite straightforward to generate simulations from a Lévy-driven CARMA stochastic volatility model augmented by a pure-jump price component. Simulations reveal the wide range of different types of financial price processes that can be generated in this manner, including processes with persistent stochastic volatility, dynamic leverage, and jumps.
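To convey the flavor of such simulations, here is a minimal sketch of the simplest special case, a Lévy-driven OU volatility process (CARMA(1,0)) with a compound-Poisson subordinator as driver. This is our own simplification, not the paper's CARMA scheme or its bivariate gamma-mixture model; all parameter values are illustrative.

```python
import numpy as np

def sim_ou_sv(T, dt, lam, jump_rate, jump_scale, rng=None):
    """Levy-driven OU volatility, d(sigma^2) = -lam*sigma^2 dt + dL(t),
    with L a compound-Poisson subordinator with Exp(jump_scale) jumps.
    Discretized on a grid: exact exponential decay between grid points,
    plus the sum of any jumps landing in the interval."""
    rng = np.random.default_rng(rng)
    n = int(round(T / dt))
    sig2 = np.empty(n + 1)
    sig2[0] = jump_rate * jump_scale / lam   # start at the stationary mean
    for k in range(n):
        dl = rng.exponential(jump_scale, rng.poisson(jump_rate * dt)).sum()
        sig2[k + 1] = np.exp(-lam * dt) * sig2[k] + dl
    return sig2

vol2 = sim_ou_sv(T=1.0, dt=1e-3, lam=2.0, jump_rate=50.0, jump_scale=0.02, rng=4)
```

A CARMA(p, q) volatility generalizes this by driving a p-dimensional state vector with the same Lévy increments, which is where the paper's series approximations come in.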
Normalized random measures driven by increasing additive processes
 Annals of Statistics
Abstract

Cited by 18 (3 self)
This paper introduces and studies a new class of nonparametric prior distributions. Random probability distribution functions are constructed via normalization of random measures driven by increasing additive processes. In particular, we present results for the distribution of means under both prior and posterior conditions and, via the use of strategic latent variables, undertake a full Bayesian analysis. Our class of priors includes the wellknown and widely used mixture of a Dirichlet process.
ON THE MARKOV–KREIN IDENTITY AND QUASI-INVARIANCE OF THE GAMMA PROCESS
, 2004
MCMC for normalized random measure mixture models
 Statistical Science 28
, 2013
Abstract

Cited by 17 (8 self)
This paper concerns the use of Markov chain Monte Carlo methods for posterior sampling in Bayesian nonparametric mixture models with normalized random measure priors. Making use of some recent posterior characterizations for the class of normalized random measures, we propose novel Markov chain Monte Carlo methods of both marginal type and conditional type. The proposed marginal samplers are generalizations of Neal's well-regarded Algorithm 8 for Dirichlet process mixture models, whereas the conditional sampler is a variation of those recently introduced in the literature. For both the marginal and conditional methods, we consider as a running example a mixture model with an underlying normalized generalized Gamma process prior, and describe comparative simulation results demonstrating the efficacies of the proposed methods. Key words and phrases: Bayesian nonparametrics, hierarchical mixture model, completely random measure, normalized random measure, Dirichlet process, normalized generalized Gamma process, MCMC posterior sampling method, marginalized sampler, Algorithm 8, conditional sampler, slice sampling.
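For readers unfamiliar with the baseline being generalized, here is a compact sketch of Neal's Algorithm 8 for the Dirichlet process special case, with a fixed-variance Gaussian likelihood and Gaussian base measure. All modeling choices and names are our own illustrative assumptions, and the bookkeeping is simplified relative to Neal's exact scheme (all m auxiliary parameters are drawn fresh each step).

```python
import numpy as np

def dpm_algorithm8(y, alpha=1.0, m=3, n_iters=50, sigma=1.0, tau=3.0, rng=None):
    """Sketch of Neal's Algorithm 8 for a DP mixture with N(theta, sigma^2)
    likelihood and N(0, tau^2) base measure.  Each sweep reassigns every
    observation among existing clusters plus m auxiliary components drawn
    from the base measure, then refreshes cluster means from their
    conjugate posteriors."""
    rng = np.random.default_rng(rng)
    n = len(y)
    z = np.zeros(n, dtype=int)            # start with everything in one cluster
    theta = [float(np.mean(y))]
    for _ in range(n_iters):
        for i in range(n):
            counts = np.bincount(np.delete(z, i), minlength=len(theta))
            aux = rng.normal(0.0, tau, size=m)   # fresh base-measure draws
            params = np.concatenate([theta, aux])
            w = np.concatenate([counts, np.full(m, alpha / m)])
            logp = np.log(w + 1e-300) - 0.5 * ((y[i] - params) / sigma) ** 2
            p = np.exp(logp - logp.max()); p /= p.sum()
            k = rng.choice(len(params), p=p)
            if k >= len(theta):                  # opened an auxiliary component
                theta.append(float(params[k])); k = len(theta) - 1
            z[i] = k
            used = np.unique(z)                  # drop emptied clusters, relabel
            relab = {old: new for new, old in enumerate(used)}
            z = np.array([relab[c] for c in z])
            theta = [theta[u] for u in used]
        for k in range(len(theta)):              # conjugate update of each mean
            yk = y[z == k]
            prec = 1 / tau**2 + len(yk) / sigma**2
            mean = (yk.sum() / sigma**2) / prec
            theta[k] = float(rng.normal(mean, prec ** -0.5))
    return z, np.array(theta)

y = np.concatenate([np.random.default_rng(6).normal(-3, 1, 30),
                    np.random.default_rng(7).normal(3, 1, 30)])
z, th = dpm_algorithm8(y, rng=8)
```

The paper's marginal samplers replace the Dirichlet-process weights (counts and alpha/m above) with the corresponding predictive weights of a general normalized random measure.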