Results 1–10 of 36
Model-based Geostatistics
Applied Statistics, 1998
Abstract

Cited by 96 (4 self)
Conventional geostatistical methodology solves the problem of predicting the realised value of a linear functional of a Gaussian spatial stochastic process, S(x), based on observations Y_i = S(x_i) + Z_i at sampling locations x_i, where the Z_i are mutually independent, zero-mean Gaussian random variables. We describe two spatial applications for which Gaussian distributional assumptions are clearly inappropriate. The first concerns the assessment of residual contamination from nuclear weapons testing on a South Pacific island, in which the sampling method generates spatially indexed Poisson counts conditional on an unobserved, spatially varying intensity of radioactivity; we conclude that a conventional geostatistical analysis oversmooths the data and underestimates the spatial extremes of the intensity. The second application provides a description of spatial variation in the risk of campylobacter infections relative to other enteric infections in part of North Lancashire and South C...
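As a concrete illustration of the conventional Gaussian prediction this abstract contrasts against, here is a minimal simple-kriging sketch in one dimension. The exponential covariance, the nugget, and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def exp_cov(d, sigma2=1.0, phi=0.5):
    # Exponential covariance function; sigma2 and phi are illustrative values.
    return sigma2 * np.exp(-d / phi)

def simple_krige(x_obs, y_obs, x_new, tau2=0.1):
    # Covariance matrix of the observations: C(x_i, x_j) plus a nugget tau2.
    D = np.abs(x_obs[:, None] - x_obs[None, :])
    K = exp_cov(D) + tau2 * np.eye(len(x_obs))
    # Covariance between the latent S(x_new) and each observation Y_i.
    k = exp_cov(np.abs(x_obs - x_new))
    w = np.linalg.solve(K, k)  # kriging weights
    return w @ y_obs           # zero-mean simple-kriging predictor

x_obs = np.array([0.0, 0.3, 0.7, 1.0])
y_obs = np.array([0.2, 0.5, -0.1, 0.4])
pred = simple_krige(x_obs, y_obs, 0.5)
```

With the nugget set to zero the predictor interpolates the data exactly at observed sites, which is the behaviour the paper's Poisson-count application shows to be inappropriate for non-Gaussian data.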
Generalized weighted Chinese restaurant processes for species sampling mixture models
Statistica Sinica, 2003
Abstract

Cited by 53 (8 self)
Abstract: The class of species sampling mixture models is introduced as an extension of semiparametric models based on the Dirichlet process to models based on the general class of species sampling priors, or equivalently the class of all exchangeable urn distributions. Using Fubini calculus in conjunction with Pitman (1995, 1996), we derive characterizations of the posterior distribution in terms of a posterior partition distribution that extend the results of Lo (1984) for the Dirichlet process. These results provide a better understanding of the models and have both theoretical and practical applications. To facilitate the use of our models we generalize the work in Brunner, Chan, James and Lo (2001) by extending their weighted Chinese restaurant (WCR) Monte Carlo procedure, an i.i.d. sequential importance sampling (SIS) procedure for approximating posterior mean functionals based on the Dirichlet process, to the approximation of mean functionals, and additionally their posterior laws, in species sampling mixture models. We also discuss collapsed Gibbs sampling, Pólya urn Gibbs sampling and a Pólya urn SIS scheme. Our framework allows for numerous applications, including multiplicative counting process models subject to weighted gamma processes, as well as nonparametric and semiparametric hierarchical models based on the Dirichlet process, its two-parameter extension, the Pitman-Yor process, and finite dimensional Dirichlet priors.
Key words and phrases: Dirichlet process, exchangeable partition, finite dimensional Dirichlet prior, two-parameter Poisson-Dirichlet process, prediction rule, random probability measure, species sampling sequence.
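The unweighted Chinese restaurant process that the WCR procedure generalises can be sketched through its predictive seating rule (the Dirichlet-process special case of a species sampling prior). The concentration parameter and sample size below are illustrative.

```python
import random

def sample_crp_partition(n, alpha=1.0, seed=0):
    """Seat n customers by the Chinese restaurant process:
    customer i joins an existing table with probability |table| / (i + alpha)
    and opens a new table with probability alpha / (i + alpha)."""
    rng = random.Random(seed)
    tables = []  # each table is a list of customer indices
    for i in range(n):
        weights = [len(t) for t in tables] + [alpha]
        r = rng.random() * (i + alpha)  # total weight is i + alpha
        acc = 0.0
        for j, w in enumerate(weights):
            acc += w
            if r < acc:
                break
        if j == len(tables):
            tables.append([i])   # new table
        else:
            tables[j].append(i)  # join existing table
    return tables

partition = sample_crp_partition(20, alpha=1.0)
```

The resulting random partition is exchangeable; the weighted variant in the paper biases these seating probabilities by likelihood terms to build an importance sampler.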
Space and Space-Time Modeling Using Process Convolutions
Abstract

Cited by 39 (4 self)
A continuous spatial model can be constructed by convolving a very simple, perhaps independent, process with a kernel or point spread function. This approach to constructing a spatial process offers a number of advantages over specification through a spatial covariogram. In particular, this process convolution specification leads to computational simplifications and easily extends beyond simple stationary models. This paper uses process convolution models to build space and space-time models that are flexible and able to accommodate large amounts of data. Data from environmental monitoring are considered.
1 Introduction
Modeling spatial data with Gaussian processes is the common thread of all geostatistical analyses. Some notable references in this area include Matheron (1963), Journel and Huijbregts (1978), Ripley (1981), Cressie (1991), Wackernagel (1995), and Stein (1999). A common approach is to model spatial dependence through the covariogram c(·), so that covariance between any t...
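A minimal one-dimensional sketch of the process convolution construction, assuming i.i.d. standard normal variables at a coarse set of knots and a Gaussian smoothing kernel (both choices are illustrative, not the paper's):

```python
import numpy as np

def process_convolution(grid, knots, kernel_sd=0.2, seed=1):
    # Latent i.i.d. N(0, 1) variables at a coarse set of knot locations...
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(len(knots))
    # ...smoothed by a Gaussian kernel to give a continuous process on the grid.
    K = np.exp(-0.5 * ((grid[:, None] - knots[None, :]) / kernel_sd) ** 2)
    return K @ z

grid = np.linspace(0.0, 1.0, 200)
knots = np.linspace(0.0, 1.0, 10)
s = process_convolution(grid, knots)
```

Because the process is determined by a handful of latent variables rather than an n-by-n covariance matrix, inference scales to large data sets, which is the computational advantage the abstract refers to.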
Poisson process partition calculus with an application to Bayesian . . .
2005
Abstract

Cited by 32 (10 self)
This article develops, and describes how to use, results concerning disintegrations of Poisson random measures. These results are fashioned as simple tools that can be tailor-made to address inferential questions arising in a wide range of Bayesian nonparametric and spatial statistical models. The Poisson disintegration method is based on the formal statement of two results concerning a Laplace functional change of measure and a Poisson Palm/Fubini calculus in terms of random partitions of the integers {1,...,n}. The techniques are analogous to, but much more general than, techniques for the Dirichlet process and weighted gamma process developed in [Ann. Statist. 12
Shot noise Cox processes
Advances in Applied Probability 35, 2003
Abstract

Cited by 23 (6 self)
We introduce a new class of Cox cluster processes called generalised shot-noise Cox processes (GSNCPs), which extends the definition of shot noise Cox processes (SNCPs) in two directions: the point process that drives the shot noise is not necessarily Poisson, and the kernel of the shot noise can be random. This yields a very large class of models for aggregated or clustered point patterns. Owing to the structure of GSNCPs, a number of useful results can be established. We focus first on deriving summary statistics for GSNCPs and then on how to simulate them. In particular, we give results for first- and second-order moment measures, reduced Palm distributions, the J-function, simulation with or without edge effects, and conditional simulation of the intensity function driving a GSNCP. Our results are exemplified for important special cases of GSNCPs, and we discuss the relation to corresponding results for SNCPs.
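One simple SNCP instance, a Thomas-type cluster process, can be simulated directly as a Poisson cluster process: Poisson cluster centres, each emitting a Poisson number of Gaussian-displaced offspring. All parameter values are illustrative.

```python
import numpy as np

def simulate_sncp(kappa=10.0, mu=5.0, sigma=0.03, seed=2):
    """Simulate a Thomas-type shot-noise Cox process on the unit square:
    cluster centres form a Poisson(kappa) process, and each centre emits a
    Poisson(mu) number of offspring displaced by N(0, sigma^2 I)."""
    rng = np.random.default_rng(seed)
    n_centres = rng.poisson(kappa)
    centres = rng.random((n_centres, 2))
    points = []
    for c in centres:
        n_off = rng.poisson(mu)
        offspring = c + sigma * rng.standard_normal((n_off, 2))
        # keep only offspring falling inside the unit square
        inside = ((offspring >= 0.0) & (offspring <= 1.0)).all(axis=1)
        points.append(offspring[inside])
    return np.concatenate(points) if points else np.empty((0, 2))

pts = simulate_sncp()
```

Conditional on the centres, the offspring form a Poisson process whose intensity is a sum of Gaussian kernels, i.e. a shot-noise field; the GSNCPs of the paper relax both the Poisson centres and the fixed kernel.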
Computational Methods for Multiplicative Intensity Models using Weighted Gamma Processes: Proportional Hazards, Marked Point Processes and Panel Count Data
2004
Abstract

Cited by 17 (4 self)
We develop computational procedures for a class of Bayesian nonparametric and semiparametric multiplicative intensity models incorporating kernel mixtures of spatial weighted gamma measures. A key feature of our approach is that explicit expressions for posterior distributions of these models share many common structural features with the posterior distributions of Bayesian hierarchical models using the Dirichlet process. Using this fact, along with an approximation for the weighted gamma process, we show that with some care, one can adapt efficient algorithms used for the Dirichlet process to this setting. We discuss blocked Gibbs sampling procedures and Pólya urn Gibbs samplers. We illustrate our methods with applications to proportional hazard models, Poisson spatial regression models, recurrent events, and panel count data.
Bayesian mixture modeling for spatial Poisson process intensities, with applications to extreme value analysis
Dept, 2005
Abstract

Cited by 13 (3 self)
Abstract: We propose a method for the analysis of a spatial point pattern, which is assumed to arise as a set of observations from a spatial nonhomogeneous Poisson process. The spatial point pattern is observed in a bounded region, which, for most applications, is taken to be a rectangle in the space where the process is defined. The method is based on modeling a density function, defined on this bounded region, that is directly related to the intensity function of the Poisson process. We develop a flexible nonparametric mixture model for this density using a bivariate Beta distribution for the mixture kernel and a Dirichlet process prior for the mixing distribution. Using posterior simulation methods, we obtain full inference for the intensity function and any other functional of the process that might be of interest. We discuss applications to problems where inference for clustering in the spatial point pattern is of interest. Moreover, we consider applications of the methodology to extreme value analysis problems. We illustrate the modeling approach with three previously published data sets. Two of the data sets are from forestry and consist of locations of trees. The third data set consists of extremes from the Dow Jones index over a period of 1303 days.
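Simulating a nonhomogeneous Poisson process on a bounded rectangle, the data-generating assumption of this paper, is commonly done by thinning a homogeneous process. The intensity function below is a hypothetical example, not one from the paper.

```python
import numpy as np

def simulate_nhpp(intensity, lam_max, region=(1.0, 1.0), seed=3):
    """Simulate a nonhomogeneous Poisson process on an a-by-b rectangle by
    thinning: draw a homogeneous process of rate lam_max, then keep each
    point (x, y) independently with probability intensity(x, y) / lam_max."""
    rng = np.random.default_rng(seed)
    a, b = region
    n = rng.poisson(lam_max * a * b)
    pts = rng.random((n, 2)) * np.array([a, b])
    keep = rng.random(n) < intensity(pts[:, 0], pts[:, 1]) / lam_max
    return pts[keep]

# Illustrative intensity peaking at the centre of the unit square
# (bounded above by lam_max = 100, as thinning requires).
intensity = lambda x, y: 100.0 * np.exp(-((x - 0.5)**2 + (y - 0.5)**2) / 0.05)
pts = simulate_nhpp(intensity, lam_max=100.0)
```

The paper's mixture model effectively estimates such an intensity surface from an observed pattern, with the Beta mixture kernel respecting the bounded rectangular support.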
Likelihood-based inference for clustered line transect data
J. Agric. Biol. Environ. Stat., 2006
A Bayes method for a monotone hazard rate via S-paths
Ann. Statist., 2006
Abstract

Cited by 6 (1 self)
A class of random hazard rates, defined as a mixture of an indicator kernel convolved with a completely random measure, is of interest. We provide an explicit characterization of the posterior distribution of this mixture hazard rate model via a finite mixture of S-paths. A closed-form and tractable Bayes estimator for the hazard rate is derived as a finite sum over S-paths. The path characterization, or the estimator, is proved to be a Rao-Blackwellization of an existing partition characterization or partition-sum estimator. This accentuates the importance of S-paths in Bayesian modeling of monotone hazard rates. An efficient Markov chain Monte Carlo (MCMC) method is proposed to approximate this class of estimates. It is shown that the S-path characterization also exists in modeling with covariates via a proportional hazards model, and the proposed algorithm again applies. Numerical results are given to demonstrate the method's practicality and effectiveness.
Conjugate Gamma Markov random fields for modelling nonstationary sources
Abstract

Cited by 6 (4 self)
Abstract. In modelling nonstationary sources, one possible strategy is to define a latent process of strictly positive variables to model variations in the second-order statistics of the underlying process. This can be achieved, for example, by passing a Gaussian process through a positive nonlinearity or by defining a discrete-state Markov chain where each state encodes a certain regime. However, models with such constructs turn out to be either not very flexible or non-conjugate, making inference somewhat harder. In this paper, we introduce a conjugate (inverse) gamma Markov random field model that allows random fluctuations in variances, which are useful as priors for nonstationary time-frequency energy distributions. The main idea is to introduce auxiliary variables such that full conditional distributions and sufficient statistics are readily available as closed-form expressions. This allows straightforward implementation of a Gibbs sampler or a variational algorithm. We illustrate our approach on denoising and single-channel source separation.
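A rough sketch of a positively correlated chain of variances built through auxiliary variables, in the spirit of the construction described here; the parameterisation below is an assumption for illustration and need not match the paper's exact model.

```python
import numpy as np

def sample_variance_chain(T, a=3.0, seed=4):
    """Draw a chain of strictly positive variances v_1, ..., v_T by
    alternating gamma and inverse-gamma draws through auxiliary
    variables z_t. Larger a gives a smoother (more correlated) chain.
    NOTE: this parameterisation is illustrative only."""
    rng = np.random.default_rng(seed)
    v = np.empty(T)
    v[0] = 1.0 / rng.gamma(a, 1.0 / a)            # v_1 ~ InvGamma(a, a)
    for t in range(1, T):
        z = rng.gamma(a, v[t - 1] / a)            # z_t | v_{t-1} ~ Gamma(a, scale v_{t-1}/a)
        v[t] = 1.0 / rng.gamma(a, 1.0 / (a * z))  # v_t | z_t ~ InvGamma(a, a * z_t)
    return v

variances = sample_variance_chain(100)
```

Since E[z_t | v_{t-1}] = v_{t-1} and v_t concentrates around z_t, consecutive variances are positively coupled while every full conditional stays in the gamma/inverse-gamma family, which is the conjugacy property that makes Gibbs and variational updates closed-form.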