Results 1–10 of 56
Model-based Geostatistics
Applied Statistics, 1998
Cited by 128 (6 self)
Abstract
Conventional geostatistical methodology solves the problem of predicting the realised value of a linear functional of a Gaussian spatial stochastic process, S(x), based on observations Y_i = S(x_i) + Z_i at sampling locations x_i, where the Z_i are mutually independent, zero-mean Gaussian random variables. We describe two spatial applications for which Gaussian distributional assumptions are clearly inappropriate. The first concerns the assessment of residual contamination from nuclear weapons testing on a South Pacific island, in which the sampling method generates spatially indexed Poisson counts conditional on an unobserved spatially varying intensity of radioactivity; we conclude that a conventional geostatistical analysis oversmooths the data and underestimates the spatial extremes of the intensity. The second application provides a description of spatial variation in the risk of campylobacter infections relative to other enteric infections in part of North Lancashire and South C...
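The observation model Y_i = S(x_i) + Z_i can be sketched in a few lines. The 1-D setting, the exponential covariogram, and every parameter value below are illustrative assumptions, not taken from the paper; the last step is the standard simple-kriging predictor under this model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D illustration of the conventional geostatistical model:
# Y_i = S(x_i) + Z_i, with S a zero-mean Gaussian process (exponential
# covariogram) and Z_i i.i.d. zero-mean Gaussian "nugget" noise.
n = 50
x = np.sort(rng.uniform(0.0, 10.0, n))          # sampling locations x_i
sigma2, phi, tau2 = 1.0, 2.0, 0.1               # sill, range, nugget (assumed)

# Exponential covariogram: Cov(S(x), S(x')) = sigma2 * exp(-|x - x'| / phi)
C = sigma2 * np.exp(-np.abs(x[:, None] - x[None, :]) / phi)
S = rng.multivariate_normal(np.zeros(n), C)     # realisation of S at the x_i
Y = S + rng.normal(0.0, np.sqrt(tau2), n)       # noisy observations Y_i

# Simple kriging predictor of S at the observed locations (zero-mean case):
# E[S | Y] = C (C + tau2 I)^{-1} Y
S_hat = C @ np.linalg.solve(C + tau2 * np.eye(n), Y)
print(np.round(S_hat[:3], 3))
```

Under non-Gaussian sampling (e.g. the Poisson-count application), this linear predictor is exactly what the paper argues oversmooths the latent intensity.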
Generalized weighted Chinese restaurant processes for species sampling mixture models
Statistica Sinica, 2003
Cited by 59 (8 self)
Abstract
Abstract: The class of species sampling mixture models is introduced as an extension of semiparametric models based on the Dirichlet process to models based on the general class of species sampling priors, or equivalently the class of all exchangeable urn distributions. Using Fubini calculus in conjunction with Pitman (1995, 1996), we derive characterizations of the posterior distribution in terms of a posterior partition distribution that extend the results of Lo (1984) for the Dirichlet process. These results provide a better understanding of the models and have both theoretical and practical applications. To facilitate the use of our models, we generalize the work in Brunner, Chan, James and Lo (2001) by extending their weighted Chinese restaurant (WCR) Monte Carlo procedure, an i.i.d. sequential importance sampling (SIS) procedure for approximating posterior mean functionals based on the Dirichlet process, to the approximation of mean functionals, and additionally their posterior laws, in species sampling mixture models. We also discuss collapsed Gibbs sampling, Pólya urn Gibbs sampling and a Pólya urn SIS scheme. Our framework allows for numerous applications, including multiplicative counting process models subject to weighted gamma processes, as well as nonparametric and semiparametric hierarchical models based on the Dirichlet process, its two-parameter extension, the Pitman-Yor process, and finite dimensional Dirichlet priors. Key words and phrases: Dirichlet process, exchangeable partition, finite dimensional Dirichlet prior, two-parameter Poisson-Dirichlet process, prediction rule, random probability measure, species sampling sequence.
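As a minimal illustration of the urn scheme underlying such restaurant-process procedures, the sketch below draws a random partition from the ordinary Chinese restaurant process, i.e. the Dirichlet-process special case; species sampling priors generalise these seating probabilities, and weighted variants reweight them by the likelihood. The function name and concentration value are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def crp_partition(n, theta):
    """Draw a random partition of {1,...,n} from the Chinese restaurant
    process with concentration theta (Dirichlet-process case)."""
    tables = []                      # tables[k] = number of customers at table k
    labels = []                      # labels[i] = table index of customer i
    for i in range(n):
        # customer i+1 joins existing table k w.p. tables[k] / (i + theta),
        # or opens a new table w.p. theta / (i + theta)
        probs = np.array(tables + [theta], dtype=float) / (i + theta)
        k = rng.choice(len(probs), p=probs)
        if k == len(tables):
            tables.append(1)         # new table
        else:
            tables[k] += 1           # join existing table
        labels.append(k)
    return labels, tables

labels, tables = crp_partition(100, theta=2.0)
print(len(tables), sum(tables))
```

In an SIS or Gibbs scheme, each seating step would additionally carry an importance weight or a conditional likelihood term; only the prior seating rule is shown here.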
Poisson process partition calculus with an application to Bayesian . . .
2005
Cited by 42 (10 self)
Abstract
This article develops, and describes how to use, results concerning disintegrations of Poisson random measures. These results are fashioned as simple tools that can be tailor-made to address inferential questions arising in a wide range of Bayesian nonparametric and spatial statistical models. The Poisson disintegration method is based on the formal statement of two results concerning a Laplace functional change of measure and a Poisson Palm/Fubini calculus in terms of random partitions of the integers {1,...,n}. The techniques are analogous to, but much more general than, techniques for the Dirichlet process and weighted gamma process developed in [Ann. Statist. 12
Space and Space-Time Modeling Using Process Convolutions
Cited by 41 (4 self)
Abstract
A continuous spatial model can be constructed by convolving a very simple, perhaps independent, process with a kernel or point spread function. This approach for constructing a spatial process offers a number of advantages over specification through a spatial covariogram. In particular, this process convolution specification leads to computational simplifications and easily extends beyond simple stationary models. This paper uses process convolution models to build space and space-time models that are flexible and able to accommodate large amounts of data. Data from environmental monitoring are considered.

1 Introduction

Modeling spatial data with Gaussian processes is the common thread of all geostatistical analyses. Some notable references in this area include Matheron (1963), Journel and Huijbregts (1978), Ripley (1981), Cressie (1991), Wackernagel (1995), and Stein (1999). A common approach is to model spatial dependence through the covariogram c(), so that covariance between any t...
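The convolution construction described above can be sketched directly: place independent latent variables at a coarse set of knots and smooth them with a kernel. The 1-D grid, Gaussian kernel, and all values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Process-convolution sketch on a 1-D domain (assumed details): latent
# independent N(0,1) variables u_j at knot locations w_j, convolved with a
# Gaussian kernel k to give z(s) = sum_j k(s - w_j) u_j.
knots = np.linspace(0.0, 10.0, 12)          # knot locations w_j
u = rng.normal(size=knots.size)             # simple, independent latent process
bandwidth = 1.0                             # kernel scale (assumed)

def z(s):
    """Evaluate the convolved process at locations s."""
    s = np.atleast_1d(s)
    K = np.exp(-0.5 * ((s[:, None] - knots[None, :]) / bandwidth) ** 2)
    return K @ u                            # smooth field from few latent values

grid = np.linspace(0.0, 10.0, 200)
field = z(grid)
print(field.shape)
```

The computational advantage is visible here: the field is evaluated anywhere from only 12 latent variables, with no large covariance matrix to factor, and nonstationarity can be introduced by letting the kernel vary with s.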
Generalised Shot Noise Cox Processes
Advances in Applied Probability 35, 2003
Cited by 25 (7 self)
Abstract
We introduce a new class of Cox cluster processes called generalised shot noise Cox processes (GSNCPs), which extends the definition of shot noise Cox processes (SNCPs) in two directions: the point process which drives the shot noise is not necessarily Poisson, and the kernel of the shot noise can be random. Thereby a very large class of models for aggregated or clustered point patterns is obtained. Due to the structure of GSNCPs, a number of useful results can be established. We focus first on deriving summary statistics for GSNCPs and then on how to simulate them. In particular, results for first and second order moment measures, reduced Palm distributions, the J-function, simulation with or without edge effects, and conditional simulation of the intensity function driving a GSNCP are given. Our results are exemplified for important special cases of GSNCPs, and we discuss the relation to corresponding results for SNCPs.
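A minimal sketch of the Poisson-driven SNCP special case follows (GSNCPs would replace the Poisson parent process and allow random kernels, which is not attempted here). All parameter values are invented; the simulation uses the standard cluster representation in which each parent independently produces a Poisson number of kernel-displaced offspring.

```python
import numpy as np

rng = np.random.default_rng(3)

# Shot noise Cox process sketch on [0,1]^2 with assumed parameters:
# parents from a homogeneous Poisson process, driving intensity
# lambda(s) = gamma * sum_j k_omega(s - c_j) with Gaussian kernels k_omega.
kappa, gamma, omega = 10.0, 20.0, 0.05
n_parents = rng.poisson(kappa)
parents = rng.uniform(0.0, 1.0, size=(n_parents, 2))

def intensity(s):
    """Driving random intensity lambda(s) at locations s (shape (m, 2))."""
    d2 = ((s[None, :, :] - parents[:, None, :]) ** 2).sum(-1)
    return gamma * np.exp(-0.5 * d2 / omega**2).sum(0) / (2 * np.pi * omega**2)

# Cluster representation: each parent gets Poisson(gamma) offspring with
# Gaussian N(0, omega^2 I) displacements (edge effects ignored here).
points = []
for c in parents:
    m = rng.poisson(gamma)
    points.append(c + rng.normal(0.0, omega, size=(m, 2)))
points = np.concatenate(points) if points else np.empty((0, 2))
print(len(points))
```

Simulating with proper edge-effect handling, or conditionally on data, requires the machinery the paper develops; this sketch only shows the unconditional mechanism.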
Combining Incompatible Spatial Data
Journal of the American Statistical Association, June 2002
Cited by 23 (3 self)
Abstract
data from a variety of sources. New advances in satellite imagery and remote sensing now permit scientists to access spatial data at several different resolutions. The Internet facilitates fast and easy data acquisition. In any one study, several different types of data may be collected at differing scales and resolutions, at different spatial locations, and in different dimensions. Many statistical issues are associated with combining such data for modeling and inference. This article gives an overview of these issues and the approaches for integrating such disparate data, drawing on work from geography, ecology, agriculture, geology, and statistics. Emphasis is on state-of-the-art statistical solutions to this complex and important problem. KEY WORDS: Change of support; Data assimilation; Ecological inference; Modifiable areal unit problem; Multiscale processes; Spatially misaligned data.
Computational Methods for Multiplicative Intensity Models Using Weighted Gamma Processes: Proportional Hazards, Marked Point Processes and Panel Count Data
2004
Cited by 22 (4 self)
Abstract
We develop computational procedures for a class of Bayesian nonparametric and semiparametric multiplicative intensity models incorporating kernel mixtures of spatial weighted gamma measures. A key feature of our approach is that explicit expressions for posterior distributions of these models share many common structural features with the posterior distributions of Bayesian hierarchical models using the Dirichlet process. Using this fact, along with an approximation for the weighted gamma process, we show that with some care, one can adapt efficient algorithms used for the Dirichlet process to this setting. We discuss blocked Gibbs sampling procedures and Pólya urn Gibbs samplers. We illustrate our methods with applications to proportional hazard models, Poisson spatial regression models, recurrent events, and panel count data.
Bayesian mixture modeling for spatial Poisson process intensities, with applications to extreme value analysis
Dept, 2005
Cited by 14 (4 self)
Abstract
Abstract: We propose a method for the analysis of a spatial point pattern, which is assumed to arise as a set of observations from a spatial nonhomogeneous Poisson process. The spatial point pattern is observed in a bounded region, which, for most applications, is taken to be a rectangle in the space where the process is defined. The method is based on modeling a density function, defined on this bounded region, that is directly related to the intensity function of the Poisson process. We develop a flexible nonparametric mixture model for this density using a bivariate Beta distribution for the mixture kernel and a Dirichlet process prior for the mixing distribution. Using posterior simulation methods, we obtain full inference for the intensity function and any other functional of the process that might be of interest. We discuss applications to problems where inference for clustering in the spatial point pattern is of interest. Moreover, we consider applications of the methodology to extreme value analysis problems. We illustrate the modeling approach with three previously published data sets. Two of the data sets are from forestry and consist of locations of trees. The third data set consists of extremes from the Dow Jones index over a period of 1303 days.
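Given any intensity surface of the kind this model infers, a nonhomogeneous Poisson point pattern on the unit square can be simulated by thinning a dominating homogeneous process. The single product-of-Beta(2,2) bump below merely stands in for the paper's Dirichlet process mixture of bivariate Beta kernels; the total mass and shape are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

lam_total = 200.0                           # assumed total intensity mass

def lam(xy):
    """Illustrative intensity: lam_total times a product of Beta(2,2)
    densities, a single unimodal bump on the unit square."""
    x, y = xy[:, 0], xy[:, 1]
    return lam_total * (6.0 * x * (1.0 - x)) * (6.0 * y * (1.0 - y))

# Thinning: dominate lambda by its maximum (attained at (0.5, 0.5)),
# simulate a homogeneous Poisson process at that rate, keep each
# candidate with probability lambda(s) / lam_max.
lam_max = lam_total * 1.5 * 1.5
n = rng.poisson(lam_max)
cand = rng.uniform(size=(n, 2))
keep = rng.uniform(size=n) < lam(cand) / lam_max
points = cand[keep]
print(len(points))
```

The retained points form an exact draw from the nonhomogeneous process with intensity lam; inference would run this logic in reverse, recovering lam from an observed pattern.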
A Bayes method for a monotone hazard rate via S-paths
Ann. Statist., 2006
Cited by 9 (2 self)
Abstract
A class of random hazard rates, defined as a mixture of an indicator kernel convolved with a completely random measure, is of interest. We provide an explicit characterization of the posterior distribution of this mixture hazard rate model via a finite mixture of S-paths. A closed and tractable Bayes estimator for the hazard rate is derived as a finite sum over S-paths. The path characterization or the estimator is proved to be a Rao-Blackwellization of an existing partition characterization or partition-sum estimator. This accentuates the importance of S-paths in Bayesian modeling of monotone hazard rates. An efficient Markov chain Monte Carlo (MCMC) method is proposed to approximate this class of estimates. It is shown that the S-path characterization also exists in modeling with covariates via a proportional hazard model, and the proposed algorithm again applies. Numerical results of the method are given to demonstrate its practicality and effectiveness.
Nonparametric models for proteomic peak identification and quantification
Bayesian Inference for Gene Expression and Proteomics, 2006
Cited by 8 (2 self)
Abstract
We present model-based inference for proteomic peak identification and quantification from mass spectroscopy data, focusing on nonparametric Bayesian models. Using experimental data generated from MALDI-TOF mass spectroscopy (Matrix-Assisted Laser Desorption Ionization Time-of-Flight), we model observed intensities in spectra with a hierarchical nonparametric model for expected intensity as a function of time-of-flight. We express the unknown intensity function as a sum of kernel functions, a natural choice of basis functions for modelling spectral peaks. We discuss how to place prior distributions on the unknown functions using Lévy random fields and describe posterior inference via a reversible jump Markov chain Monte Carlo algorithm.
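The sum-of-kernels representation of the expected spectrum can be sketched as follows. The Gaussian kernel shape, the time axis, and the three peak locations/heights/widths are invented for illustration; in the paper these quantities carry a Lévy random field prior and are sampled by reversible jump MCMC rather than fixed.

```python
import numpy as np

# Expected intensity as a sum of kernel functions,
# f(t) = sum_j a_j * k((t - tau_j) / s_j), with Gaussian kernels
# standing in for peak shapes (assumed, not the paper's exact kernel).
t = np.linspace(0.0, 100.0, 1000)           # time-of-flight axis (arbitrary units)
peaks = [(20.0, 5.0, 1.5),                  # (location tau_j, height a_j, width s_j)
         (45.0, 3.0, 2.0),
         (70.0, 8.0, 1.0)]                  # invented peak parameters

def expected_intensity(t, peaks):
    """Superpose one Gaussian bump per peak on the time axis."""
    f = np.zeros_like(t)
    for tau, a, s in peaks:
        f += a * np.exp(-0.5 * ((t - tau) / s) ** 2)
    return f

f = expected_intensity(t, peaks)
print(float(f.max()))
```

Peak identification then amounts to inferring the number of terms and their (tau, a, s) from noisy observations of f, which is what the reversible jump sampler explores.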