Results 1–10 of 25
Modelling heterogeneity with and without the Dirichlet process
, 2001
"... We investigate the relationships between Dirichlet process (DP) based models and allocation models for a variable number of components, based on exchangeable distributions. It is shown that the DP partition distribution is a limiting case of a Dirichlet± multinomial allocation model. Comparisons of ..."
Abstract

Cited by 68 (3 self)
 Add to MetaCart
We investigate the relationships between Dirichlet process (DP) based models and allocation models for a variable number of components, based on exchangeable distributions. It is shown that the DP partition distribution is a limiting case of a Dirichlet–multinomial allocation model. Comparisons of posterior performance of DP and allocation models are made in the Bayesian paradigm and illustrated in the context of univariate mixture models. It is shown in particular that the unbalancedness of the allocation distribution, present in the prior DP model, persists a posteriori. Exploiting the model connections, a new MCMC sampler for general DP-based models is introduced, which uses split/merge moves in a reversible jump framework. Performance of this new sampler relative to that of some traditional samplers for DP models is then explored.
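The DP partition distribution described in this abstract can be simulated sequentially via the Chinese restaurant process; a minimal Python sketch of that prior (illustrative only, not the split/merge sampler the paper proposes):

```python
import random

def crp_partition(n, alpha, seed=0):
    """Sample cluster labels for n items from the Chinese restaurant
    process, the sequential form of the DP partition distribution.

    Item i joins an existing cluster c with probability |c| / (i + alpha),
    or starts a new cluster with probability alpha / (i + alpha).
    """
    rng = random.Random(seed)
    sizes = []    # current cluster sizes
    labels = []
    for i in range(n):
        weights = sizes + [alpha]          # existing clusters, then "new table"
        c = rng.choices(range(len(weights)), weights=weights)[0]
        if c == len(sizes):
            sizes.append(1)                # open a new cluster
        else:
            sizes[c] += 1
        labels.append(c)
    return labels
```

Larger `alpha` tends to produce more clusters, reproducing the unbalanced allocation behaviour the abstract refers to.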
Transdimensional Markov chain Monte Carlo
 in Highly Structured Stochastic Systems
, 2003
"... In the context of samplebased computation of Bayesian posterior distributions in complex stochastic systems, this chapter discusses some of the uses for a Markov chain with a prescribed invariant distribution whose support is a union of euclidean spaces of differing dimensions. This leads into a re ..."
Abstract

Cited by 56 (0 self)
 Add to MetaCart
In the context of sample-based computation of Bayesian posterior distributions in complex stochastic systems, this chapter discusses some of the uses for a Markov chain with a prescribed invariant distribution whose support is a union of Euclidean spaces of differing dimensions. This leads into a reformulation of the reversible jump MCMC framework for constructing such ‘transdimensional’ Markov chains. This framework is compared to alternative approaches for the same task, including methods that involve separate sampling within different fixed-dimension models. We consider some of the difficulties researchers have encountered in obtaining adequate performance with some of these methods, attributing some of these to misunderstandings, and offer tentative recommendations about algorithm choice for various classes of problem. The chapter concludes with a look towards desirable future developments.
Spatial Poisson Regression for Health and Exposure Data Measured at Disparate Resolutions
 Journal of the American Statistical Association
, 2000
"... This paper presents a spatial regression analysis of the effect of traffic pollution on respiratory disorders in children. The analysis features data measured at disparate, nonnested scales, including spatially varying covariates, latent spatially varying risk factors, and casespecific individual ..."
Abstract

Cited by 38 (9 self)
 Add to MetaCart
This paper presents a spatial regression analysis of the effect of traffic pollution on respiratory disorders in children. The analysis features data measured at disparate, non-nested scales, including spatially varying covariates, latent spatially varying risk factors, and case-specific individual attributes.
MCMC Methods for Computing Bayes Factors: A Comparative Review
 Journal of the American Statistical Association
, 2000
"... this paper we review several of these methods, and subsequently compare them in the context of two examples, the first a simple regression example, and the second a much more challenging hierarchical longitudinal model of the kind often encountered in biostatistical practice. We find that the joint ..."
Abstract

Cited by 31 (1 self)
 Add to MetaCart
In this paper we review several of these methods, and subsequently compare them in the context of two examples: the first a simple regression example, and the second a much more challenging hierarchical longitudinal model of the kind often encountered in biostatistical practice. We find that the joint model-parameter space search methods perform adequately but can be difficult to program and tune, while the marginal likelihood methods are often less troublesome and require less additional coding. Our results suggest that the latter methods may be most appropriate for practitioners working in many standard model choice settings, while the former remain important for comparing large numbers of models, or models whose parameters cannot be easily updated in relatively few blocks. We caution, however, that all of the methods we compare require significant human and computer effort, suggesting that less formal Bayesian model choice methods may offer a more realistic alternative in many cases.
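As a toy illustration of the marginal-likelihood route to Bayes factors, the sketch below estimates a marginal likelihood by simple Monte Carlo averaging over prior draws for a conjugate normal model, where a closed form exists for checking; this is the crudest such estimator, not one of the methods the paper reviews:

```python
import numpy as np

def log_marglik_m1_exact(y):
    """Closed-form log marginal likelihood for M1: y_i ~ N(mu, 1), mu ~ N(0, 1)."""
    n, s = len(y), y.sum()
    return (-0.5 * n * np.log(2 * np.pi) - 0.5 * np.log(n + 1)
            - 0.5 * (np.sum(y**2) - s**2 / (n + 1)))

def log_marglik_m1_mc(y, draws=100_000, seed=0):
    """Monte Carlo estimate: average the likelihood over draws from the prior."""
    rng = np.random.default_rng(seed)
    mu = rng.normal(0.0, 1.0, size=draws)
    # log-likelihood of the full data set at each prior draw of mu
    ll = (-0.5 * len(y) * np.log(2 * np.pi)
          - 0.5 * ((y[None, :] - mu[:, None]) ** 2).sum(axis=1))
    m = ll.max()
    return m + np.log(np.exp(ll - m).mean())   # stable log-mean-exp

# Toy data and the Bayes factor of M1 against M0: mu fixed at 0
y = np.random.default_rng(1).normal(0.3, 1.0, size=10)
log_m0 = -0.5 * len(y) * np.log(2 * np.pi) - 0.5 * np.sum(y**2)
log_bf10 = log_marglik_m1_exact(y) - log_m0
```

Prior-draw averaging is only workable when prior and posterior overlap, which is why the reviewed methods exist for realistic models.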
Bayesian Model Assessment and Comparison Using Cross-Validation Predictive Densities
 Neural Computation
, 2002
"... In this work, we discuss practical methods for the assessment, comparison, and selection of complex hierarchical Bayesian models. A natural way to assess the goodness of the model is to estimate its future predictive capability by estimating expected utilities. Instead of just making a point estimat ..."
Abstract

Cited by 26 (10 self)
 Add to MetaCart
In this work, we discuss practical methods for the assessment, comparison, and selection of complex hierarchical Bayesian models. A natural way to assess the goodness of a model is to estimate its future predictive capability by estimating expected utilities. Instead of just making a point estimate, it is important to obtain the distribution of the expected utility estimate, as it describes the uncertainty in the estimate. The distributions of the expected utility estimates can also be used to compare models, for example, by computing the probability of one model having a better expected utility than some other model. We propose an approach using cross-validation predictive densities to obtain expected utility estimates and the Bayesian bootstrap to obtain samples from their distributions. We also discuss the probabilistic assumptions made and properties of two practical cross-validation methods, importance sampling and k-fold cross-validation. As illustrative examples, we use MLP neural networks and Gaussian processes (GP) with Markov chain Monte Carlo sampling in one toy problem and two challenging real-world problems.
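A minimal sketch of the two ingredients named here, using a toy normal model with a plug-in predictive density (the paper uses full Bayesian predictive densities) and the Bayesian bootstrap for the distribution of the expected utility estimate:

```python
import numpy as np

def kfold_log_pred(y, k=10, seed=0):
    """Pointwise held-out log predictive densities via k-fold CV for a toy
    model y_i ~ N(mu, sigma^2), with mu and sigma fit on the training folds
    (plug-in predictive; a simplification of the paper's approach)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    out = np.empty(len(y))
    for test in np.array_split(idx, k):
        train = np.setdiff1d(idx, test)
        mu, sig = y[train].mean(), y[train].std(ddof=1)
        out[test] = (-0.5 * np.log(2 * np.pi * sig**2)
                     - (y[test] - mu) ** 2 / (2 * sig**2))
    return out

def bayesian_bootstrap(values, draws=2000, seed=0):
    """Samples from the distribution of the mean expected utility via the
    Bayesian bootstrap: Dirichlet(1,...,1)-weighted averages."""
    rng = np.random.default_rng(seed)
    w = rng.dirichlet(np.ones(len(values)), size=draws)
    return w @ values
```

The spread of the `bayesian_bootstrap` samples is what lets two models be compared probabilistically rather than by point estimates alone.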
Bayesian Modelling of Inseparable Space-Time Variation in Disease Risk
, 1998
"... This paper proposes a unified framework for the analysis of incidence or mortality data in space and time. The problem with such analysis is that the number of cases and the corresponding population at risk in any single unit of space \Theta time are too small to produce a reliable estimate of the u ..."
Abstract

Cited by 21 (2 self)
 Add to MetaCart
This paper proposes a unified framework for the analysis of incidence or mortality data in space and time. The problem with such analysis is that the number of cases and the corresponding population at risk in any single unit of space × time are too small to produce a reliable estimate of the underlying disease risk without "borrowing strength" from neighbouring cells. The goal here could be described as one of smoothing, in which both spatial and nonspatial considerations may arise, and spatio-temporal interactions may become an important feature. Based on an extended version of the main effects model proposed in Knorr-Held and Besag (1998), four generic types of space × time interactions are introduced. Each type implies a certain degree of prior (in)dependence for interaction parameters, and corresponds to the product of one of the two spatial main effects with one of the two temporal main effects. Data analysis is implemented via Markov chain Monte Carlo methods. The methodology is illustrated by an analysis of Ohio lung cancer data 1968–88. We compare the fit and the complexity of each model by the DIC criterion, recently proposed in Spiegelhalter et al. (1998).
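In matrix form, the four interaction types correspond to Kronecker products of the temporal and spatial main-effect structure matrices; a toy numpy sketch (the order of the Kronecker factors depends on how the interaction vector is stacked):

```python
import numpy as np

def rw1_structure(T):
    """Structure (scaled precision) matrix of a first-order random walk."""
    D = np.diff(np.eye(T), axis=0)          # (T-1, T) first-difference matrix
    return D.T @ D

def icar_structure(adj):
    """Structure matrix of an intrinsic CAR spatial model: diag(degrees) - A."""
    A = np.asarray(adj, dtype=float)
    return np.diag(A.sum(axis=1)) - A

# Toy setting: T = 4 time periods, 3 areas on a line
Rt = rw1_structure(4)
Rs = icar_structure([[0, 1, 0], [1, 0, 1], [0, 1, 0]])
It, Is = np.eye(4), np.eye(3)

# The four generic interaction types as Kronecker products
type_I   = np.kron(It, Is)   # unstructured in both space and time
type_II  = np.kron(Rt, Is)   # temporally structured, spatially unstructured
type_III = np.kron(It, Rs)   # spatially structured, temporally unstructured
type_IV  = np.kron(Rt, Rs)   # structured in both: inseparable interaction
```

The rank deficiency of each product (e.g. rank (T-1)(S-1) for type IV) encodes the prior (in)dependence constraints mentioned in the abstract.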
Adaptive Bayesian Regression Splines in Semiparametric Generalized Linear Models
 Journal of Computational and Graphical Statistics
, 1998
"... This paper presents a fully Bayesian approach to regression splines with automatic knot selection in generalized semiparametric models for fundamentally nonGaussian responses. In a basis function representation of the regression spline we use a Bspline basis. The reversible jump Markov chain Mon ..."
Abstract

Cited by 20 (2 self)
 Add to MetaCart
This paper presents a fully Bayesian approach to regression splines with automatic knot selection in generalized semiparametric models for fundamentally non-Gaussian responses. In a basis function representation of the regression spline we use a B-spline basis. The reversible jump Markov chain Monte Carlo method allows for simultaneous estimation of both the number of knots and the knot placement, together with the unknown basis coefficients determining the shape of the spline. Since the spline can be represented as a design matrix times unknown (basis) coefficients, it is straightforward to additionally include a vector of covariates with fixed effects, yielding a semiparametric model. The method is illustrated with data sets from the literature for curve estimation in generalized linear models, the Tokyo rainfall data and the coal mining disaster data, and by a credit-scoring problem for generalized semiparametric models. Keywords: B-spline basis; knot selection; non-normal response...
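The B-spline design matrix that such a basis representation relies on can be computed with the Cox-de Boor recursion; a self-contained numpy sketch (not the paper's implementation):

```python
import numpy as np

def bspline_basis(x, knots, degree=3):
    """B-spline design matrix via the Cox-de Boor recursion.

    `knots` must include the repeated boundary knots; returns B with
    B[i, j] = B_j(x[i]). Points equal to the last knot fall outside the
    half-open support intervals and evaluate to zero.
    """
    x = np.asarray(x, dtype=float)
    t = np.asarray(knots, dtype=float)
    # degree 0: indicator functions of the knot intervals
    B = np.array([(t[j] <= x) & (x < t[j + 1]) for j in range(len(t) - 1)],
                 dtype=float).T
    for d in range(1, degree + 1):
        Bn = np.zeros((len(x), B.shape[1] - 1))
        for j in range(B.shape[1] - 1):
            # the 0/0 = 0 convention handles repeated knots
            d1, d2 = t[j + d] - t[j], t[j + d + 1] - t[j + 1]
            left = (x - t[j]) / d1 * B[:, j] if d1 > 0 else 0.0
            right = (t[j + d + 1] - x) / d2 * B[:, j + 1] if d2 > 0 else 0.0
            Bn[:, j] = left + right
        B = Bn
    return B
```

Inside the knot range the columns sum to one (partition of unity), which is a quick correctness check for any B-spline implementation.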
Bayesian Spatio-Temporal Inference in Functional Magnetic Resonance Imaging
, 2001
"... this article is to present hierarchical Bayesian approaches that allow to simultaneously incorporate temporal and spatial dependencies between pixels directly in the model formulation. For reasons of computational feasibility, models have to be comparatively parsimonious, without oversimplifying. We ..."
Abstract

Cited by 20 (2 self)
 Add to MetaCart
This article presents hierarchical Bayesian approaches that allow temporal and spatial dependencies between pixels to be incorporated simultaneously, directly in the model formulation. For reasons of computational feasibility, models have to be comparatively parsimonious, without oversimplifying. We introduce parametric and semiparametric spatial and spatio-temporal models that proved appropriate and illustrate their performance by application to fMRI data from a visual stimulation experiment.
Function Estimation With Locally Adaptive Dynamic Models
 Computational Statistics
, 1998
"... this paper, we present a Bayesian nonparametric approach, which is more closely related to spline fitting with locally adaptive penalties. Abramovich and Steinberg (1996) generalize the common penalized least squares criterion for smoothing splines with a global smoothing parameter by introducing a ..."
Abstract

Cited by 11 (8 self)
 Add to MetaCart
In this paper, we present a Bayesian nonparametric approach, which is more closely related to spline fitting with locally adaptive penalties. Abramovich and Steinberg (1996) generalize the common penalized least squares criterion for smoothing splines with a global smoothing parameter by introducing a variable smoothing parameter into the roughness penalty. For estimation, they propose a two-step procedure: first a smoothing spline is fitted with a constant smoothing parameter chosen by generalized cross-validation. Then an estimate for the variable smoothing parameter is constructed, based on the derivatives of this pilot estimate, and is plugged into their locally adaptive penalty to fit the smoothing spline in a second step. Ruppert and Carroll (2000) propose P-splines based on a truncated power series basis and difference penalties on the regression coefficients with locally adaptive smoothing parameters. The latter are obtained by linear interpolation from a smaller number of smoothing parameters, defined for a subset of knots and estimated by generalized cross-validation.
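The difference-penalty construction behind Ruppert and Carroll's P-splines can be sketched directly; the numpy fragment below builds the penalty matrix for a global or locally adaptive smoothing parameter (an illustration of the penalty, not their estimation procedure):

```python
import numpy as np

def diff_penalty(n_coef, order=2, lam=1.0):
    """P-spline roughness penalty D' diag(lam) D: a lam-weighted sum of
    squared order-th differences of adjacent basis coefficients.

    A scalar `lam` gives a global smoothing parameter; a vector of length
    n_coef - order gives one local smoothing parameter per difference.
    """
    D = np.diff(np.eye(n_coef), n=order, axis=0)   # (n_coef-order, n_coef)
    lam = np.broadcast_to(np.asarray(lam, dtype=float), (D.shape[0],))
    return D.T @ (lam[:, None] * D)

def penalized_fit(B, y, penalty):
    """Penalized least squares: solve (B'B + P) beta = B'y."""
    return np.linalg.solve(B.T @ B + penalty, B.T @ y)
```

With `lam = 0` the fit reduces to ordinary least squares; growing entries of `lam` shrink the corresponding local differences toward a polynomial of degree `order - 1`.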
Penalized loss functions for Bayesian model comparison
"... The deviance information criterion (DIC) is widely used for Bayesian model comparison, despite the lack of a clear theoretical foundation. DIC is shown to be an approximation to a penalized loss function based on the deviance, with a penalty derived from a crossvalidation argument. This approximati ..."
Abstract

Cited by 10 (0 self)
 Add to MetaCart
The deviance information criterion (DIC) is widely used for Bayesian model comparison, despite the lack of a clear theoretical foundation. DIC is shown to be an approximation to a penalized loss function based on the deviance, with a penalty derived from a cross-validation argument. This approximation is valid only when the effective number of parameters in the model is much smaller than the number of independent observations. In disease mapping, a typical application of DIC, this assumption does not hold and DIC under-penalizes more complex models. Another deviance-based loss function, derived from the same decision-theoretic framework, is applied to mixture models, which have previously been considered an unsuitable application for DIC.
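The DIC quantities referred to here are straightforward to compute from MCMC output; a toy sketch for a normal mean model, where the effective number of parameters pD should come out near 1:

```python
import numpy as np

def normal_deviance(y, mu, sigma=1.0):
    """Deviance D(mu) = -2 log p(y | mu) for y_i ~ Normal(mu, sigma^2)."""
    return float(np.sum(np.log(2 * np.pi * sigma**2) + (y - mu) ** 2 / sigma**2))

def dic(y, mu_samples, sigma=1.0):
    """DIC = Dbar + pD, where Dbar is the posterior mean deviance and
    pD = Dbar - D(posterior mean) is the effective number of parameters."""
    devs = np.array([normal_deviance(y, m, sigma) for m in mu_samples])
    dbar = devs.mean()
    p_d = dbar - normal_deviance(y, np.mean(mu_samples), sigma)
    return dbar + p_d, p_d
```

When pD is not small relative to the number of observations, as in the disease-mapping setting above, this penalty is exactly the term the paper argues is too weak.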