Results 1–10 of 29
Simulating Normalizing Constants: From Importance Sampling to Bridge Sampling to Path Sampling
 Statistical Science, 13, 163–185.
, 1998
Cited by 146 (4 self)
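The importance sampling route to normalizing constants that this literature builds on can be sketched in a few lines. A minimal illustration, assuming a one-dimensional unnormalized Gaussian target and a wider Gaussian proposal (both invented here, not taken from the paper):

```python
import math
import random

random.seed(0)

def unnormalized_target(x):
    # Unnormalized Gaussian density; its true normalizing constant is sqrt(2*pi).
    return math.exp(-0.5 * x * x)

def proposal_density(x, s=2.0):
    # Wider Gaussian N(0, s^2) so the proposal dominates the target's tails.
    return math.exp(-0.5 * (x / s) ** 2) / (s * math.sqrt(2.0 * math.pi))

def estimate_z(n=100_000, s=2.0):
    # Z-hat = (1/n) * sum_i q(x_i) / g(x_i), with x_i drawn from g.
    total = 0.0
    for _ in range(n):
        x = random.gauss(0.0, s)
        total += unnormalized_target(x) / proposal_density(x, s)
    return total / n

z_hat = estimate_z()  # should be close to sqrt(2*pi) ≈ 2.5066
```

Bridge and path sampling, the subject of the paper, generalize this one-proposal estimator to chains of intermediate distributions; the sketch above is only the starting point of that progression.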
Bayesian Calibration of Computer Models
 Journal of the Royal Statistical Society, Series B, Methodological
, 2000
Abstract

Cited by 62 (1 self)
This paper presents a Bayesian approach to the calibration of computer models. We represent the unknown inputs as a parameter vector θ. Using the observed data we derive the posterior distribution of θ, which in particular quantifies the 'residual uncertainty' about ...
Some Adaptive Monte Carlo Methods for Bayesian Inference
 Statistics in Medicine
Abstract

Cited by 24 (3 self)
This paper outlines some of the issues in developing adaptive methods and presents some preliminary results.
Computing Nonparametric Hierarchical Models
, 1998
Abstract

Cited by 22 (2 self)
Bayesian models involving Dirichlet process mixtures are at the heart of the modern nonparametric Bayesian movement. Much of the rapid development of these models in the last decade has been a direct result of advances in simulation-based computational methods. Some of the very early work in this area, circa 1988–1991, focused on the use of such nonparametric ideas and models in applications of otherwise standard hierarchical models. This chapter provides some historical review and perspective on these developments, with a prime focus on the use and integration of such nonparametric ideas in hierarchical models. We illustrate the ease with which the strict parametric assumptions common to most standard Bayesian hierarchical models can be relaxed to incorporate uncertainties about functional forms using Dirichlet process components, partly enabled by the approach to computation using MCMC methods. The resulting methodology is illustrated with two examples taken from an unpublished ...
Subregion-Adaptive Integration of Functions Having a Dominant Peak
 Journal of Computational and Graphical Statistics
, 1993
Abstract

Cited by 20 (5 self)
Many statistical multiple integration problems involve integrands that have a dominant peak. In applying numerical methods to solve these problems, statisticians have paid relatively little attention to existing quadrature methods and available software developed in the numerical analysis literature. One reason these methods have been largely overlooked, even though they are known to be more efficient than Monte Carlo for well-behaved problems of low dimensionality, may be that when applied naively they are poorly suited for peaked-integrand problems. In this paper we use transformations based on "split-t" distributions to allow the integrals to be efficiently computed using a subregion-adaptive numerical integration algorithm. Our split-t distributions are modifications of those suggested by Geweke (1989) and may also be used to define Monte Carlo importance functions. We then compare our approach to Monte Carlo. In the several examples we examine here, we find subregion-adaptive integration ...
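The subregion-adaptive idea, repeatedly bisecting the subregion whose local error estimate is largest, can be illustrated in one dimension. The error proxy (midpoint vs. trapezoid discrepancy) and the fixed split budget below are simplifications of what production quadrature codes do, chosen only to show the refinement mechanism on a peaked integrand:

```python
import heapq
import math

def adaptive_quadrature(f, a, b, n_splits=200):
    # Each heap entry describes one subinterval: (-error_proxy, lo, hi, simpson).
    def panel(lo, hi):
        mid = 0.5 * (lo + hi)
        trap = 0.5 * (f(lo) + f(hi)) * (hi - lo)   # trapezoid rule
        midp = f(mid) * (hi - lo)                  # midpoint rule
        simpson = (trap + 2.0 * midp) / 3.0        # Simpson from the two rules
        return (-abs(trap - midp), lo, hi, simpson)

    heap = [panel(a, b)]
    for _ in range(n_splits):
        # Bisect the subinterval with the largest error proxy.
        _, lo, hi, _ = heapq.heappop(heap)
        mid = 0.5 * (lo + hi)
        heapq.heappush(heap, panel(lo, mid))
        heapq.heappush(heap, panel(mid, hi))
    return sum(p[3] for p in heap)
```

On a dominant-peak integrand such as exp(-100 (x - 1/2)^2) over [0, 1], the splits concentrate around the peak, which is exactly the behavior the split-t transformations are designed to help with in higher dimensions.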
Bayesian Analysis For Simulation Input And Output
, 1997
Abstract

Cited by 20 (8 self)
The paper summarizes some important results at the intersection of the fields of Bayesian statistics and stochastic simulation. Two statistical analysis issues for stochastic simulation are discussed in further detail from a Bayesian perspective. First, a review of recent work in input distribution selection is presented. Then, a new Bayesian formulation for the problem of output analysis for a single system is presented. A key feature is analyzing simulation output as a random variable whose parameters are an unknown function of the simulation's inputs. The distribution of those parameters is inferred from simulation output via Bayesian response-surface methods. A brief summary of Bayesian inference and decision making is included for reference.
Computing Normalizing Constants for Finite Mixture Models via Incremental Mixture Importance Sampling (IMIS)
, 2003
Abstract

Cited by 14 (5 self)
We propose a method for approximating integrated likelihoods in finite mixture models. We formulate the model in terms of the unobserved group memberships, z, and make them the variables of integration. The integral is then evaluated using importance sampling over the z. We propose an adaptive importance sampling function which is itself a mixture, with two types of component distributions, one concentrated and one diffuse. The more concentrated type of component serves the usual purpose of an importance sampling function, sampling mostly group assignments of high posterior probability. The less concentrated type of component allows for the importance sampling function to explore the space in a controlled way to find other, unvisited assignments with high posterior probability. Components are added adaptively, one at a time, to cover areas of high posterior probability not well covered by the current importance sampling function. The method is called Incremental Mixture Importance Sampling (IMIS). IMIS is easy to implement and to monitor for convergence. It scales easily for higher dimensional ...
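A toy version of the incremental-mixture idea, adding concentrated components where the current weights are largest and then doing ordinary importance sampling from the final mixture, might look as follows. The one-dimensional bimodal target and every tuning constant here are invented for illustration; the paper's actual algorithm operates on discrete group-membership vectors z and is considerably more involved:

```python
import math
import random

random.seed(1)

def target(x):
    # Unnormalized bimodal target; its true normalizing constant is 2*sqrt(2*pi).
    return math.exp(-0.5 * (x - 4.0) ** 2) + math.exp(-0.5 * (x + 4.0) ** 2)

def gauss_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

def mixture_pdf(x, comps):
    # Equal-weight mixture of Gaussian components.
    return sum(gauss_pdf(x, m, s) for m, s in comps) / len(comps)

def imis_sketch(n_pilot=1000, stages=4, n_final=20000):
    comps = [(0.0, 10.0)]  # start with one diffuse component
    for _ in range(stages):
        # Pilot draws from the current mixture reveal poorly covered mass.
        pilot = [random.gauss(*random.choice(comps)) for _ in range(n_pilot)]
        w = [target(x) / mixture_pdf(x, comps) for x in pilot]
        best = pilot[max(range(n_pilot), key=w.__getitem__)]
        comps.append((best, 1.0))  # concentrated component at the worst gap
    # Ordinary importance sampling from the final, enriched mixture.
    final = [random.gauss(*random.choice(comps)) for _ in range(n_final)]
    w = [target(x) / mixture_pdf(x, comps) for x in final]
    return sum(w) / len(w)

z_hat = imis_sketch()  # estimate of the normalizing constant, near 5.01
```

The diffuse component keeps the sampler exploring, while each added concentrated component shrinks the weight variance near a region the earlier proposal handled badly, mirroring the two component types described in the abstract.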
A Bayesian Approach to Characterizing Uncertainty in Inverse Problems Using Coarse and Fine Scale Information
, 2001
Abstract

Cited by 11 (3 self)
The Bayesian approach allows one to easily quantify uncertainty, at least in theory. In practice, however, MCMC can be computationally expensive, particularly in complicated inverse problems. Here we present methodology for improving the speed and efficiency of an MCMC analysis by combining runs on different scales. By using a coarser scale, the chain can run faster (particularly when there is an external forward simulator involved in the likelihood evaluation) and better explore the posterior, being less likely to become stuck in local maxima. We discuss methods for linking the coarse chain back to the original fine scale chain of interest. The resulting coupled chain can thus be run more efficiently without sacrificing the accuracy achieved at the finer scale.
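One concrete way to exploit a cheap coarse model inside MCMC, related in spirit to (though not identical with) the coarse/fine coupling described above, is delayed acceptance: screen each proposal with the coarse posterior, then correct accepted ones with the fine posterior so the chain still targets the fine-scale distribution exactly. The two log-densities below are stand-ins for an expensive forward simulator and its coarse approximation:

```python
import math
import random

random.seed(2)

def log_fine(x):
    # "Fine-scale" (expensive) log target: standard normal.
    return -0.5 * x * x

def log_coarse(x):
    # Cheap approximation with a slightly wrong scale.
    return -0.5 * (x / 1.2) ** 2

def delayed_acceptance_mh(n=20000, step=1.0):
    x, chain = 0.0, []
    for _ in range(n):
        y = x + random.gauss(0.0, step)
        # Stage 1: ordinary Metropolis test under the coarse target;
        # rejections here never touch the expensive fine model.
        if math.log(random.random()) < log_coarse(y) - log_coarse(x):
            # Stage 2: correction ratio restores detailed balance
            # with respect to the fine target.
            log_r = (log_fine(y) - log_fine(x)) - (log_coarse(y) - log_coarse(x))
            if math.log(random.random()) < log_r:
                x = y
        chain.append(x)
    return chain

chain = delayed_acceptance_mh()
```

The better the coarse model tracks the fine one, the closer the stage-2 acceptance ratio sits to one, so almost all fine-model evaluations are spent on moves that survive.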
Adaptive Methods for Sequential Importance Sampling with Application to State Space Models
 Statistics and Computing
, 2008
Abstract

Cited by 9 (3 self)
In this paper we discuss new adaptive proposal strategies for sequential Monte Carlo algorithms, also known as particle filters, relying on criteria evaluating the quality of the proposed particles. The choice of the proposal distribution is a major concern and can dramatically influence the quality of the estimates. Thus, we show how the long-used coefficient of variation (suggested by Kong et al. (1994)) of the weights can be used for estimating the chi-square distance between the target and instrumental distributions of the auxiliary particle filter. As a by-product of this analysis we obtain an auxiliary adjustment multiplier weight type for which this chi-square distance is minimal. Moreover, we establish an empirical estimate of linear complexity of the Kullback-Leibler divergence between the involved distributions. Guided by these results, we discuss adaptive design of the particle filter proposal distribution and illustrate the methods on a numerical example.
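The coefficient-of-variation diagnostic mentioned in this abstract has a standard closed form, and it maps directly onto the familiar effective sample size. The snippet below is a generic implementation of both quantities, not code from the paper:

```python
import math

def coeff_of_variation(weights):
    # CV of the importance weights, the Kong et al. (1994) diagnostic:
    # standard deviation of the weights divided by their mean.
    n = len(weights)
    mean = sum(weights) / n
    var = sum((w - mean) ** 2 for w in weights) / n
    return math.sqrt(var) / mean

def effective_sample_size(weights):
    # ESS = n / (1 + CV^2), algebraically equal to (sum w)^2 / (sum w^2).
    n = len(weights)
    return n / (1.0 + coeff_of_variation(weights) ** 2)
```

Uniform weights give ESS equal to the number of particles, while a fully degenerate (one-hot) weight vector collapses to ESS = 1, which is why a rising CV is the usual trigger for resampling or for adapting the proposal.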