Results 11–20 of 288
Gibbs sampling, exponential families and orthogonal polynomials
Statistical Science, 2008
Cited by 40 (10 self)
Abstract. We give families of examples where sharp rates of convergence to stationarity of the widely used Gibbs sampler are available. The examples involve standard exponential families and their conjugate priors. In each case, the transition operator is explicitly diagonalizable with classical orthogonal polynomials as eigenfunctions. Key words and phrases: Gibbs sampler, running time analyses, exponential families, conjugate priors, location families, orthogonal polynomials, singular value decomposition.
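The two-component chains analyzed in the paper can be illustrated with a minimal Gibbs sampler for a Binomial likelihood with its conjugate Beta prior (a sketch only; the function name and the parameter values `n_trials=10`, `a=b=2` are illustrative choices, not taken from the paper):

```python
import random

def gibbs_beta_binomial(n_trials=10, a=2.0, b=2.0, n_iter=5000, seed=0):
    """Two-component Gibbs sampler for the conjugate Beta-Binomial pair:
    x | theta ~ Binomial(n_trials, theta),
    theta | x ~ Beta(x + a, n_trials - x + b).
    Toy illustration of the chains whose eigenstructure the paper derives."""
    rng = random.Random(seed)
    theta, draws = 0.5, []
    for _ in range(n_iter):
        x = sum(rng.random() < theta for _ in range(n_trials))  # draw x | theta
        theta = rng.betavariate(x + a, n_trials - x + b)        # draw theta | x
        draws.append(theta)
    return draws
```

Under stationarity the marginal of theta is Beta(a, b), so long-run averages of the draws should settle near a / (a + b); the paper's contribution is to make the rate of that convergence explicit via orthogonal-polynomial eigenfunctions.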
Theoretical and numerical comparison of some sampling methods for molecular dynamics
2005
Cited by 39 (3 self)
The purpose of the present article is to compare different phase-space sampling methods, such as purely stochastic methods (Rejection method, Metropolized independence sampler, Importance Sampling), stochastically perturbed Molecular Dynamics (Hybrid Monte Carlo, Langevin Dynamics, Biased Random Walk), and purely deterministic methods (Nosé-Hoover chains, Nosé-Poincaré and Recursive Multiple Thermostats (RMT) methods). After recalling some theoretical convergence properties for the various methods, we provide some new convergence results for the Hybrid Monte Carlo scheme, requiring weaker (and easier to check) conditions than previously known conditions. We then turn to the numerical efficiency of the sampling schemes for a benchmark model of linear alkane molecules. In particular, the numerical distributions that are generated are compared in a systematic way, on the basis of some quantitative convergence indicators.
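One of the purely stochastic methods compared, the Metropolized independence sampler, is simple to sketch. Here it targets a standard normal with an over-dispersed normal proposal; the target, the proposal width, and all names are illustrative stand-ins for the phase-space densities of the article:

```python
import math
import random

def metropolized_independence_sampler(n_iter=20000, seed=1):
    """Independence sampler: proposals q(x) = N(0, 2^2) are drawn
    independently of the current state, targeting pi(x) = N(0, 1).
    Toy densities; the article samples molecular phase-space measures."""
    rng = random.Random(seed)
    log_pi = lambda x: -0.5 * x * x            # target, up to a constant
    log_q = lambda x: -0.5 * (x / 2.0) ** 2    # proposal, up to a constant
    x, chain = 0.0, []
    for _ in range(n_iter):
        y = rng.gauss(0.0, 2.0)
        # MH ratio for an independence proposal: pi(y) q(x) / (pi(x) q(y))
        log_alpha = (log_pi(y) - log_pi(x)) - (log_q(y) - log_q(x))
        if rng.random() < math.exp(min(0.0, log_alpha)):
            x = y
        chain.append(x)
    return chain
```

Because the proposal has heavier tails than the target, the importance ratio pi/q stays bounded, which is the standard sufficient condition for uniform ergodicity of this sampler.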
Structured learning and prediction in computer vision
Foundations and Trends in Computer Graphics and Vision, 2010
Efficient block sampling strategies for sequential Monte Carlo
Journal of Computational and Graphical Statistics, 2006
Cited by 37 (7 self)
Sequential Monte Carlo (SMC) methods are a powerful set of simulation-based techniques for sampling sequentially from a sequence of complex probability distributions. These methods rely on a combination of importance sampling and resampling techniques. In a Markov chain Monte Carlo (MCMC) framework, block sampling strategies often perform much better than algorithms based on one-at-a-time sampling strategies if "good" proposal distributions to update blocks of variables can be designed. In an SMC framework, standard algorithms sequentially sample the variables one at a time whereas, as in MCMC, the efficiency of algorithms could be improved significantly by using block sampling strategies. Unfortunately, a direct implementation of such strategies is impossible as it requires the knowledge of integrals which do not admit closed-form expressions. This article introduces a new methodology which bypasses this problem and is a natural extension of standard SMC methods. Applications to several sequential Bayesian inference problems demonstrate these methods.
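For contrast with the block strategies the article proposes, the standard one-at-a-time SMC (bootstrap) filter for a toy linear-Gaussian state-space model might look as follows; the model, its coefficients, and the particle count are illustrative assumptions, not taken from the article:

```python
import math
import random

def bootstrap_smc(ys, n_particles=500, seed=2):
    """One-at-a-time SMC (bootstrap filter) for the toy model
    x_t = 0.9 x_{t-1} + N(0, 1),  y_t = x_t + N(0, 1).
    The article's block strategies would instead re-propose whole
    blocks of recent states x_{t-L+1:t} at each step."""
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    filter_means = []
    for y in ys:
        xs = [0.9 * x + rng.gauss(0.0, 1.0) for x in xs]         # propagate
        ws = [math.exp(-0.5 * (y - x) ** 2) for x in xs]         # weight by likelihood
        total = sum(ws)
        ws = [w / total for w in ws]
        filter_means.append(sum(w * x for w, x in zip(ws, xs)))  # filtering mean
        xs = rng.choices(xs, weights=ws, k=n_particles)          # multinomial resample
    return filter_means
```

One-at-a-time proposals like this degrade when observations are informative about several past states at once, which is the situation the article's block-sampling extension addresses.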
Preconditioning of Markov Chain Monte Carlo simulations using coarse-scale models
SIAM Journal on Scientific Computing, 2006
Cited by 35 (7 self)
We study the preconditioning of Markov chain Monte Carlo (MCMC) methods using coarse-scale models with applications to subsurface characterization. The purpose of preconditioning is to reduce the fine-scale computational cost and increase the acceptance rate in the MCMC sampling. This goal is achieved by generating Markov chains based on two-stage computations. In the first stage, a new proposal is tested by the coarse-scale model based on the multiscale finite-volume method. The full fine-scale computation is conducted only if the proposal passes the coarse-scale screening. For more efficient simulations, an approximation of the full fine-scale computation using precomputed multiscale basis functions can also be used. Compared with the regular MCMC method, the preconditioned MCMC method generates a modified Markov chain by incorporating the coarse-scale information of the problem. The conditions under which the modified Markov chain will converge to the correct posterior distribution are stated in the paper. The validity of these assumptions for our application, and the conditions which would guarantee a high acceptance rate, are also discussed. We note that the coarse-scale models used in the simulations need to be inexpensive, but not necessarily very accurate, as our analysis and numerical simulations demonstrate. We present numerical examples for sampling permeability fields using two-point geostatistics. The Karhunen-Loève expansion is used to represent the realizations of the permeability field conditioned on the dynamic data, such as production data, as well as some static data. Our numerical examples show that the acceptance rate can be increased by more than ten times if MCMC simulations are preconditioned using coarse-scale models.
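A minimal sketch of the two-stage (delayed-acceptance) idea, with cheap analytic densities standing in for the coarse-scale and fine-scale forward models; the densities, the proposal step, and the iteration count are illustrative assumptions:

```python
import math
import random

def two_stage_mcmc(n_iter=20000, seed=3):
    """Two-stage (delayed-acceptance) Metropolis sketch: a cheap 'coarse'
    density screens proposals before the 'fine' density (a stand-in for an
    expensive fine-scale forward simulation) is evaluated, while the chain
    still targets the fine posterior exactly."""
    rng = random.Random(seed)
    log_fine = lambda x: -0.5 * x * x           # stand-in fine-scale posterior: N(0, 1)
    log_coarse = lambda x: -0.5 * x * x / 1.2   # cheap, inexact coarse approximation
    x, chain, fine_evals = 0.0, [], 0
    for _ in range(n_iter):
        y = x + rng.gauss(0.0, 1.0)             # symmetric random-walk proposal
        # Stage 1: screen using only the coarse model.
        if rng.random() < math.exp(min(0.0, log_coarse(y) - log_coarse(x))):
            fine_evals += 1
            # Stage 2: correct with the fine model so the target is preserved.
            log_alpha = (log_fine(y) - log_fine(x)) - (log_coarse(y) - log_coarse(x))
            if rng.random() < math.exp(min(0.0, log_alpha)):
                x = y
        chain.append(x)
    return chain, fine_evals
```

Proposals rejected at stage 1 never trigger a fine-model evaluation, which is exactly how the preconditioning saves fine-scale cost while the stage-2 correction keeps the stationary distribution correct.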
A basic convergence result for particle filtering
IEEE Transactions on Signal Processing, 2007
Cited by 29 (8 self)
The basic nonlinear filtering problem for dynamical systems is considered. Approximating the optimal filter estimate by particle filter methods has become perhaps the most common and useful method in recent years. Many variants of particle filters have been suggested, and there is an extensive literature on the theoretical aspects of the quality of the approximation. Still, a clear-cut result that the approximate solution, for unbounded functions, converges to the true optimal estimate as the number of particles tends to infinity seems to be lacking. It is the purpose of this contribution to give such a basic convergence result.
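The kind of convergence being formalized can be checked numerically on a one-step toy problem where the optimal estimate is known in closed form. Assumptions here (not from the paper): prior x ~ N(0, 1), observation y = x + N(0, 1), observed value y = 1, so the exact posterior mean is 0.5:

```python
import math
import random

def particle_estimate(n_particles, seed=4):
    """Self-normalized particle approximation of E[x | y] for the
    one-step linear-Gaussian model x ~ N(0,1), y = x + N(0,1).
    At y = 1 the exact posterior is N(1/2, 1/2), so the estimate
    should approach 0.5 as the number of particles grows."""
    rng = random.Random(seed)
    y = 1.0
    xs = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]   # sample the prior
    ws = [math.exp(-0.5 * (y - x) ** 2) for x in xs]         # likelihood weights
    return sum(w * x for x, w in zip(xs, ws)) / sum(ws)
```

The test function x here is bounded on any realized sample, but the identity function is unbounded over the state space, which is precisely the case the paper's convergence result covers.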
Sampling the posterior: An approach to non-Gaussian data assimilation
2006
Cited by 29 (10 self)
The viewpoint taken in this paper is that data assimilation is fundamentally a statistical problem and that this problem should be cast in a Bayesian framework. In the absence of model error, the correct solution to the data assimilation problem is to find the posterior distribution implied by this Bayesian setting. Methods for dealing with data assimilation should then be judged by their ability to probe this distribution. In this paper we propose a range of techniques for probing the posterior distribution, based around the Langevin equation, and we compare these new techniques with existing methods. When the underlying dynamics is deterministic, the posterior distribution is on the space of initial conditions, leading to a sampling problem over this space. When the underlying dynamics is stochastic, the posterior distribution is on the space of continuous time paths. By writing down a density, and conditioning on observations, it is possible to define ...
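A common Langevin-based sampler of the kind discussed is the Metropolis-adjusted Langevin algorithm (MALA). The sketch below runs it on a standard normal stand-in target; the paper's targets live on spaces of initial conditions or continuous-time paths, not on this toy density, and the step size is an illustrative choice:

```python
import math
import random

def mala(log_pi, grad_log_pi, step=0.5, n_iter=20000, seed=5):
    """Metropolis-adjusted Langevin: propose along the Langevin drift
    (step/2) * grad log pi plus Gaussian noise, then accept/reject with
    the MH ratio so the chain samples pi exactly."""
    rng = random.Random(seed)
    x, chain = 0.0, []
    sd = math.sqrt(step)
    for _ in range(n_iter):
        mean_fwd = x + 0.5 * step * grad_log_pi(x)
        y = mean_fwd + sd * rng.gauss(0.0, 1.0)
        mean_rev = y + 0.5 * step * grad_log_pi(y)
        log_q_fwd = -((y - mean_fwd) ** 2) / (2.0 * step)   # log q(y | x)
        log_q_rev = -((x - mean_rev) ** 2) / (2.0 * step)   # log q(x | y)
        log_alpha = log_pi(y) - log_pi(x) + log_q_rev - log_q_fwd
        if rng.random() < math.exp(min(0.0, log_alpha)):
            x = y
        chain.append(x)
    return chain

# Toy stand-in target: standard normal, log pi(x) = -x^2/2, gradient -x.
chain = mala(lambda x: -0.5 * x * x, lambda x: -x)
```

Dropping the accept/reject step gives the unadjusted Langevin dynamics, which only samples the target approximately; the MH correction removes that discretization bias.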
Bayesian Hierarchical Modeling for Integrating Low-Accuracy and High-Accuracy Experiments
Technometrics, 2008
Cited by 29 (2 self)
Standard practice in analyzing data from different types of experiments is to treat data from each type separately. By borrowing strength across multiple sources, an integrated analysis can produce better results. Careful adjustments need to be made to incorporate the systematic differences among various experiments. To this end, some Bayesian hierarchical Gaussian process (BHGP) models are proposed. The heterogeneity among different sources is accounted for by performing flexible location and scale adjustments. The approach tends to produce predictions closer to those from the high-accuracy experiment. The Bayesian computations are aided by the use of Markov chain Monte Carlo and sample average approximation algorithms. The proposed method is illustrated with two examples: one with detailed and approximate finite element simulations for mechanical material design, and the other with physical and computer experiments for modeling a food processor.
Validation of software for Bayesian models using posterior quantiles
 Journal of Computational and Graphical Statistics
Cited by 28 (6 self)
We present a simulation-based method designed to establish that software developed to fit a specific Bayesian model works properly, capitalizing on properties of Bayesian posterior distributions. We illustrate the validation technique with two examples. The validation method is shown to find errors in software when they exist and, moreover, the validation output can be informative about the nature and location of such errors.
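The validation idea can be sketched end-to-end for a conjugate normal model, with the exact posterior sampler standing in for the "software under test" (the model and the sample sizes are illustrative assumptions): repeatedly draw the true parameter from the prior, simulate data, sample the posterior, and record the posterior quantile of the true value. For correct software these quantiles should be uniform on (0, 1):

```python
import random

def posterior_quantile_check(n_reps=2000, n_draws=100, seed=6):
    """Posterior-quantile validation sketch for the conjugate model
    theta ~ N(0, 1), y | theta ~ N(theta, 1), whose exact posterior is
    N(y/2, 1/2).  Sampling that posterior directly stands in for the
    software being validated; a buggy sampler would make the returned
    quantiles visibly non-uniform."""
    rng = random.Random(seed)
    qs = []
    for _ in range(n_reps):
        theta = rng.gauss(0.0, 1.0)                  # "true" parameter from the prior
        y = theta + rng.gauss(0.0, 1.0)              # simulated data
        draws = [rng.gauss(y / 2.0, 0.5 ** 0.5) for _ in range(n_draws)]
        qs.append(sum(d < theta for d in draws) / n_draws)  # posterior quantile
    return qs
```

In practice one would feed the recovered quantiles into a uniformity test; here a quick check is that their mean sits near 1/2 and their variance near 1/12.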
Projected equation methods for approximate solution of large linear systems
Journal of Computational and Applied Mathematics