Results 1–10 of 168
Annealed importance sampling
 In Statistics and Computing
, 2001
Abstract

Cited by 150 (3 self)
Abstract. Simulated annealing — moving from a tractable distribution to a distribution of interest via a sequence of intermediate distributions — has traditionally been used as an inexact method of handling isolated modes in Markov chain samplers. Here, it is shown how one can use the Markov chain transitions for such an annealing sequence to define an importance sampler. The Markov chain aspect allows this method to perform acceptably even for high-dimensional problems, where finding good importance sampling distributions would otherwise be very difficult, while the use of importance weights ensures that the estimates found converge to the correct values as the number of annealing runs increases. This annealed importance sampling procedure resembles the second half of the previously studied tempered transitions, and can be seen as a generalization of a recently proposed variant of sequential importance sampling. It is also related to thermodynamic integration methods for estimating ratios of normalizing constants. Annealed importance sampling is most attractive when isolated modes are present, or when estimates of normalizing constants are required, but it may also be more generally useful, since its independent sampling allows one to bypass some of the problems of assessing convergence and autocorrelation in Markov chain samplers.
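The procedure the abstract describes can be sketched in a few lines. The following is a minimal illustration on a hypothetical 1D problem, assuming a geometric path between a tractable Gaussian and a bimodal target; the ladder size, step size, and both densities are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_p0(x):                      # tractable start: N(0, 2^2), unnormalized
    return -0.5 * x**2 / 4.0

def log_p1(x):                      # target: two isolated modes at -3 and +3
    return np.logaddexp(-0.5 * (x - 3.0)**2, -0.5 * (x + 3.0)**2)

def ais_run(n_temps=100, step=0.5):
    """One annealing run: returns a final sample and its log importance weight."""
    betas = np.linspace(0.0, 1.0, n_temps)
    x = rng.normal(0.0, 2.0)        # exact draw from p0
    log_w = 0.0
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # weight update: ratio of successive annealed densities at the current x
        log_w += (b - b_prev) * (log_p1(x) - log_p0(x))
        # Metropolis transition leaving the annealed density p0^(1-b) p1^b invariant
        log_f = lambda y: (1.0 - b) * log_p0(y) + b * log_p1(y)
        prop = x + step * rng.normal()
        if np.log(rng.uniform()) < log_f(prop) - log_f(x):
            x = prop
    return x, log_w

samples, log_ws = zip(*(ais_run() for _ in range(500)))
log_ws = np.array(log_ws)
w = np.exp(log_ws - log_ws.max())
est_mean = np.sum(w * np.array(samples)) / np.sum(w)  # importance-weighted mean
z_ratio = np.mean(np.exp(log_ws))   # the mean weight estimates Z1/Z0
```

Because each annealing run is independent, the weighted estimates converge as the number of runs grows, and the mean of the unnormalized weights estimates the ratio of normalizing constants, as the abstract notes.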
Simulating Normalizing Constants: From Importance Sampling to Bridge Sampling to Path Sampling
 Statistical Science, 13, 163–185
, 1998
Cited by 146 (4 self)
Sampling from Multimodal Distributions Using Tempered Transitions
 Statistics and Computing
, 1994
Abstract

Cited by 69 (6 self)
I present a new Markov chain sampling method appropriate for distributions with isolated modes. Like the recently developed method of "simulated tempering", the "tempered transition" method uses a series of distributions that interpolate between the distribution of interest and a distribution for which sampling is easier. The new method has the advantage that it does not require approximate values for the normalizing constants of these distributions, which are needed for simulated tempering, and can be tedious to estimate. Simulated tempering performs a random walk along the series of distributions used. In contrast, the tempered transitions of the new method move systematically from the desired distribution, to the easily sampled distribution, and back to the desired distribution. This systematic movement avoids the inefficiency of a random walk, an advantage that unfortunately is cancelled by an increase in the number of interpolating distributions required. Because of this, the sa...
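The systematic up-and-down movement described above can be sketched compactly. Here is one tempered transition for a hypothetical 1D bimodal target, using tempered densities p_b(x) ∝ p(x)^b on a ladder running from b = 1 (the target) to a hot value and back, with the whole trajectory accepted or rejected at once; the ladder and step sizes are illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def log_p(x):                       # two isolated modes at -3 and +3
    return np.logaddexp(-0.5 * (x - 3.0)**2, -0.5 * (x + 3.0)**2)

def metropolis(x, beta):
    """One Metropolis update leaving p(x)^beta invariant."""
    prop = x + rng.normal(0.0, 1.0 / np.sqrt(beta))
    if np.log(rng.uniform()) < beta * (log_p(prop) - log_p(x)):
        return prop
    return x

def tempered_transition(x0, betas):
    """betas[0] = 1.0 (target), strictly decreasing to the hottest value."""
    x, log_r = x0, 0.0
    for b_prev, b in zip(betas[:-1], betas[1:]):        # up: target -> hot
        log_r += (b - b_prev) * log_p(x)                # ratio before transition
        x = metropolis(x, b)
    for b, b_prev in zip(betas[:0:-1], betas[-2::-1]):  # down: hot -> target
        x = metropolis(x, b)
        log_r += (b_prev - b) * log_p(x)                # ratio after transition
    # accept or reject the entire up-and-down trajectory in one step
    return x if np.log(rng.uniform()) < log_r else x0

betas = np.geomspace(1.0, 0.05, 20)
x, trace = 3.0, []
for _ in range(2000):
    x = tempered_transition(x, betas)
    trace.append(x)
trace = np.array(trace)             # should visit both modes
```

Note that no normalizing constants appear in `log_r`: only unnormalized density ratios along the trajectory are accumulated, which is the advantage over simulated tempering that the abstract highlights.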
Fields of Experts
 INTERNATIONAL JOURNAL OF COMPUTER VISION
, 2008
Abstract

Cited by 62 (4 self)
We develop a framework for learning generic, expressive image priors that capture the statistics of natural scenes and can be used for a variety of machine vision tasks. The approach provides a practical method for learning high-order Markov random field (MRF) models with potential functions that extend over large pixel neighborhoods. These clique potentials are modeled using the Product-of-Experts framework that uses nonlinear functions of many linear filter responses. In contrast to previous MRF approaches, all parameters, including the linear filters themselves, are learned from training data. We demonstrate the capabilities of this Field-of-Experts model with two example applications, image denoising and image inpainting, which are implemented using a simple, approximate inference scheme. While the model is trained on a generic image database and is not tuned toward a specific application, we obtain results that compete with specialized techniques.
An efficient Markov chain Monte Carlo method for distributions with intractable normalising constants
 Biometrika
, 2006
Abstract

Cited by 51 (2 self)
Maximum likelihood parameter estimation and sampling from Bayesian posterior distributions are problematic when the probability density for the parameter of interest involves an intractable normalising constant which is also a function of that parameter. In this paper, an auxiliary variable method is presented which requires only that independent samples can be drawn from the unnormalised density at any particular parameter value. The proposal distribution is constructed so that the normalising constant cancels from the Metropolis–Hastings ratio. The method is illustrated by producing posterior samples for parameters of the Ising model given a particular lattice realisation.
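The cancellation trick described in the abstract is easy to show on a toy model where exact simulation is available. The sketch below treats the exponential density f(x | θ) ∝ exp(−θx), x > 0, as if its normalizing constant were unknown, and uses a flat prior on θ > 0; the model, prior, and tuning constants are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_f(x, theta):                 # unnormalized log density, Z(theta) ignored
    return -theta * x

def exchange_sampler(y, n_iter=5000, theta0=1.0, prop_sd=0.3):
    """Posterior samples for theta given i.i.d. data y, flat prior on theta > 0."""
    theta = theta0
    out = []
    for _ in range(n_iter):
        theta_p = theta + prop_sd * rng.normal()
        if theta_p > 0:
            # auxiliary data drawn exactly from the model at the proposed theta
            w = rng.exponential(1.0 / theta_p, size=y.size)
            # Z(theta) cancels from this Metropolis-Hastings ratio
            log_a = (log_f(y, theta_p).sum() + log_f(w, theta).sum()
                     - log_f(y, theta).sum() - log_f(w, theta_p).sum())
            if np.log(rng.uniform()) < log_a:
                theta = theta_p
        out.append(theta)
    return np.array(out)

y = rng.exponential(1.0 / 2.0, size=200)   # data simulated with theta = 2
trace = exchange_sampler(y)
post_mean = trace[1000:].mean()            # should sit near theta = 2
```

The key requirement is the one stated in the abstract: independent samples must be drawn from the unnormalized density at any proposed parameter value, which is trivial for this toy model but requires perfect simulation for models such as the Ising model.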
Markov Chain Decomposition for Convergence Rate Analysis
Abstract

Cited by 40 (8 self)
In this paper we develop tools for analyzing the rate at which a reversible Markov chain converges to stationarity. Our techniques are useful when the Markov chain can be decomposed into pieces which are themselves easier to analyze. The main theorems relate the spectral gap of the original Markov chain to the spectral gaps of the pieces. In the first case the pieces are restrictions of the Markov chain to subsets of the state space; the second case treats a Metropolis-Hastings chain whose equilibrium distribution is a weighted average of the equilibrium distributions of other Metropolis-Hastings chains on the same state space.
Estimation of Markov Random Field prior parameters using Markov chain Monte Carlo Maximum Likelihood
, 1996
Abstract

Cited by 39 (3 self)
Recent developments in statistics now allow maximum likelihood estimators for the parameters of Markov Random Fields to be constructed. We detail the theory required, and present an algorithm which is easily implemented and practical in terms of computation time. We demonstrate this algorithm on three MRF models: the standard Potts model, an inhomogeneous variation of the Potts model, and a long-range interaction model better adapted to modeling real-world images. We estimate the parameters from a synthetic and a real image, and then resynthesise the models to demonstrate which features of the image have been captured by the model. Segmentations are computed based on the estimated parameters and conclusions drawn.
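The core idea behind MCMC maximum likelihood for such models is that for an exponential family p(x | θ) ∝ exp(θ·s(x)) the log-likelihood gradient is s(y) − E_θ[s(X)], with the intractable expectation estimated by a Markov chain. The following is a minimal sketch on a hypothetical toy family p(x | θ) ∝ exp(−θx), x > 0, rather than an MRF; the densities, step sizes, and learning rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

def mcmc_expectation(theta, n=2000, x0=1.0, step=1.0):
    """Estimate E_theta[X] for the unnormalized density exp(-theta x), x > 0,
    using a short Metropolis chain (an exact formula exists, but is treated
    as unavailable, as it would be for an MRF)."""
    x, total = x0, 0.0
    for _ in range(n):
        prop = x + step * rng.normal()
        if prop > 0 and np.log(rng.uniform()) < -theta * (prop - x):
            x = prop
        total += x
    return total / n

def mcmc_mle(y, n_steps=200, lr=0.5):
    """Gradient ascent on the log-likelihood with an MCMC-estimated gradient."""
    theta = 1.0
    for _ in range(n_steps):
        # gradient s(y) - E_theta[s(X)] with sufficient statistic s(x) = -x
        grad = -y.mean() + mcmc_expectation(theta)
        theta = max(theta + lr * grad, 1e-3)
    return theta

y = rng.exponential(1.0 / 2.0, size=500)   # data simulated with theta = 2
theta_hat = mcmc_mle(y)                    # should land near theta = 2
```

For an MRF the structure is identical: the sufficient statistics become clique counts and the Metropolis chain becomes a sampler over lattice configurations, which is where the computation-time considerations in the abstract arise.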
Extended ensemble Monte Carlo
 Int. J. Mod. Phys
, 2001
Abstract

Cited by 29 (1 self)
“Extended Ensemble Monte Carlo” is a generic term for a set of algorithms that are now popular in a variety of fields in physics and statistical information processing. Exchange Monte Carlo (Metropolis-Coupled Chain, Parallel Tempering), Simulated Tempering (Expanded Ensemble Monte Carlo), and Multicanonical Monte Carlo (Adaptive Umbrella Sampling) are typical members of this family. Here we give a cross-disciplinary survey of these algorithms, with special emphasis on the great flexibility of the underlying idea. In Sec. 2, we discuss the background of Extended Ensemble Monte Carlo. In Secs. 3, 4 and 5, three types of algorithms, i.e., Exchange Monte Carlo, Simulated Tempering, and Multicanonical Monte Carlo, are introduced. In Sec. 6, we give an introduction to the Replica Monte Carlo algorithm of Swendsen and Wang. Strategies for the construction of special-purpose extended ensembles are discussed in Sec. 7. We stress ...
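Of the family members listed above, Exchange Monte Carlo (parallel tempering) has the simplest structure: replicas run at different temperatures and periodically attempt state swaps. A minimal sketch, assuming a hypothetical 1D bimodal target and an illustrative four-level inverse-temperature ladder:

```python
import numpy as np

rng = np.random.default_rng(2)

def neg_energy(x):                  # log of a two-mode target at -4 and +4
    return np.logaddexp(-0.5 * (x - 4.0)**2, -0.5 * (x + 4.0)**2)

def parallel_tempering(n_iter=4000, betas=(1.0, 0.5, 0.25, 0.1)):
    betas = np.asarray(betas)
    xs = rng.normal(0.0, 3.0, size=betas.size)   # one replica per temperature
    cold = []
    for _ in range(n_iter):
        # local Metropolis update within each replica
        for k, b in enumerate(betas):
            prop = xs[k] + rng.normal(0.0, 1.0 / np.sqrt(b))
            if np.log(rng.uniform()) < b * (neg_energy(prop) - neg_energy(xs[k])):
                xs[k] = prop
        # exchange attempt between a random adjacent pair of replicas
        k = rng.integers(0, betas.size - 1)
        log_a = (betas[k] - betas[k + 1]) * (neg_energy(xs[k + 1]) - neg_energy(xs[k]))
        if np.log(rng.uniform()) < log_a:
            xs[k], xs[k + 1] = xs[k + 1], xs[k]
        cold.append(xs[0])
    return np.array(cold)

trace = parallel_tempering()
# the beta = 1 replica should visit both modes, carried across by swaps
```

The hot replicas cross the energy barrier easily, and the swap moves (accepted with the standard ratio, which preserves the joint equilibrium distribution of all replicas) propagate those crossings down to the cold chain.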
EQUIENERGY SAMPLER WITH APPLICATIONS IN STATISTICAL INFERENCE AND STATISTICAL MECHANICS
, 2006
Abstract

Cited by 26 (4 self)
We introduce a new sampling algorithm, the equi-energy sampler, for efficient statistical sampling and estimation. Complementary to the widely used temperature-domain methods, the equi-energy sampler, utilizing the temperature–energy duality, targets the energy directly. The focus on the energy function not only facilitates efficient sampling, but also provides a powerful means for statistical estimation, for example, the calculation of the density of states and microcanonical averages in statistical mechanics. The equi-energy sampler is applied to a variety of problems, including exponential regression in statistics, motif sampling in computational biology and protein folding in biophysics.