Results 1–10 of 382,904
NORGES TEKNISK-NATURVITENSKAPELIGE UNIVERSITET: Control variates for the Metropolis–Hastings algorithm
"... We propose new control variates for variance reduction in the Metropolis–Hastings algorithm. We use variates that are functions of both the current state of the Markov chain and the proposed new state. This enables us to specify control variates which have known mean values for general target and pro ..."
and proposal distributions. We develop the ideas for both the standard Metropolis–Hastings algorithm and the generalized reversible jump version. We present simulation results for four examples. The variance reduction varies depending on the target distribution and proposal mechanisms used
On Adaptive Metropolis–Hastings Methods
"... This paper presents a method for adaptation in Metropolis–Hastings algorithms. A product of a proposal density and K copies of the target density is used to define a joint density which is sampled by a Gibbs sampler including a Metropolis step. This provides a framework for adaptation since the curr ..."
Cited by 2 (0 self)
Kernel Adaptive Metropolis–Hastings
"... A Kernel Adaptive Metropolis–Hastings algorithm is introduced, for the purpose of sampling from a target distribution with strongly nonlinear support. The algorithm embeds the trajectory of the Markov chain into a reproducing kernel Hilbert space (RKHS), such that the feature space covariance o ..."
Cited by 2 (2 self)
Sequential Monte Carlo Samplers, 2002
"... In this paper, we propose a general algorithm to sample sequentially from a sequence of probability distributions known up to a normalizing constant and defined on a common space. A sequence of increasingly large artificial joint distributions is built; each of these distributions admits a marginal ..."
Cited by 303 (44 self)
corresponds to a nonlinear Markov kernel admitting a specified invariant distribution and is a natural nonlinear extension of the standard Metropolis–Hastings algorithm. Many theoretical results have already been established for such flows and their particle approximations. We demonstrate the use
Cité Descartes, Champs-sur-Marne
"... Abstract. The waste-recycling Monte Carlo (WR) algorithm, introduced by Frenkel, is a modification of the Metropolis–Hastings algorithm, which makes use of all the proposals, whereas the standard Metropolis–Hastings algorithm only uses the accepted proposals. We prove the convergence of the WR algor ..."
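The waste-recycling idea described in this entry, averaging over every proposal weighted by its acceptance probability rather than over accepted states only, can be sketched as follows. This is a minimal illustration assuming a symmetric 1-D random-walk proposal; the names (`wr_estimate`, `log_target`, `f`) are illustrative, not taken from the paper.

```python
import math
import random

def wr_estimate(log_target, f, theta0, n_steps, step=1.0, rng=random):
    """Waste-recycling average of f over a random-walk Metropolis chain.

    Every proposal contributes to the estimate, weighted by its acceptance
    probability, instead of only the accepted states as in standard
    Metropolis-Hastings averaging.
    """
    theta = theta0
    total = 0.0
    for _ in range(n_steps):
        cand = theta + rng.gauss(0.0, step)
        # Acceptance probability; the symmetric proposal makes q cancel.
        pa = math.exp(min(0.0, log_target(cand) - log_target(theta)))
        # Waste recycling: use BOTH the proposal and the current state.
        total += pa * f(cand) + (1.0 - pa) * f(theta)
        if rng.random() < pa:
            theta = cand
    return total / n_steps

# Example: estimate E[x^2] = 1 under a standard normal target.
random.seed(1)
est = wr_estimate(lambda x: -0.5 * x * x, lambda x: x * x,
                  theta0=0.0, n_steps=20000)
```

The key difference from a plain Metropolis average is the weighted line: a rejected proposal still contributes `pa * f(cand)` instead of being discarded.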
Quasi-Newton particle Metropolis–Hastings?
"... Abstract: Particle Metropolis–Hastings enables Bayesian parameter inference in general nonlinear state space models (SSMs). However, in many implementations a random walk proposal is used and this can result in poor mixing if not tuned correctly using tedious pilot runs. Therefore, we consider a new ..."
Approximate Metropolis–Hastings: Experiments, Background, Cutting the Metropolis–Hastings Budget
"... 1. Draw a candidate state θ′ from a proposal distribution q(θ′ | θt). 2. Compute the acceptance probability: Pa = min ..."
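The two steps quoted in this entry are the core of the standard Metropolis–Hastings iteration. A minimal sketch in Python, assuming a symmetric Gaussian random-walk proposal (so the q terms cancel in the acceptance ratio); names such as `metropolis_hastings` and `log_target` are ours, not from the source slides.

```python
import math
import random

def metropolis_hastings(log_target, theta0, n_steps, step=1.0, rng=random):
    """Random-walk Metropolis-Hastings for a 1-D unnormalized target."""
    theta = theta0
    samples = []
    for _ in range(n_steps):
        # 1. Draw a candidate state theta' from the proposal q(theta' | theta_t).
        cand = theta + rng.gauss(0.0, step)
        # 2. Acceptance probability Pa = min(1, p(theta') / p(theta_t));
        #    the symmetric proposal makes the q terms cancel.
        pa = math.exp(min(0.0, log_target(cand) - log_target(theta)))
        # 3. Accept the candidate with probability Pa, else keep the current state.
        if rng.random() < pa:
            theta = cand
        samples.append(theta)
    return samples

# Example: sample from a standard normal target (log density up to a constant).
random.seed(0)
draws = metropolis_hastings(lambda x: -0.5 * x * x, theta0=0.0, n_steps=5000)
```

Working in log space avoids underflow for sharply peaked targets, which matters for the high-dimensional settings several of the papers above address.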
Efficiency of the Wang–Landau Algorithm: A Simple Test Case (doi:10.1093/amrx/abu003)
"... We analyze the efficiency of the Wang–Landau algorithm to sample a multimodal distribution on a prototypical simple test case. We show that the exit time from a metastable state is much smaller for the Wang–Landau dynamics than for the original standard Metropolis–Hastings algorithm, in some asympt ..."
Modified Metropolis–Hastings algorithm with delayed rejection
Asian-Pacific Symposium on Structural Reliability and its Applications, Hong Kong, 2008
"... The development of an efficient MCMC strategy for sampling from complex distributions is a difficult task that needs to be solved for calculating small failure probabilities encountered in high-dimensional reliability analysis of engineering systems. Usually different variations of the Metropolis–Ha ..."
Cited by 1 (1 self)
algorithm (MH) are used. However, the standard MH algorithm generally does not work in high dimensions, since it leads to very frequent repeated samples. In order to overcome this deficiency one can use the Modified Metropolis–Hastings algorithm (MMH) proposed in Au & Beck (2001). Another variation