## Adaptively scaling the Metropolis algorithm using expected squared jumped distance (2003)

Citations: 14 (0 self)

### BibTeX

@TECHREPORT{Pasarica03adaptivelyscaling,
  author      = {Cristian Pasarica and Andrew Gelman},
  title       = {Adaptively scaling the Metropolis algorithm using expected squared jumped distance},
  institution = {},
  year        = {2003}
}

### Abstract

Using existing theory on efficient jumping rules and on adaptive MCMC, we construct and demonstrate the effectiveness of a workable scheme for improving the efficiency of Metropolis algorithms. A good choice of the proposal distribution is crucial for the rapid convergence of the Metropolis algorithm. In this paper, given a family of parametric Markovian kernels, we develop an algorithm for optimizing the kernel by maximizing the expected squared jumped distance, an objective function that characterizes the Markov chain under its d-dimensional stationary distribution. The algorithm uses the information accumulated by a single path and adapts the choice of the parametric kernel in the direction of the local maximum of the objective function using multiple importance sampling techniques. We follow a two-stage approach: a series of adaptive optimization steps followed by an MCMC run with fixed kernel. It is not necessary for the adaptation itself to converge. Using several examples, we demonstrate the effectiveness of our method, even for cases in which the Metropolis transition kernel is initialized at very poor values.
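To make the objective concrete, here is a minimal sketch of tuning a random-walk Metropolis proposal by maximizing the empirical expected squared jumped distance (ESJD) on a 1-D Gaussian target. This is an illustration only: it uses a coarse grid search over proposal scales as a stand-in for the paper's multiple-importance-sampling ascent, and the target, scales, and step counts are all assumptions, not the authors' setup.

```python
import math
import random

def metropolis_esjd(scale, n_steps=2000, seed=0):
    """Run random-walk Metropolis on a standard 1-D Gaussian target and
    return the empirical ESJD, i.e. the average of (x_{t+1} - x_t)^2.
    Rejected proposals contribute zero jumped distance."""
    rng = random.Random(seed)
    x = 0.0
    log_p = lambda z: -0.5 * z * z  # log-density of N(0, 1) up to a constant
    total_sq_jump = 0.0
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, scale)
        # Standard Metropolis accept/reject step (symmetric proposal)
        if rng.random() < math.exp(min(0.0, log_p(prop) - log_p(x))):
            total_sq_jump += (prop - x) ** 2
            x = prop
    return total_sq_jump / n_steps

# Stage 1 (stand-in): pick the proposal scale that maximizes ESJD over a
# grid; stage 2 would then run a fixed-kernel MCMC chain at that scale.
scales = [0.1, 0.5, 1.0, 2.4, 5.0, 10.0]
esjd = {s: metropolis_esjd(s) for s in scales}
best = max(esjd, key=esjd.get)
```

Too small a scale accepts nearly every tiny move and too large a scale rejects nearly everything, so ESJD peaks at an intermediate value; the paper's contribution is to climb toward that maximum adaptively from a single path rather than by re-running the chain per candidate scale.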