Results 1 – 4 of 4
Markov Chain Monte Carlo Convergence Diagnostics: A Comparative Review
Journal of the American Statistical Association, 1996
Abstract

Cited by 223 (6 self)
A critical issue for users of Markov Chain Monte Carlo (MCMC) methods in applications is how to determine when it is safe to stop sampling and use the samples to estimate characteristics of the distribution of interest. Research into methods of computing theoretical convergence bounds holds promise for the future but currently has yielded relatively little that is of practical use in applied work. Consequently, most MCMC users address the convergence problem by applying diagnostic tools to the output produced by running their samplers. After giving a brief overview of the area, we provide an expository review of thirteen convergence diagnostics, describing the theoretical basis and practical implementation of each. We then compare their performance in two simple models and conclude that all the methods can fail to detect the sorts of convergence failure they were designed to identify. We thus recommend a combination of strategies aimed at evaluating and accelerating MCMC sampler conver...
Possible biases induced by MCMC convergence diagnostics, 1997
Abstract

Cited by 16 (2 self)
This paper is organised as follows. In Section 2, we present an oversimplified version of a convergence diagnostic, and study analytically its performance on certain simple Markov chains. We restrict ourselves primarily to chains which in fact produce i.i.d. samples from
Looking at Markov Samplers through Cusum Path Plots: a simple diagnostic idea, 1994
Abstract

Cited by 12 (3 self)
In this paper, we propose to monitor a Markov chain sampler using the cusum path plot of a chosen 1-dimensional summary statistic. We argue that the cusum path plot can bring out, more effectively than the sequential plot, those aspects of a Markov sampler which tell the user how quickly or slowly the sampler is moving around its sample space in the direction of the summary statistic. The proposal is then illustrated in four examples representing situations where the cusum path plot works well and where it does not. Moreover, a rigorous analysis is given for one of the examples. We conclude that the cusum path plot is an effective tool for diagnosing convergence of a Markov sampler and for comparing different Markov samplers. KEY WORDS: Convergence diagnostic; Cusum path plot; Markov sampler; Mixing; Sequential plot; Summary statistic.
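The cusum idea described in this abstract can be sketched in a few lines: compute the partial sums of a summary statistic's deviations from its overall mean, and read slow, sweeping excursions as slow mixing. The code below is an illustrative implementation under our own assumptions (the AR(1) test chain and the `cusum_path` helper are ours for demonstration), not the authors' code.

```python
import numpy as np

def cusum_path(samples):
    """CUSUM path of a 1-dimensional summary statistic: partial sums
    of deviations from the overall sample mean.  Large, slow
    excursions from zero suggest slow mixing; a jagged path hugging
    zero suggests fast mixing (the reading proposed in the abstract)."""
    x = np.asarray(samples, dtype=float)
    return np.cumsum(x - x.mean())

def ar1_chain(n, rho, seed=0):
    """Stationary AR(1) chain with unit variance; rho controls how
    slowly the chain mixes (an assumption for this demo, standing in
    for a real Markov sampler)."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    x[0] = rng.standard_normal()
    for t in range(1, n):
        x[t] = rho * x[t - 1] + np.sqrt(1 - rho**2) * rng.standard_normal()
    return x

fast = cusum_path(ar1_chain(2000, rho=0.1))
slow = cusum_path(ar1_chain(2000, rho=0.99))
# The slowly mixing chain produces far larger CUSUM excursions,
# which is what the path plot makes visible at a glance.
print(np.abs(fast).max(), np.abs(slow).max())
```

By construction the path returns to (numerically) zero at the final index, since the deviations from the sample mean sum to zero; what carries diagnostic information is the size and smoothness of the excursions in between.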
Inference and Monitoring Convergence (chapter for Gilks, Richardson, and Spiegelhalter book)
Abstract
In this article we present yet another example, from our current applied research. Figure 0.1 displays an example of slow convergence from a Markov chain simulation for a hierarchical Bayesian model for a pharmacokinetics problem (see Bois et al., 1994, for details). The simulations were done using a Metropolis-approximate Gibbs sampler (as in Section 4.4 of Gelman, 1992); due to the complexity of the model, each iteration was expensive in computer time, and it was desirable to keep the simulation runs as short as possible. Figures 1a and 1b display time series plots for a single parameter of the posterior distribution in two independent simulations, each of length 1000, run in parallel on two networked workstations. It is clear from the separation of the two sequences that, after 1000 iterations, the simulations are still far from convergence; yet either sequence alone looks perfectly well behaved.
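The failure mode described here, where each sequence looks fine on its own but independent chains disagree, is exactly what between-/within-chain comparisons are built to catch. A minimal sketch of that idea (our own illustrative code: `psrf`, and the simulated "stuck" chains, are assumptions for the demo, not the chapter's model or implementation):

```python
import numpy as np

def psrf(chains):
    """Rough potential-scale-reduction diagnostic for m chains of
    length n.  Values well above 1 indicate that the chains have not
    yet mixed over the same region of the target distribution."""
    chains = np.asarray(chains, dtype=float)
    m, n = chains.shape
    means = chains.mean(axis=1)
    B = n * means.var(ddof=1)              # between-chain variance
    W = chains.var(axis=1, ddof=1).mean()  # within-chain variance
    var_hat = (n - 1) / n * W + B / n      # pooled variance estimate
    return float(np.sqrt(var_hat / W))

rng = np.random.default_rng(1)
# Two chains stuck near different values: each looks perfectly well
# behaved alone, but together they reveal non-convergence.
stuck = np.stack([rng.normal(-3, 1, 1000), rng.normal(3, 1, 1000)])
# Two chains exploring the same distribution: diagnostic near 1.
mixed = rng.normal(0, 1, (2, 1000))
print(psrf(stuck), psrf(mixed))
```

The point of the example in the text is precisely that a single-sequence plot cannot trigger this diagnostic: it is the separation between independent replications that exposes the problem.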