## Rates of Convergence for Gibbs Sampling for Variance Component Models (1995)

Venue: Ann. Statist.

Citations: 38 (10 self)

### BibTeX

@ARTICLE{Rosenthal91ratesof,
  author  = {Jeffrey S. Rosenthal},
  title   = {Rates of Convergence for Gibbs Sampling for Variance Component Models},
  journal = {Annals of Statistics},
  year    = {1995},
  volume  = {23},
  pages   = {740--761}
}

### Abstract

This paper analyzes the Gibbs sampler applied to a standard variance component model, and considers the question of how many iterations are required for convergence. It is proved that for K location parameters, with J observations each, the number of iterations required for convergence (for large K and J) is a constant times ...

### Citations

4060 |
Stochastic Relaxation, Gibbs Distribution and the Bayesian Restoration of Images
- Geman, Geman
- 1984
Citation Context: ...several years there has been a lot of attention given to the Gibbs Sampler algorithm for sampling from posterior distributions. This Markov chain Monte Carlo algorithm, popularized by Geman and Geman [GG] and summarized in [GS], has its roots in the Metropolis-Hastings algorithm ([MRRTT], [H]). It is closely related to the Data Augmentation algorithm of Tanner and Wong [TW]. It exploits the simplicity...

2526 |
Equation of state calculations by fast computing machines
- Metropolis, Rosenbluth, et al.
- 1953
Citation Context: ...thm for sampling from posterior distributions. This Markov chain Monte Carlo algorithm, popularized by Geman and Geman [GG] and summarized in [GS], has its roots in the Metropolis-Hastings algorithm ([MRRTT], [H]). It is closely related to the Data Augmentation algorithm of Tanner and Wong [TW]. It exploits the simplicity of certain conditional distributions to define a Markov chain that converges in law...

1368 |
Monte Carlo sampling methods using Markov chains and their applications
- Hastings
- 1970
Citation Context: ...ampling from posterior distributions. This Markov chain Monte Carlo algorithm, popularized by Geman and Geman [GG] and summarized in [GS], has its roots in the Metropolis-Hastings algorithm ([MRRTT], [H]). It is closely related to the Data Augmentation algorithm of Tanner and Wong [TW]. It exploits the simplicity of certain conditional distributions to define a Markov chain that converges in law to...

905 |
Sampling-Based Approaches to Calculating Marginal Densities
- Gelfand, Smith
- 1990
Citation Context: ...been a lot of attention given to the Gibbs Sampler algorithm for sampling from posterior distributions. This Markov chain Monte Carlo algorithm, popularized by Geman and Geman [GG] and summarized in [GS], has its roots in the Metropolis-Hastings algorithm ([MRRTT], [H]). It is closely related to the Data Augmentation algorithm of Tanner and Wong [TW]. It exploits the simplicity of certain conditional...

849 | Markov chains for exploring posterior distributions (with discussion) - Tierney - 1994 |

817 |
Inference from Iterative Simulation Using Multiple Sequences
- Gelman, Rubin
- 1992
Citation Context: ...e to use convergence diagnostics to check if the distribution after (say) 1000 steps is indeed close to the distribution to which the chain appears to converge; see [G], [Rob]. On the other hand, see [GR] for warnings about possible problems. In any case, it would be comforting to have theoretical results regarding how many iterations are required before the chain has in fact converged. There has been...

646 |
The Calculation of Posterior Distributions by Data Augmentation
- Tanner, Wong
- 1987
Citation Context: ...ularized by Geman and Geman [GG] and summarized in [GS], has its roots in the Metropolis-Hastings algorithm ([MRRTT], [H]). It is closely related to the Data Augmentation algorithm of Tanner and Wong [TW]. It exploits the simplicity of certain conditional distributions to define a Markov chain that converges in law to the posterior distribution under consideration. Once the Markov chain has converged,...

617 |
Applied Probability and Queues
- Asmussen
Citation Context: ...models with more complicated interdependencies, such as those arising in image processing. The proof of Theorem 1 employs a coupling argument (Lemma 2) related to the notion of Harris-recurrence (see [A], [AN], [AMN], [N]). This lemma is quite general, and reduces the study of convergence rates for Markov chains to the question of how much "overlap" there is between the multi-step transition probabilities...

541 |
Bayesian Inference in Statistical Analysis
- Box, Tiao
- 1992
Citation Context: ...s of a related "discretization" algorithm. In this paper we analyze the convergence rate of the variance component models as described in [GS], Section 3.4, and defined herein in Section 3. (See also [BT] and [GHRS].) Briefly, this model involves an overall location parameter μ, and K different parameters θ_1, ..., θ_K which are normally distributed around μ. For each θ_i there are J different observations...
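The variance component model quoted in this excerpt lends itself to a short worked sketch. The code below (not from the paper) runs a Gibbs sampler on a simplified version of the model: it places a flat prior on μ and holds both variances fixed, so only the conjugate normal updates for μ and θ_1, ..., θ_K remain; the paper's full model also samples the variances. All function and variable names are illustrative.

```python
import numpy as np

def gibbs_variance_components(Y, s2_theta, s2_e, n_iter=2000, seed=None):
    """Gibbs sampler for a simplified variance component model:
        theta_i ~ N(mu, s2_theta),   Y_ij ~ N(theta_i, s2_e),
    with a flat prior on mu and both variances held fixed
    (a simplification: the paper's model also samples the variances)."""
    rng = np.random.default_rng(seed)
    K, J = Y.shape
    Ybar = Y.mean(axis=1)                      # per-group sample means
    mu, theta = 0.0, np.zeros(K)
    mu_trace = np.empty(n_iter)
    denom = s2_e + J * s2_theta
    for t in range(n_iter):
        # theta_i | mu, Y : conjugate normal update, done for all i at once
        post_mean = (s2_e * mu + J * s2_theta * Ybar) / denom
        post_sd = np.sqrt(s2_theta * s2_e / denom)
        theta = rng.normal(post_mean, post_sd)
        # mu | theta : the flat prior gives N(mean(theta), s2_theta / K)
        mu = rng.normal(theta.mean(), np.sqrt(s2_theta / K))
        mu_trace[t] = mu
    return mu_trace

# Simulated data: K groups of J observations centred around 5.
data_rng = np.random.default_rng(0)
K, J = 20, 10
true_theta = data_rng.normal(5.0, 1.0, size=K)
Y = true_theta[:, None] + data_rng.normal(0.0, 1.0, size=(K, J))

mu_trace = gibbs_variance_components(Y, 1.0, 1.0, n_iter=2000, seed=1)
print(round(float(mu_trace[500:].mean()), 2))  # posterior mean of mu, near 5
```

Discarding the first 500 draws as burn-in is an arbitrary heuristic here; how many iterations are actually needed, as a function of K and J, is precisely the question the paper answers.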

271 |
Bayesian image restoration, with two applications in spatial statistics” (with discussion
- Besag, York, et al.
- 1991
Citation Context: ...n is how long the Markov chain must be run "until it converges". In most actual implementations of Gibbs sampling, this question is answered heuristically, as in "Let's run it 1000 times" (see, e.g., [BYM], p. 6). This may be risky since Gibbs sampling sometimes converges very slowly; see for example [M]. Now, it may be possible to use convergence diagnostics to check if the distribution after (say) 1000...

264 |
General Irreducible Markov Chains and Non-Negative Operators
- Nummelin
- 1984
Citation Context: ...omplicated interdependencies, such as those arising in image processing. The proof of Theorem 1 employs a coupling argument (Lemma 2) related to the notion of Harris-recurrence (see [A], [AN], [AMN], [N]). This lemma is quite general, and reduces the study of convergence rates for Markov chains to the question of how much "overlap" there is between the multi-step transition probabilities starting from...

224 | Constrained Monte Carlo maximum likelihood for dependent data - Geyer, Thompson - 1992 |

188 |
Illustration of Bayesian Inference in Normal Data Models Using Gibbs Sampling
- Gelfand, Hills, et al.
- 1990
Citation Context: ...lated "discretization" algorithm. In this paper we analyze the convergence rate of the variance component models as described in [GS], Section 3.4, and defined herein in Section 3. (See also [BT] and [GHRS].) Briefly, this model involves an overall location parameter μ, and K different parameters θ_1, ..., θ_K which are normally distributed around μ. For each θ_i there are J different observations...

145 |
Practical Markov chain Monte Carlo
- Geyer
- 1992
Citation Context: ...example [M]. Now, it may be possible to use convergence diagnostics to check if the distribution after (say) 1000 steps is indeed close to the distribution to which the chain appears to converge; see [G], [Rob]. On the other hand, see [GR] for warnings about possible problems. In any case, it would be comforting to have theoretical results regarding how many iterations are required before the chain has...

115 |
A new approach to the limit theory of recurrent Markov chains
- Athreya, Ney
- 1978
Citation Context: ...s with more complicated interdependencies, such as those arising in image processing. The proof of Theorem 1 employs a coupling argument (Lemma 2) related to the notion of Harris-recurrence (see [A], [AN], [AMN], [N]). This lemma is quite general, and reduces the study of convergence rates for Markov chains to the question of how much "overlap" there is between the multi-step transition probabilities...

77 |
Correlation structure and convergence rate of the Gibbs sampler (I): applications to the comparison of estimators and augmentation schemes
- Liu, Wong, et al.
- 1994
Citation Context: ...iterations are required before the chain has in fact converged. There has been limited analysis of this question to date (though it can be expected that there will be more in the future). In [SC] and [LWK], general theorems about the functional form of the convergence are obtained, and it is shown that the convergence will often be geometric. However, no quantitative results regarding the convergence rate...

51 |
On the convergence of successive substitution sampling
- Schervish, Carlin
- 1992
Citation Context: ...how many iterations are required before the chain has in fact converged. There has been limited analysis of this question to date (though it can be expected that there will be more in the future). In [SC] and [LWK], general theorems about the functional form of the convergence are obtained, and it is shown that the convergence will often be geometric. However, no quantitative results regarding the convergence...

26 | Rates of convergence for data augmentation on finite sample spaces (Annals of Applied Probability 3:819–839)
- Rosenthal
- 1993
Citation Context: ...titative estimate of how large k should be to make the variation distance less than some ε.) In [SC] several simple models are analyzed exactly, facilitating convergence results for these cases. In [R1], quantitative convergence rates are obtained for Data Augmentation for a two-step hierarchical model involving Bernoulli random variables. Also, see [AKP] for an interesting analysis of a related "discretization"...

25 |
Convergence Diagnostics of the Gibbs Sampler
- Roberts
- 1992
Citation Context: ...le [M]. Now, it may be possible to use convergence diagnostics to check if the distribution after (say) 1000 steps is indeed close to the distribution to which the chain appears to converge; see [G], [Rob]. On the other hand, see [GR] for warnings about possible problems. In any case, it would be comforting to have theoretical results regarding how many iterations are required before the chain has in fact...

20 |
On Coupling of Markov Chains
- Pitman
- 1976
Citation Context: ...on Distance and Coupling. Lemma 2 above provides a bound on the variation distance between two measures, using the coupling inequality. Coupling is widely used in Markov chain theory (see for example [P], or Chapter 4E of [D]), but it may be less familiar to Statisticians. For completeness, we review it briefly here. Given two probability measures defined on the same probability space, the variation...
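The coupling inequality mentioned in this excerpt says the variation distance between the laws of X and Y is at most P(X != Y); on a finite space a maximal coupling attains this bound, which is easy to check numerically. A small illustrative sketch (not from the paper):

```python
import numpy as np

# Two probability distributions on the same finite space.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.3, 0.5])

# Variation distance: half the L1 distance between the vectors.
tv = 0.5 * np.abs(p - q).sum()

# A maximal coupling of p and q puts mass min(p_i, q_i) on the
# diagonal {X = Y}, so P(X != Y) = 1 - sum_i min(p_i, q_i),
# which matches the variation distance exactly.
p_differ = 1.0 - np.minimum(p, q).sum()

print(round(float(tv), 6), round(float(p_differ), 6))  # both are 0.3
```

For any other (non-maximal) coupling of p and q, P(X != Y) can only be larger, which is what makes the inequality a usable bound in the convergence proofs.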

19 |
Limit theorems for semi-Markov processes and renewal theory for Markov chains
- Athreya, McDonald, et al.
- 1978
Citation Context: ...more complicated interdependencies, such as those arising in image processing. The proof of Theorem 1 employs a coupling argument (Lemma 2) related to the notion of Harris-recurrence (see [A], [AN], [AMN], [N]). This lemma is quite general, and reduces the study of convergence rates for Markov chains to the question of how much "overlap" there is between the multi-step transition probabilities starting...

16 |
Random Polynomial Time Algorithms for Sampling from Joint Distributions
- Applegate, Kannan, et al.
- 1990
Citation Context: ...ating convergence results for these cases. In [R1], quantitative convergence rates are obtained for Data Augmentation for a two-step hierarchical model involving Bernoulli random variables. Also, see [AKP] for an interesting analysis of a related "discretization" algorithm. In this paper we analyze the convergence rate of the variance component models as described in [GS], Section 3.4, and defined herein...

15 |
A slowly mixing Markov chain with implications for Gibbs sampling
- Matthews
- 1993
Citation Context: ...bs sampling, this question is answered heuristically, as in "Let's run it 1000 times" (see, e.g., [BYM], p. 6). This may be risky since Gibbs sampling sometimes converges very slowly; see for example [M]. Now, it may be possible to use convergence diagnostics to check if the distribution after (say) 1000 steps is indeed close to the distribution to which the chain appears to converge; see [G], [Rob]...

5 | Rates of Convergence for Gibbs Sampler and Other Markov Chains - Rosenthal - 1992 |