## Analysis of the Gibbs sampler for a model related to James-Stein estimators (1995)

Citations: 35 (15 self)

### BibTeX

```bibtex
@misc{Rosenthal95analysisof,
  author = {Jeffrey S. Rosenthal},
  title  = {Analysis of the Gibbs sampler for a model related to James-Stein estimators},
  year   = {1995}
}
```

### Abstract

In this paper we investigate the convergence properties of the Gibbs sampler as applied to a particular hierarchical Bayes model. The model is related to James-Stein estimators (James and Stein, 1961; Efron and Morris, 1973, 1975; Morris, 1983). Briefly, James-Stein estimators may be defined as the mean of a certain empirical Bayes posterior distribution (as discussed in the next section). We consider the problem of using the Gibbs sampler as a way of sampling from a richer posterior distribution, as suggested by Jun Liu (personal communication). Such a technique would eliminate the need to estimate a certain parameter empirically and to provide a "guess" at another one, and would give additional information about the distribution of the parameters involved. We consider, in particular, the convergence properties of this Gibbs sampler. For a certain range of prior distributions, we establish (Section 3) rigorous, numerical, reasonable rates of convergence. The bounds are obtained using the methods of Rosenthal (1995b). We thus rigorously bound the running time for this Gibbs sampler to converge to the posterior distribution, within a specified accuracy (as measured by total variation distance). We provide a general formula for this bound, which is of reasonable size, in terms of the prior distribution and the data. This Gibbs sampler is perhaps the most complicated example to date for which reasonable quantitative convergence rates have been obtained. We apply our bounds to the numerical baseball data of Efron and Morris (1975) and Morris (1983), based on batting averages of baseball players, and show that approximately 140 iterations are sufficient to achieve convergence in this case. For a different range of prior distributions, we use the Submartingale Convergence Theorem ...
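The kind of Gibbs sampler discussed in the abstract can be illustrated with a minimal sketch. This is not the paper's exact model or prior range: it assumes the standard normal-means setup Y_i ~ N(theta_i, V) with theta_i ~ N(mu, A), a flat prior on mu, and a hypothetical inverse-gamma(a, b) prior on A, cycling through the three full conditional distributions.

```python
import numpy as np

def gibbs_hierarchical(y, V, n_iter=2000, a=2.0, b=2.0, seed=0):
    """Gibbs sampler for Y_i ~ N(theta_i, V), theta_i ~ N(mu, A),
    flat prior on mu, A ~ InverseGamma(a, b).  (Illustrative model,
    not necessarily the one analyzed in the paper.)
    Returns arrays of posterior draws (theta, mu, A)."""
    rng = np.random.default_rng(seed)
    y = np.asarray(y, dtype=float)
    K = len(y)
    mu, A = y.mean(), y.var() + 1e-6       # crude initialization
    thetas, mus, As = [], [], []
    for _ in range(n_iter):
        # theta_i | mu, A, y : precision-weighted combination of data and prior mean
        prec = 1.0 / V + 1.0 / A
        mean = (y / V + mu / A) / prec
        theta = rng.normal(mean, np.sqrt(1.0 / prec))
        # mu | theta, A : normal around the mean of the thetas (flat prior on mu)
        mu = rng.normal(theta.mean(), np.sqrt(A / K))
        # A | theta, mu : InverseGamma(a + K/2, b + sum((theta - mu)^2)/2)
        rate = b + 0.5 * ((theta - mu) ** 2).sum()
        A = 1.0 / rng.gamma(a + K / 2.0, 1.0 / rate)
        thetas.append(theta); mus.append(mu); As.append(A)
    return np.array(thetas), np.array(mus), np.array(As)
```

Averaging the theta draws (after discarding burn-in) exhibits the James-Stein-style shrinkage of each observation toward the overall mean; how many iterations suffice for convergence is exactly the question the paper's bounds answer rigorously.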