Results 1-10 of 31
Markov chains for exploring posterior distributions
Annals of Statistics, 1994
Cited by 751 (6 self)
Probabilistic Inference Using Markov Chain Monte Carlo Methods
1993
Cited by 567 (20 self)
Probabilistic inference is an attractive approach to uncertain reasoning and empirical learning in artificial intelligence. Computational difficulties arise, however, because probabilistic models with the necessary realism and flexibility lead to complex distributions over high-dimensional spaces. Related problems in other fields have been tackled using Monte Carlo methods based on sampling using Markov chains, providing a rich array of techniques that can be applied to problems in artificial intelligence. The "Metropolis algorithm" has been used to solve difficult problems in statistical physics for over forty years, and, in the last few years, the related method of "Gibbs sampling" has been applied to problems of statistical inference. Concurrently, an alternative method for solving problems in statistical physics by means of dynamical simulation has been developed as well, and has recently been unified with the Metropolis algorithm to produce the "hybrid Monte Carlo" method. In computer science, Markov chain sampling is the basis of the heuristic optimization technique of "simulated annealing", and has recently been used in randomized algorithms for approximate counting of large sets. In this review, I outline the role of probabilistic inference in artificial intelligence, present the theory of Markov chains, and describe various Markov chain Monte Carlo algorithms, along with a number of supporting techniques. I try to present a comprehensive picture of the range of methods that have been developed, including techniques from the varied literature that have not yet seen wide application in artificial intelligence but which appear relevant. As illustrative examples, I use the problems of probabilistic inference in expert systems, discovery of latent classes from data, and Bayesian learning for neural networks.
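The random-walk Metropolis algorithm this review surveys can be sketched in a few lines of pure Python. This is an illustrative sketch, not code from the paper: the target (a standard normal, via its log density) and the proposal step size are assumptions chosen for demonstration.

```python
import math
import random

def metropolis(log_target, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis: propose x' = x + step*u with u uniform,
    accept with probability min(1, target(x') / target(x))."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.uniform(-step, step)
        # The proposal is symmetric, so the acceptance ratio reduces
        # to the ratio of target densities.
        log_alpha = log_target(proposal) - log_target(x)
        if log_alpha >= 0 or rng.random() < math.exp(log_alpha):
            x = proposal
        samples.append(x)
    return samples

# Illustrative target: standard normal, log density up to a constant.
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
mean = sum(draws) / len(draws)
```

In practice the first portion of the chain is discarded as burn-in, and the step size is tuned so that a moderate fraction of proposals is accepted.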
Stochastic Volatility: Likelihood Inference And Comparison With Arch Models
1994
Cited by 354 (37 self)
In this paper we exploit Gibbs sampling to provide a likelihood framework for the analysis of stochastic volatility models, demonstrating how to perform either maximum likelihood or Bayesian estimation. The paper includes an extensive Monte Carlo experiment which compares the efficiency of the maximum likelihood estimator with that of quasi-likelihood and Bayesian estimators proposed in the literature. We also compare the fit of the stochastic volatility model to that of ARCH models using the likelihood criterion to illustrate the flexibility of the framework presented. Some key words: ARCH, Bayes estimation, Gibbs sampler, heteroscedasticity, maximum likelihood, quasi-maximum likelihood, simulation, stochastic EM algorithm, stochastic volatility, stock returns.
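A canonical stochastic volatility model of the kind analyzed in this literature lets log-volatility follow a stationary AR(1) process. The simulation sketch below is illustrative only; the parameter values (mu, phi, sigma_eta) are assumptions, not estimates from the paper.

```python
import math
import random

def simulate_sv(n, mu=-1.0, phi=0.95, sigma_eta=0.2, seed=1):
    """Simulate returns y_t = exp(h_t / 2) * eps_t, where the
    log-volatility h_t follows the AR(1) process
    h_t = mu + phi * (h_{t-1} - mu) + sigma_eta * eta_t."""
    rng = random.Random(seed)
    h = mu  # start the chain at the stationary mean
    returns, logvols = [], []
    for _ in range(n):
        h = mu + phi * (h - mu) + sigma_eta * rng.gauss(0.0, 1.0)
        y = math.exp(h / 2.0) * rng.gauss(0.0, 1.0)
        logvols.append(h)
        returns.append(y)
    return returns, logvols

returns, logvols = simulate_sv(5000)
```

The estimation problem the paper addresses is the inverse of this simulation: h_t is latent, and Gibbs sampling is one way to integrate over it.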
Smoothing Spline Models for the Analysis of Nested and Crossed Samples of Curves
Journal of the American Statistical Association, 1998
Cited by 81 (1 self)
We introduce a class of models for an additive decomposition of groups of curves stratified by crossed and nested factors, generalizing smoothing splines to such samples by associating them with a corresponding mixed effects model. The models are also useful for imputation of missing data and exploratory analysis of variance. We prove that the best linear unbiased predictors (BLUP) from the extended mixed effects model correspond to solutions of a generalized penalized regression where smoothing parameters are directly related to variance components, and we show that these solutions are natural cubic splines. The model parameters are estimated using a highly efficient implementation of the EM algorithm for restricted maximum likelihood (REML) estimation based on a preliminary eigenvector decomposition. Variability of computed estimates can be assessed with asymptotic techniques or with a novel hierarchical bootstrap resampling scheme for nested mixed effects models. Our methods are applied to menstrual cycle data from studies of reproductive function that measure daily urinary progesterone; the sample of progesterone curves is stratified by cycles nested within subjects nested within conceptive and nonconceptive groups.
Methods for Approximating Integrals in Statistics with Special Emphasis on Bayesian Integration Problems
Statistical Science
Cited by 32 (4 self)
This paper is a survey of the major techniques and approaches available for the numerical approximation of integrals in statistics. We classify these into five broad categories, namely: asymptotic methods, importance sampling, adaptive importance sampling, multiple quadrature, and Markov chain methods. Each method is discussed, giving an outline of the basic supporting theory and particular features of the technique. Conclusions are drawn concerning the relative merits of the methods, based on the discussion and their application to three examples. The following broad recommendations are made. Asymptotic methods should only be considered in contexts where the integrand has a dominant peak with approximate ellipsoidal symmetry. Importance sampling, and preferably adaptive importance sampling, based on a multivariate Student should be used instead of asymptotic methods in such a context. Multiple quadrature, and in particular subregion adaptive integration, are the algorithms of choice for...
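Of the five categories above, importance sampling is the easiest to sketch: sample from a tractable, heavier-tailed proposal g and reweight by f/g. The example below is a minimal illustrative sketch, assuming a standard normal target and a N(0, 2^2) proposal (both choices are ours, not the survey's).

```python
import math
import random

def importance_sampling(h, n, seed=0):
    """Self-normalized importance sampling estimate of E_f[h(X)]
    for f = N(0, 1), using the wider proposal g = N(0, 2^2)."""
    rng = random.Random(seed)
    num, den = 0.0, 0.0
    for _ in range(n):
        x = rng.gauss(0.0, 2.0)  # draw from the proposal g
        # Log densities of target and proposal (constants cancel in
        # the self-normalized ratio, but are kept for clarity).
        log_f = -0.5 * x * x - 0.5 * math.log(2 * math.pi)
        log_g = -0.5 * (x / 2.0) ** 2 - math.log(2.0) - 0.5 * math.log(2 * math.pi)
        w = math.exp(log_f - log_g)  # importance weight f(x) / g(x)
        num += w * h(x)
        den += w
    return num / den

# E[X^2] = 1 under the standard normal target.
est = importance_sampling(lambda x: x * x, 50000)
```

The key design constraint, echoed in the survey's recommendation of Student-type proposals, is that g must have tails at least as heavy as f so the weights have finite variance.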
Block-relaxation Algorithms in Statistics
1994
Cited by 29 (1 self)
In this paper we discuss four such classes of algorithms. Or, more precisely, we discuss a single class of algorithms, and we show how some well-known classes of statistical algorithms fit in this common class. The subclasses are, in logical order: block-relaxation methods, augmentation methods, majorization methods, Expectation-Maximization, Alternating Least Squares, and Alternating Conditional Expectations.
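The common idea behind all of these subclasses is to update one block of parameters at a time, holding the others fixed, so that each sub-problem is easy. A minimal coordinate-wise sketch on an illustrative quadratic (the objective is our own example, not one from the paper):

```python
def block_relaxation(n_iter=50):
    """Minimize f(x, y) = (x - 1)^2 + (y + 2)^2 + 0.5*x*y by cycling
    through the two coordinate blocks; each update solves
    df/d(block) = 0 exactly with the other block held fixed."""
    x, y = 0.0, 0.0
    for _ in range(n_iter):
        x = 1.0 - 0.25 * y    # argmin over x given the current y
        y = -2.0 - 0.25 * x   # argmin over y given the new x
    return x, y

x, y = block_relaxation()  # converges to the stationary point (1.6, -2.4)
```

Each cycle never increases the objective, which is the monotonicity property that EM and alternating least squares inherit as special cases of this scheme.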
Semiparametric Bayesian Analysis Of Survival Data
Journal of the American Statistical Association, 1996
Cited by 23 (0 self)
this paper are motivated and aimed at analyzing some common types of survival data from different medical studies. We will center our attention on the following topics.
An Iterative Monte Carlo Method for Nonconjugate Bayesian Analysis
Statistics and Computing, 1991
Cited by 17 (0 self)
The Gibbs sampler has been proposed as a general method for Bayesian calculation in Gelfand and Smith (1990). However, the predominance of experience to date resides in applications assuming conjugacy where implementation is reasonably straightforward. This paper describes a tailored approximate rejection method approach for implementation of the Gibbs sampler when nonconjugate structure is present. Several challenging applications are presented for illustration.
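A rejection method of the general kind used inside such a Gibbs step draws candidates from an envelope density g and accepts each with probability f(x) / (M g(x)). The sketch below uses illustrative densities of our choosing (a Beta(2,2) target under a uniform envelope), not the tailored envelopes constructed in the paper.

```python
import random

def rejection_sample(n, seed=0):
    """Draw from the Beta(2,2) density f(x) = 6x(1-x) on [0, 1] using
    a Uniform(0,1) envelope g, with bound M = 1.5 >= max f(x)/g(x)."""
    rng = random.Random(seed)
    M = 1.5  # max of f is f(0.5) = 1.5, so f(x) <= M * g(x) everywhere
    samples = []
    while len(samples) < n:
        x = rng.random()                  # propose from the envelope g
        u = rng.random()
        if u * M <= 6.0 * x * (1.0 - x):  # accept w.p. f(x) / (M g(x))
            samples.append(x)
    return samples

accepted = rejection_sample(20000)
accepted_mean = sum(accepted) / len(accepted)  # Beta(2,2) has mean 1/2
```

The expected acceptance rate is 1/M, which is why a tightly tailored envelope, as in the paper, matters when the full conditional is expensive to evaluate.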
Bayesian Analysis of Multivariate Survival Data Using Monte Carlo Methods
Canadian Journal of Statistics, 1995
Cited by 10 (4 self)
This paper deals with the analysis of multivariate survival data from a Bayesian perspective using Markov chain Monte Carlo methods. The Metropolis algorithm, along with the Gibbs algorithm (Metropolis et al., 1953; Muller, 1991), is used to calculate some of the marginal posteriors. A multivariate survival model is proposed since survival times within the same 'group' are correlated as a consequence of a frailty random block effect (Vaupel et al., 1979). The conditional proportional hazards model of Clayton and Cuzick (1985) is used with a martingale-structured prior process (Arjas and Gasbarra, 1994) for the discretized baseline hazard. Besides the calculation of the marginal posteriors of the parameters of interest, this paper presents some Bayesian EDA diagnostic techniques to detect model adequacy. The methodology is exemplified with the kidney infection data, where the times to infections within the same patients are expected to be correlated. Key words: autocorrelated prior process, credible regions,...
Alternatives to the Gibbs Sampling Scheme
1992
Cited by 6 (1 self)
A variation of the Gibbs sampling scheme is defined by driving the simulated Markov chain by the conditional distributions of an approximation to the posterior rather than the posterior distribution itself. Choosing a multivariate normal mixture form for the approximation enables reparametrization, which is crucial to improve convergence in the Gibbs sampler. Using an approximation to the posterior density also opens the possibility of including in the algorithm a learning process about the posterior density, which is unknown in the operational sense of evaluating posterior integrals. While ideally this should be done using available pointwise evaluations of the posterior density, this is too difficult in a general framework, and we instead use the currently available Monte Carlo sample to adjust the approximating density. This is done using a simple multivariate implementation of the mixture of Dirichlet density estimation algorithm. Keywords: Markov chain Monte Carlo, Bayesian sampling, stocha...
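The baseline scheme this paper varies, the standard Gibbs sampler, alternates draws from each full conditional distribution. A minimal sketch for an illustrative bivariate standard normal target with correlation rho (where the full conditionals are available exactly, unlike the approximated conditionals the paper proposes):

```python
import math
import random

def gibbs_bivariate_normal(n, rho=0.8, seed=0):
    """Gibbs sampling for (X, Y) bivariate standard normal with
    correlation rho: each full conditional is N(rho * other, 1 - rho^2)."""
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    sd = math.sqrt(1.0 - rho * rho)  # conditional standard deviation
    pairs = []
    for _ in range(n):
        x = rng.gauss(rho * y, sd)  # draw X | Y = y
        y = rng.gauss(rho * x, sd)  # draw Y | X = x
        pairs.append((x, y))
    return pairs

pairs = gibbs_bivariate_normal(20000)
```

As rho grows the conditionals become nearly degenerate and the chain mixes slowly, which is exactly the convergence problem that motivates the reparametrization discussed in the abstract.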