Results 11–20 of 58
On the Relationship Between Markov Chain Monte Carlo Methods for Model Uncertainty
 JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS
, 2001
"... This article considers Markov chain computational methods for incorporating uncertainty about the dimension of a parameter when performing inference within a Bayesian setting. A general class of methods is proposed for performing such computations, based upon a product space representation of the ..."
Abstract

Cited by 44 (4 self)
This article considers Markov chain computational methods for incorporating uncertainty about the dimension of a parameter when performing inference within a Bayesian setting. A general class of methods is proposed for performing such computations, based upon a product space representation of the problem which is similar to that of Carlin and Chib. It is shown that all of the existing algorithms for incorporating model uncertainty into Markov chain Monte Carlo (MCMC) can be derived as special cases of this general class of methods. In particular, we show that the popular reversible jump method is obtained when a special form of Metropolis-Hastings (MH) algorithm is applied to the product space. Furthermore, the Gibbs sampling method and the variable selection method are shown to derive straightforwardly from the general framework. We believe that these new relationships between methods, which were until now seen as diverse procedures, are an important aid to the understanding of MCMC model selection procedures and may assist in the future development of improved procedures. Our discussion also sheds some light upon the important issues of "pseudo-prior" selection in the case of the Carlin and Chib sampler and choice of proposal distribution in the case of reversible jump. Finally, we propose efficient reversible jump proposal schemes that take advantage of any analytic structure that may be present in the model. These proposal schemes are compared with a standard reversible jump scheme for the problem of model order uncertainty in autoregressive time series, demonstrating the improvements that can be achieved through careful choice of proposals.
An empirical comparison of methods for forecasting using many predictors
, 2005
"... research assistance, and the referees for helpful suggestions. An earlier version of the theoretical results in this paper was circulated earlier under the title “An Empirical ..."
Abstract

Cited by 41 (0 self)
research assistance, and the referees for helpful suggestions. An earlier version of the theoretical results in this paper was circulated under the title “An Empirical
MCMC Methods for Computing Bayes Factors: A Comparative Review
 Journal of the American Statistical Association
, 2000
"... this paper we review several of these methods, and subsequently compare them in the context of two examples, the first a simple regression example, and the second a much more challenging hierarchical longitudinal model of the kind often encountered in biostatistical practice. We find that the joint ..."
Abstract

Cited by 37 (1 self)
this paper we review several of these methods, and subsequently compare them in the context of two examples, the first a simple regression example, and the second a much more challenging hierarchical longitudinal model of the kind often encountered in biostatistical practice. We find that the joint model-parameter space search methods perform adequately but can be difficult to program and tune, while the marginal likelihood methods are often less troublesome and require less in the way of additional coding. Our results suggest that the latter methods may be most appropriate for practitioners working in many standard model choice settings, while the former remain important for comparing large numbers of models, or models whose parameters cannot be easily updated in relatively few blocks. We caution, however, that all of the methods we compare require significant human and computer effort, suggesting that less formal Bayesian model choice methods may offer a more realistic alternative in many cases.
Bayesian wavelet regression on curves with application to a spectroscopic calibration problem
 Journal of the American Statistical Association
, 2001
"... Motivated by calibration problems in nearinfrared (N IR) spectroscopy, we consider the linear regression setting in which the many predictor variables arise from sampling an essentially continuous curve at equally spaced points and there may be multiple predictands. We tackle this regression proble ..."
Abstract

Cited by 37 (5 self)
Motivated by calibration problems in near-infrared (NIR) spectroscopy, we consider the linear regression setting in which the many predictor variables arise from sampling an essentially continuous curve at equally spaced points and there may be multiple predictands. We tackle this regression problem by calculating the wavelet transforms of the discretized curves, then applying a Bayesian variable selection method using mixture priors to the multivariate regression of predictands on wavelet coefficients. For prediction purposes, we average over a set of likely models. Applied to a particular problem in NIR spectroscopy, this approach was able to find subsets of the wavelet coefficients with overall better predictive performance than the more usual approaches. In the application, the available predictors are measurements of the NIR reflectance spectrum of biscuit dough pieces at 256 equally spaced wavelengths. The aim is to predict the composition (i.e., the fat, flour, sugar, and water content) of the dough pieces using the spectral variables. Thus we have a multivariate regression of four predictands on 256 predictors with quite high intercorrelation among the predictors. A training set of 39 samples is available to fit this regression. Applying a wavelet transform replaces the 256 measurements on each spectrum with 256 wavelet coefficients that carry the same information. The variable selection method could use subsets of these coefficients that gave good predictions for all four compositional variables on a separate test set of samples. Selecting in the wavelet domain rather than from the original spectral variables is appealing in this application, because a single wavelet coefficient can carry information from a band of wavelengths in the original spectrum. This band can be narrow or wide, depending on the scale of the wavelet selected.
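To make the preprocessing step concrete, here is a minimal sketch of the kind of discrete wavelet transform involved; Haar is chosen purely for brevity (an assumption for illustration — the abstract does not prescribe a wavelet family). It maps a curve sampled at 2^k points to 2^k wavelet coefficients carrying the same information, which could then be fed to a variable selection method.

```python
# Minimal orthonormal Haar transform of a signal of length 2**k.
# Illustrative only; a real NIR calibration would use whichever wavelet
# family suits the spectra.
import numpy as np

def haar_dwt(x):
    """Full Haar decomposition; len(x) must be a power of two."""
    approx = np.asarray(x, dtype=float)
    details = []
    while len(approx) > 1:
        pairs = approx.reshape(-1, 2)
        details.append((pairs[:, 0] - pairs[:, 1]) / np.sqrt(2.0))
        approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2.0)
    # coarsest approximation first, then details from coarse to fine
    return np.concatenate([approx] + details[::-1])
```

Because the transform is orthonormal, the 2^k coefficients carry exactly the information of the 2^k original samples (the squared norm is preserved), which is what makes selecting in the wavelet domain rather than among raw spectral variables a lossless change of representation.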
A method for simultaneous variable selection and outlier identification in linear regression
 COMPUTATIONAL STATISTICS & DATA ANALYSIS
, 1996
"... ..."
Objective Bayesian variable selection
 Journal of the American Statistical Association
, 2006
"... A novel fully automatic Bayesian procedure for variable selection in normal regression model is proposed. The procedure uses the posterior probabilities of the models to drive a stochastic search. The posterior probabilities are computed using intrinsic priors, which can be considered default priors ..."
Abstract

Cited by 32 (4 self)
A novel fully automatic Bayesian procedure for variable selection in the normal regression model is proposed. The procedure uses the posterior probabilities of the models to drive a stochastic search. The posterior probabilities are computed using intrinsic priors, which can be considered default priors for model selection problems. That is, they are derived from the model structure and are free from tuning parameters; thus, they can be seen as objective priors for variable selection. The stochastic search is based on a Metropolis-Hastings algorithm with a stationary distribution proportional to the model posterior probabilities. The procedure is illustrated on both simulated and real examples.
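As a concrete illustration of this style of stochastic search, the sketch below runs a Metropolis-Hastings walk over variable-inclusion vectors, flipping one indicator at a time. It is a simplified stand-in, not the paper's method: models are scored with a BIC-style approximation to the log marginal likelihood rather than intrinsic priors, and all function names are hypothetical.

```python
# Hypothetical sketch: MH stochastic search over inclusion vectors, scoring
# models with a BIC-style approximation (NOT the paper's intrinsic priors).
import math
import random

import numpy as np

def bic_score(X, y, gamma):
    """BIC-style approximation to the log marginal likelihood of model gamma."""
    cols = [j for j, g in enumerate(gamma) if g]
    n = len(y)
    if cols:
        Xg = X[:, cols]
        beta = np.linalg.lstsq(Xg, y, rcond=None)[0]
        resid = y - Xg @ beta
    else:
        resid = y  # null model: intercept-free for simplicity
    rss = float(resid @ resid)
    return -0.5 * n * math.log(rss / n) - 0.5 * (len(cols) + 1) * math.log(n)

def mh_search(X, y, n_iter=2000, seed=0):
    """Random walk over {0,1}^p with stationary law proportional to the scores."""
    rng = random.Random(seed)
    p = X.shape[1]
    gamma = (0,) * p
    cur = bic_score(X, y, gamma)
    best = (cur, gamma)
    for _ in range(n_iter):
        j = rng.randrange(p)  # propose flipping one inclusion indicator
        prop = tuple(g ^ (i == j) for i, g in enumerate(gamma))
        new = bic_score(X, y, prop)
        # symmetric proposal, so the MH ratio reduces to the score difference
        if rng.random() < math.exp(min(0.0, new - cur)):
            gamma, cur = prop, new
            if cur > best[0]:
                best = (cur, gamma)
    return best
```

With p candidate variables the chain explores the 2^p model space locally, visiting high-posterior models often; the best-scoring model visited is returned as a byproduct of the search.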
Wavelet-Based Nonparametric Bayes Methods
, 1998
"... In this chapter, we will provide an overview of the current status of research involving Bayesian inference in wavelet nonparametric problems. In many statistical applications, there is a need for procedures to (i) adapt to data and (ii) use prior information. The interface of wavelets and the Bayes ..."
Abstract

Cited by 19 (2 self)
In this chapter, we will provide an overview of the current status of research involving Bayesian inference in wavelet-based nonparametric problems. In many statistical applications, there is a need for procedures to (i) adapt to data and (ii) use prior information. The interface of wavelets and the Bayesian paradigm provides a natural terrain for both of these goals.
Bayesian Adaptive Sampling for Variable Selection and Model Averaging
"... For the problem of model choice in linear regression, we introduce a Bayesian adaptive sampling algorithm (BAS), that samples models without replacement from the space of models. For problems that permit enumeration of all models BAS is guaranteed to enumerate the model space in 2 p iterations where ..."
Abstract

Cited by 16 (4 self)
For the problem of model choice in linear regression, we introduce a Bayesian adaptive sampling algorithm (BAS) that samples models without replacement from the space of models. For problems that permit enumeration of all models, BAS is guaranteed to enumerate the model space in 2^p iterations, where p is the number of potential variables under consideration. For larger problems where sampling is required, we provide conditions under which BAS provides perfect samples without replacement. When the sampling probabilities in the algorithm are the marginal variable inclusion probabilities, BAS may be viewed as sampling models “near” the median probability model of Barbieri and Berger. As marginal inclusion probabilities are not known in advance, we discuss several strategies to estimate the marginal inclusion probabilities adaptively within BAS. We illustrate the performance of the algorithm using simulated and real data and show that BAS can outperform Markov chain Monte Carlo methods. The algorithm is implemented in the R package BAS available at CRAN.
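The idea of drawing distinct models from Bernoulli inclusion probabilities can be sketched as follows. This is a deliberate simplification, not the BAS algorithm itself: the inclusion probabilities are fixed rather than adapted, and "without replacement" is enforced by rejecting duplicates, whereas BAS deterministically avoids revisits and hence enumerates all 2^p models in exactly 2^p iterations.

```python
# Toy sketch of sampling models without replacement, in the spirit of BAS.
# Duplicates are discarded on the fly; the real algorithm never draws them.
import random

def sample_models(p, n_models, incl_probs, seed=0):
    """Draw distinct variable-inclusion vectors of length p.

    incl_probs should lie strictly in (0, 1); degenerate probabilities
    would make some models unreachable and the loop non-terminating.
    """
    rng = random.Random(seed)
    seen = set()
    target = min(n_models, 2 ** p)
    while len(seen) < target:
        gamma = tuple(int(rng.random() < q) for q in incl_probs)
        seen.add(gamma)  # a repeated draw leaves the set unchanged
    return sorted(seen)
```

Calling `sample_models(3, 8, [0.5, 0.5, 0.5])` eventually enumerates all eight subsets of three variables, though this rejection version needs more than 2^p draws on average, which is exactly the inefficiency BAS's without-replacement construction removes.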