Results 1–10 of 71
Optimal Predictive Model Selection
Ann. Statist., 2002
Cited by 46 (2 self)
Abstract:
Often the goal of model selection is to choose a model for future prediction, and it is natural to measure the accuracy of a future prediction by squared error loss.
Spike and slab variable selection: frequentist and Bayesian strategies
The Annals of Statistics
Cited by 40 (7 self)
Abstract:
Variable selection in the linear regression model takes many apparent faces from both frequentist and Bayesian standpoints. In this paper we introduce a variable selection method referred to as a rescaled spike and slab model. We study the importance of prior hierarchical specifications and draw connections to frequentist generalized ridge regression estimation. Specifically, we study the usefulness of continuous bimodal priors to model hypervariance parameters, and the effect scaling has on the posterior mean through its relationship to penalization. Several model selection strategies, some frequentist and some Bayesian in nature, are developed and studied theoretically. We demonstrate the importance of selective shrinkage for effective variable selection in terms of risk misclassification, and show this is achieved using the posterior from a rescaled spike and slab model. We also show how to verify a procedure’s ability to reduce model uncertainty in finite samples using a specialized forward selection strategy. Using this tool, we illustrate the effectiveness of rescaled spike and slab models in reducing model uncertainty.
An exploration of aspects of Bayesian multiple testing
Journal of Statistical Planning and Inference, 2005
Cited by 31 (6 self)
Abstract:
There has been increased interest of late in the Bayesian approach to multiple testing (often called the multiple comparisons problem), motivated by the need to analyze DNA microarray data in which it is desired to learn which of potentially several thousand genes are activated by a particular stimulus. We study the issue of prior specification for such multiple tests; computation of key posterior quantities; and useful ways to display these quantities. A decision-theoretic approach is also considered.
Policy Evaluation in Uncertain Economic Environments (with discussion)
Brookings Papers on Economic Activity, 2003
Cited by 31 (5 self)
Abstract:
It will be remembered that the seventy translators of the Septuagint were shut up in seventy separate rooms with the Hebrew text and brought out with them, when they emerged, seventy identical translations. Would the same miracle be vouchsafed if seventy multiple correlators were shut up with the same statistical material? And anyhow, I suppose, if each had a different economist perched on his a priori, that would make a difference to the outcome. This paper describes some approaches to macroeconomic policy evaluation in the presence of uncertainty about the structure of the economic environment under study. The perspective we discuss is designed to facilitate policy evaluation for several forms of uncertainty. For example, our approach may be used when an analyst is unsure about the appropriate economic theory that should be assumed to apply, or about the particular functional forms that translate a general theory into a form amenable to statistical analysis. As such, the methods we describe are, we believe, particularly useful in a range of macroeconomic contexts where fundamental disagreements exist as to the determinants of the problem under study. In addition, this approach recognizes that even if economists agree on the …
On the effect of prior assumptions in Bayesian model averaging with applications to growth regression
2008
Cited by 30 (3 self)
Abstract:
We consider the problem of variable selection in linear regression models. Bayesian model averaging has become an important tool in empirical settings with large numbers of potential regressors and relatively limited numbers of observations. We examine the effect of a variety of prior assumptions on the inference concerning model size, posterior inclusion probabilities of regressors and on predictive performance. We illustrate these issues in the context of cross-country growth regressions using three datasets with 41 to 67 potential drivers of growth and 72 to 93 observations. Finally, we recommend priors for use in this and related contexts.
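The quantities this abstract turns on — posterior model probabilities and posterior inclusion probabilities of regressors — are simple functions of the models' marginal likelihoods and prior probabilities. A minimal sketch follows; the function names and every numeric score are illustrative, not taken from the paper:

```python
import math

def bma_weights(log_margliks, log_priors):
    """Posterior model probabilities under Bayesian model averaging:
    w_k proportional to p(M_k) * p(y | M_k), normalized over models.
    Computed in log space (log-sum-exp) for numerical stability."""
    logs = [ml + lp for ml, lp in zip(log_margliks, log_priors)]
    m = max(logs)
    unnorm = [math.exp(l - m) for l in logs]
    total = sum(unnorm)
    return [u / total for u in unnorm]

def inclusion_probability(weights, contains_regressor):
    """Posterior inclusion probability of one regressor: the total
    weight of the models that include it."""
    return sum(w for w, inc in zip(weights, contains_regressor) if inc)

# made-up scores for four candidate models over two regressors
log_margliks = [-10.0, -8.0, -9.0, -8.5]
log_priors = [math.log(0.25)] * 4          # uniform model prior
w = bma_weights(log_margliks, log_priors)
pip_x1 = inclusion_probability(w, [False, True, False, True])
```

The prior-sensitivity question the paper studies enters through `log_priors`: replacing the uniform model prior with one favoring small models changes both the weights and the inclusion probabilities.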
Estimating the integrated likelihood via posterior simulation using the harmonic mean identity
Bayesian Statistics, 2007
Cited by 24 (2 self)
Abstract:
The integrated likelihood (also called the marginal likelihood or the normalizing constant) is a central quantity in Bayesian model selection and model averaging. It is defined as the integral over the parameter space of the likelihood times the prior density. The Bayes factor for model comparison and Bayesian testing is a ratio of integrated likelihoods, and the model weights in Bayesian model averaging are proportional to the integrated likelihoods. We consider the estimation of the integrated likelihood from posterior simulation output, aiming at a generic method that uses only the likelihoods from the posterior simulation iterations. The key is the harmonic mean identity, which says that the reciprocal of the integrated likelihood is equal to the posterior harmonic mean of the likelihood. The simplest estimator based on the identity is thus the harmonic mean of the likelihoods. While this is an unbiased and simulation-consistent estimator, its reciprocal can have infinite variance and so it is unstable in general. We describe two methods for stabilizing the harmonic mean estimator. In the first one, the parameter space is reduced in such a way that the modified estimator involves a harmonic mean of heavier-tailed densities, thus resulting in a finite-variance estimator. …
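The basic estimator this abstract describes can be sketched directly: given log-likelihood values recorded at posterior simulation draws, the harmonic mean identity 1/p(y) = E_post[1/p(y|θ)] yields the estimate below. This is only the simple, unstabilized estimator; the function name and toy input are illustrative:

```python
import math

def log_harmonic_mean_estimator(log_liks):
    """Log of the harmonic mean estimate of the integrated likelihood.

    Harmonic mean identity: 1 / p(y) equals the posterior mean of
    1 / p(y | theta), so p(y) is estimated by the harmonic mean of the
    likelihoods at posterior draws. Computed in log space via the
    log-sum-exp trick to avoid overflow when exponentiating -log_lik.
    """
    n = len(log_liks)
    m = max(-l for l in log_liks)
    # log of (1/n) * sum_i exp(-log_lik_i)
    log_mean_inverse = m + math.log(sum(math.exp(-l - m) for l in log_liks) / n)
    return -log_mean_inverse

# sanity check: with a constant likelihood the estimate is exact
print(log_harmonic_mean_estimator([-1.0] * 100))   # -1.0
```

The instability the abstract warns about shows up when a few draws have very small likelihood: those draws dominate the sum of reciprocals, which is exactly what the paper's stabilization methods address.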
Efficient empirical Bayes variable selection and estimation in linear models
J. Amer. Statist. Assoc., 2005
Cited by 23 (4 self)
Abstract:
We propose an empirical Bayes method for variable selection and coefficient estimation in linear regression models. The method is based on a particular hierarchical Bayes formulation, and the empirical Bayes estimator is shown to be closely related to the LASSO estimator. Such a connection allows us to take advantage of the recently developed quick LASSO algorithm to compute the empirical Bayes estimate, and provides a new way to select the tuning parameter in the LASSO method. Unlike previous empirical Bayes variable selection methods, which in most practical situations can only be implemented through a greedy stepwise algorithm, our method gives a global solution efficiently. Simulations and real examples show that the proposed method is very competitive in terms of variable selection, estimation accuracy, and computation speed when compared with other variable selection and estimation methods.
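The LASSO connection mentioned in this abstract can be made concrete in the textbook orthonormal-design case, where the LASSO coefficient estimate is the soft-thresholded least-squares coefficient. This standard identity (not code from the paper) shows how a single penalty both selects variables and shrinks estimates:

```python
def soft_threshold(z, lam):
    """LASSO solution for one coefficient under orthonormal design:
    argmin_b (z - b)^2 / 2 + lam * |b|  =  sign(z) * max(|z| - lam, 0)."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

# small coefficients are set exactly to zero (variable selection),
# large ones are shrunk toward zero by lam (estimation shrinkage)
print([soft_threshold(z, 1.0) for z in (3.0, 0.5, -2.0)])   # [2.0, 0.0, -1.0]
```

Choosing `lam` is the tuning problem the paper attacks: its empirical Bayes formulation selects the penalty level from the data rather than by cross-validation.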
Objective Bayesian variable selection
Journal of the American Statistical Association, 2006
Cited by 18 (4 self)
Abstract:
A novel fully automatic Bayesian procedure for variable selection in the normal regression model is proposed. The procedure uses the posterior probabilities of the models to drive a stochastic search. The posterior probabilities are computed using intrinsic priors, which can be considered default priors for model selection problems. That is, they are derived from the model structure and are free from tuning parameters; thus, they can be seen as objective priors for variable selection. The stochastic search is based on a Metropolis-Hastings algorithm with a stationary distribution proportional to the model posterior probabilities. The procedure is illustrated on both simulated and real examples.
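The stochastic-search idea can be sketched generically: a Metropolis-Hastings chain over variable-inclusion vectors whose stationary distribution is proportional to an unnormalized model score. The `score` function below is a made-up stand-in for the intrinsic-prior posterior probabilities used in the paper:

```python
import random

def mh_model_search(score, n_vars, n_iter, seed=0):
    """Metropolis-Hastings over inclusion vectors (tuples of 0/1).

    Proposal: flip one randomly chosen coordinate. Accepting with
    probability min(1, score(new) / score(old)) makes the chain's
    stationary distribution proportional to score(model).
    """
    rng = random.Random(seed)
    model = tuple([0] * n_vars)
    visits = {}
    for _ in range(n_iter):
        j = rng.randrange(n_vars)
        proposal = list(model)
        proposal[j] = 1 - proposal[j]
        proposal = tuple(proposal)
        if rng.random() < min(1.0, score(proposal) / score(model)):
            model = proposal
        visits[model] = visits.get(model, 0) + 1
    return visits

# made-up score: including variable 0 helps, variable 1 hurts
score = lambda m: (4.0 if m[0] else 1.0) * (0.25 if m[1] else 1.0)
visits = mh_model_search(score, n_vars=2, n_iter=20000)
best = max(visits, key=visits.get)   # most visited model
```

Because the acceptance ratio needs only a score ratio, the normalizing constant over all 2^p models never has to be computed, which is what makes the search feasible for large p.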
Transdimensional Markov Chains: A Decade of Progress and Future Perspectives
Journal of the American Statistical Association, 2005
Cited by 18 (2 self)
Abstract:
The last ten years have witnessed the development of sampling frameworks that permit the construction of Markov chains which simultaneously traverse both parameter and model space. In this time substantial methodological progress has been made. In this article we present a survey of the current state of the art and evaluate some of the most recent advances in this field. We also discuss future research perspectives in the context of the drive to develop sampling mechanisms with high degrees of both efficiency and automation.