Results 1–9 of 9
The practical implementation of Bayesian model selection
Institute of Mathematical Statistics, 2001
Abstract

Cited by 84 (3 self)
In principle, the Bayesian approach to model selection is straightforward. Prior probability distributions are used to describe the uncertainty surrounding all unknowns. After observing the data, the posterior distribution provides a coherent post-data summary of the remaining uncertainty that is relevant for model selection. However, the practical implementation of this approach often requires carefully tailored priors and novel posterior calculation methods. In this article, we illustrate some of the fundamental practical issues that arise for two different model selection problems: the variable selection problem for the linear model and the CART model selection problem.
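The mechanics sketched in this abstract's opening sentences — priors over models, Bayes' rule, posterior model probabilities — reduce to simple arithmetic once marginal likelihoods are available. A minimal numerical sketch (all values hypothetical, not taken from the paper):

```python
import numpy as np

# Hypothetical numbers: two competing models with equal prior probabilities
# and assumed marginal likelihoods p(data | M_k).
prior = np.array([0.5, 0.5])               # P(M0), P(M1)
marg_lik = np.array([1e-4, 3e-4])          # p(y | M0), p(y | M1)

unnorm = prior * marg_lik
posterior = unnorm / unnorm.sum()          # Bayes' rule over the model space
bayes_factor = marg_lik[1] / marg_lik[0]   # B_10, evidence for M1 over M0
```

The hard part, as the abstract notes, is not this arithmetic but obtaining the marginal likelihoods, which requires the carefully tailored priors and posterior calculations the paper discusses.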
Variable selection and Bayesian model averaging in case-control studies
1998
Abstract

Cited by 19 (7 self)
Covariate and confounder selection in case-control studies is most commonly carried out using either a two-step method or a stepwise variable selection method in logistic regression. Inference is then carried out conditionally on the selected model, but this ignores the model uncertainty implicit in the variable selection process, and so underestimates uncertainty about relative risks. We report on a simulation study designed to be similar to actual case-control studies. This shows that p-values computed after variable selection can greatly overstate the strength of conclusions. For example, for our simulated case-control studies with 1,000 subjects, of variables declared to be "significant" with p-values between .01 and .05, only 49% actually were risk factors when stepwise variable selection was used. We propose Bayesian model averaging as a formal way of taking account of model uncertainty in case-control studies. This yields an easily interpreted summary, the posterior probability that a variable is a risk factor, and our simulation study indicates this to be reasonably well calibrated in the situations simulated. The methods are applied and compared
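The "posterior probability that a variable is a risk factor" described here is the total posterior weight of the models that include that variable. A toy sketch of that averaging step on synthetic linear-regression data, using the common exp(-BIC/2) approximation to posterior model weights (the paper itself works with logistic regression and real case-control data; everything below is illustrative):

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
n, p = 200, 4
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 0] + rng.normal(size=n)        # only variable 0 truly matters

def bic_linear(Xs, y):
    """BIC of a Gaussian linear model with design Xs plus an intercept."""
    m = len(y)
    A = np.column_stack([np.ones(m), Xs]) if Xs.shape[1] > 0 else np.ones((m, 1))
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = np.sum((y - A @ beta) ** 2)
    k = A.shape[1] + 1                        # coefficients + error variance
    return m * np.log(rss / m) + k * np.log(m)

# Enumerate all 2^p variable subsets, weight each by exp(-BIC/2).
models, logw = [], []
for size in range(p + 1):
    for subset in combinations(range(p), size):
        models.append(set(subset))
        logw.append(-0.5 * bic_linear(X[:, list(subset)], y))
w = np.exp(np.array(logw) - max(logw))
w /= w.sum()

# Posterior inclusion probability: total weight of models containing each variable.
incl = np.array([sum(wi for wi, m in zip(w, models) if j in m) for j in range(p)])
```

The true signal variable ends up with inclusion probability near 1, while the noise variables get small probabilities — the "easily interpreted summary" the abstract refers to.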
Bayes Factors and BIC: Comment on “A Critique of the Bayesian Information Criterion for Model Selection”
1999
Abstract

Cited by 12 (5 self)
I would like to thank David L. Weakliem (1999 [this issue]) for a thought-provoking discussion of the basis of the Bayesian information criterion (BIC). We may be in closer agreement than one might think from reading his article. When writing about Bayesian model selection for social researchers, I focused on the BIC approximation on the grounds that it is easily implemented and often reasonable, and simplifies the exposition of an already technical topic. As Weakliem says, BIC corresponds to one of many possible priors, although I will argue that this prior is such as to make BIC appropriate for baseline reference use and reporting, albeit not necessarily always appropriate for drawing final conclusions. When writing about the same subject for statistical journals, however, I have paid considerable attention to the choice of priors for Bayes factors. I thank Weakliem for bringing this subtle but important topic to the attention of sociologists. In 1986, I proposed replacing P values by Bayes factors as the basis for hypothesis testing and model selection in social research, and I suggested BIC as a simple and convenient, albeit crude, approximation. Since then, a great deal has been learned about Bayes factors in general, and about BIC in particular. Weakliem seems to agree that the Bayes factor framework is a useful one for hypothesis testing and model selection; his concern is with how the Bayes factors are to be evaluated. Weakliem makes two main points about the BIC approximation. The first is that BIC yields an approximation to Bayes factors that corresponds closely to a particular prior (the unit information prior) on
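The BIC approximation discussed throughout this comment is concrete: with BIC_k = -2 log L_k + d_k log n for model k with maximized log-likelihood L_k and d_k free parameters, the difference BIC_0 - BIC_1 approximates twice the log Bayes factor for the larger model. A tiny arithmetic sketch with made-up log-likelihoods:

```python
import numpy as np

# Made-up log-likelihoods for two nested models fit to n = 1000 observations.
n = 1000
logL0, d0 = -520.0, 3    # smaller model, 3 free parameters
logL1, d1 = -515.0, 4    # larger model, one extra parameter

bic0 = -2 * logL0 + d0 * np.log(n)
bic1 = -2 * logL1 + d1 * np.log(n)

# The BIC difference approximates 2 * log(Bayes factor) for the larger model:
two_log_bf = bic0 - bic1    # = 2*(logL1 - logL0) - log(n)
```

Note how the log(n) penalty grows with sample size: the same likelihood gain that would look decisive by a fixed-level P value can yield only weak Bayes-factor evidence at large n, which is the crux of the BIC-versus-P-value disagreement the comment describes.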
Simultaneous Variable and Transformation Selection in Linear Regression
JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 1995
Abstract

Cited by 7 (5 self)
We suggest a method for simultaneous variable and transformation selection based on posterior probabilities. A simultaneous approach avoids the problem that the order in which the two selections are done might influence the choice of variables and transformations, and it also allows for consideration of all possible models. We use a change-point model, or "change-point transformation", which often yields more interpretable models and transformations than the standard Box-Tidwell approach. We also address the problem of model uncertainty in the selection of models. By averaging over models, we account for the uncertainty inherent in inference based on a single model chosen from the set of all possible models. We use a Markov chain Monte Carlo model composition (MC³) method, which allows us to average over linear regression models when the space of all possible models is very large, considering the selection of variables and transformations at the same time. In an example, we ...
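MC³ itself is a Metropolis walk on the space of variable subsets, proposing to add or drop one variable at a time and averaging over the models visited. A stripped-down sketch on synthetic data, using exp(-BIC/2) as a stand-in for the unnormalized posterior model probability (the paper's method additionally moves over transformations, which this sketch omits):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 150, 5
X = rng.normal(size=(n, p))
y = 1.5 * X[:, 1] + rng.normal(size=n)        # only variable 1 truly matters

def log_score(gamma):
    """Approximate log posterior (-BIC/2) of the model indicated by gamma."""
    cols = [np.ones(n)] + [X[:, j] for j in range(p) if gamma[j]]
    A = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = np.sum((y - A @ beta) ** 2)
    return -0.5 * (n * np.log(rss / n) + (A.shape[1] + 1) * np.log(n))

# Metropolis walk over {0,1}^p: propose flipping one variable in or out.
gamma = np.zeros(p, dtype=int)
cur = log_score(gamma)
visits = np.zeros(p)
steps = 2000
for _ in range(steps):
    j = rng.integers(p)
    cand = gamma.copy()
    cand[j] ^= 1                              # add or drop variable j
    new = log_score(cand)
    if np.log(rng.uniform()) < new - cur:     # accept with prob min(1, ratio)
        gamma, cur = cand, new
    visits += gamma                           # accumulate inclusion counts

inclusion_freq = visits / steps               # Monte Carlo inclusion probabilities
```

Because the chain only ever evaluates the models it visits, this averages over an enormous model space without enumerating it — the property the abstract highlights.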
Bayes Factors and BIC: Comment on Weakliem
1998
Abstract

Cited by 3 (0 self)
Weakliem agrees that Bayes factors are useful for model selection and hypothesis testing. He reminds us that the simple and convenient BIC approximation corresponds most closely to one particular prior on the parameter space, the unit information prior, and points out that researchers may have different prior information or opinions. Clearly a prior that represents the available information should be used, although the unit information prior often seems reasonable in the absence of strong prior information. It seems that, among the Bayes factors likely to be used in practice, BIC is conservative in the sense of tending to provide less evidence for additional parameters or "effects". Thus if a Bayes factor based on additional prior information favors an effect, but BIC does not, the prior information is playing a crucial role, and this should be made clear when the research is reported. BIC may well have a role as a baseline reference analysis to be provided in routine reporting of research results, perhaps along with Bayes factors based on other priors. In Weakliem's 2 × 2 table examples, BIC and Bayes factors based on Weakliem's preferred priors lead to similar substantive conclusions, but both differ from those based on P values. When there is additional prior information, the technology now exists to express it as
A Hierarchical Bayes Approach to Variable Selection for Generalized Linear Models
2004
Abstract

Cited by 2 (0 self)
For the problem of variable selection in generalized linear models, we develop various adaptive Bayesian criteria. Using a hierarchical mixture setup for model uncertainty, combined with an integrated Laplace approximation, we derive Empirical Bayes and Fully Bayes criteria that can be computed easily and quickly. The performance of these criteria is assessed via simulation and compared to other criteria such as AIC and BIC on normal, logistic and Poisson regression model classes. A Fully Bayes criterion based on a restricted region hyperprior seems to be the most promising.
Bayesian Variable Selection in Qualitative Models by Kullback-Leibler Projections
1998
Abstract

Cited by 1 (1 self)
The Bayesian variable selection method proposed in the paper is based on the evaluation of the Kullback-Leibler distance between the full (or encompassing) model and the submodels. The implementation of the method does not require a separate prior modeling on the submodels, since the corresponding parameters for the submodels are defined as the Kullback-Leibler projections of the full model parameters. The result of the selection procedure is the submodel with the smallest number of covariates which is at an acceptable distance from the full model. We introduce the notion of explanatory power of a model and scale the maximal acceptable distance in terms of the explanatory power of the full model. Moreover, an additivity property between embedded submodels shows that our selection procedure is equivalent to selecting the submodel with the smallest number of covariates which has a sufficient explanatory power. We illustrate the performance of this method on a breast cancer dataset, where they...
ADAPTIVE BAYESIAN CRITERIA IN VARIABLE SELECTION FOR GENERALIZED LINEAR MODELS
Abstract

Cited by 1 (0 self)
For the problem of variable selection in generalized linear models, we develop various adaptive Bayesian criteria. Using a hierarchical mixture setup for model uncertainty, combined with an integrated Laplace approximation, we derive Empirical Bayes and Fully Bayes criteria that can be computed easily and quickly. The performance of these criteria is assessed via simulation and compared to other criteria such as AIC and BIC on normal, logistic and Poisson regression model classes. A Fully Bayes criterion based on a restricted region hyperprior seems to be the most promising. Finally, our criteria are illustrated and compared with competitors on a data example.
Metabolic Profile of Breast Cancer in a Population of Women in Southern
Abstract
Background: There are indications that mortality in breast cancer is related to dietary factors, but no study has been large enough to characterise reliably how this risk is influenced. To establish a logistic regression equation that would predict breast cancer from factors in the endocrinological and metabolic profile, we studied endocrinological and metabolic risk factors that are modified by the diet, in a population of women with breast cancer in southern Spain. Patients and Methods: We carried out a simple case-control study comparing 204 women with breast cancer (96 premenopausal and 108 postmenopausal women) and 250 healthy control subjects. The predictive variables were basal glycaemia, insulin, glycosylated haemoglobin (HbA1c), C-peptide, insulin-like growth factor-I (IGF-I), total cholesterol, triglycerides, high-density lipoprotein-C (HDL-C), low-density lipoprotein-C (LDL-C), selenium and Quetelet index