Results 1–10 of 16
Bayesian model averaging
 Statistical Science
, 1999
Abstract

Cited by 42 (0 self)
Standard statistical practice ignores model uncertainty. Data analysts typically select a model from some class of models and then proceed as if the selected model had generated the data. This approach ignores the uncertainty in model selection, leading to overconfident inferences and decisions that are more risky than one thinks they are. Bayesian model averaging (BMA) provides a coherent mechanism for accounting for this model uncertainty. Several methods for implementing BMA have recently emerged. We discuss these methods and present a number of examples. In these examples, BMA provides improved out-of-sample predictive performance. We also provide a catalogue of...
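The core BMA mechanism described in this abstract, weighting each candidate model's prediction by its posterior model probability, can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it enumerates predictor subsets of a toy linear regression and approximates posterior model weights with BIC, a common shortcut; all data here are simulated.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 0] + rng.normal(size=n)  # only the first predictor matters

def fit_ols(Xs, y):
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    resid = y - Xs @ beta
    return beta, resid @ resid / len(y)

# Enumerate all non-empty predictor subsets as candidate models.
models, bics = [], []
for k in range(1, p + 1):
    for idx in combinations(range(p), k):
        beta, sigma2 = fit_ols(X[:, idx], y)
        loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
        # BIC approximates -2 log marginal likelihood of the model.
        bics.append(-2 * loglik + len(idx) * np.log(n))
        models.append(idx)

bics = np.array(bics)
# Posterior model weights under a uniform model prior (BIC approximation).
w = np.exp(-0.5 * (bics - bics.min()))
w /= w.sum()

# Model-averaged prediction at a new point: weight each model's prediction.
x_new = np.array([1.0, 0.0, 0.0])
pred = 0.0
for idx, wk in zip(models, w):
    beta, _ = fit_ols(X[:, idx], y)
    pred += wk * (x_new[list(idx)] @ beta)
# pred should land near the true effect of 2.0, with weight
# concentrated on models that include the first predictor.
```

The weights exp(-BIC/2) stand in for the posterior model probabilities that an exact BMA computation would obtain from marginal likelihoods.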
Accounting for Model Uncertainty in Survival Analysis Improves Predictive Performance
 In Bayesian Statistics 5
, 1995
Abstract

Cited by 39 (12 self)
Survival analysis is concerned with finding models to predict the survival of patients or to assess the efficacy of a clinical treatment. A key part of the model-building process is the selection of the predictor variables. It is standard to use a stepwise procedure guided by a series of significance tests to select a single model, and then to make inference conditionally on the selected model. However, this ignores model uncertainty, which can be substantial. We review the standard Bayesian model averaging solution to this problem and extend it to survival analysis, introducing partial Bayes factors to do so for the Cox proportional hazards model. In two examples, taking account of model uncertainty enhances predictive performance, to an extent that could be clinically useful.

1 Introduction
From 1974 to 1984 the Mayo Clinic conducted a double-blinded randomized clinical trial involving 312 patients to compare the drug DPCA with a placebo in the treatment of primary biliary cirrhosis...
Inference in long-horizon event studies: A Bayesian approach with an application to initial public offerings
 Journal of Finance
, 2000
Abstract

Cited by 38 (3 self)
Statistical inference in long-horizon event studies has been hampered by the fact that abnormal returns are neither normally distributed nor independent. This study presents a new approach to inference that overcomes these difficulties and dominates other popular testing methods. I illustrate the use of the methodology by examining the long-horizon returns of initial public offerings (IPOs). I find that the Fama and French (1993) three-factor model is inconsistent with the observed long-horizon price performance of these IPOs, whereas a characteristic-based model cannot be rejected.

Recent empirical studies in finance document systematic long-run abnormal price reactions subsequent to numerous corporate activities. Since these results imply that stock prices react with a long delay to publicly available information, they appear to be at odds with the Efficient Markets Hypothesis (EMH). Long-run event studies, however, are subject to serious statistical difficulties...
Statistical Methods for Eliciting Probability Distributions
 Journal of the American Statistical Association
, 2005
Abstract

Cited by 32 (1 self)
Elicitation is a key task for subjectivist Bayesians. While skeptics hold that it cannot (or perhaps should not) be done, in practice it brings statisticians closer to their clients and subject-matter-expert colleagues. This paper reviews the state of the art, reflecting the experience of statisticians informed by the fruits of a long line of psychological research into how people represent uncertain information cognitively, and how they respond to questions about that information. In a discussion of the elicitation process, the first issue to address is what it means for an elicitation to be successful, i.e. what criteria should be employed? Our answer is that a successful elicitation faithfully represents the opinion of the person being elicited. It is not necessarily “true” in some objectivistic sense, and cannot be judged that way. We see elicitation as simply part of the process of statistical modeling. Indeed, in a hierarchical model it is ambiguous at which point the likelihood ends and the prior begins. Thus the same kinds of judgment that inform statistical modeling in general also inform elicitation of prior distributions.
Bayesian Variable Selection for Proportional Hazards Models
, 1996
Abstract

Cited by 15 (1 self)
The authors consider the problem of Bayesian variable selection for proportional hazards regression models with right-censored data. They propose a semiparametric approach in which a nonparametric prior is specified for the baseline hazard rate and a fully parametric prior is specified for the regression coefficients. For the baseline hazard, they use a discrete gamma process prior, and for the regression coefficients and the model space, they propose a semiautomatic parametric informative prior specification that focuses on the observables rather than the parameters. To implement the methodology, they propose a Markov chain Monte Carlo method to compute the posterior model probabilities. Examples using simulated and real data are given to demonstrate the methodology.
Accounting for input-model and input-parameter uncertainties in simulation
, 2004
Abstract

Cited by 9 (0 self)
To account for the input-model and input-parameter uncertainties inherent in many simulations as well as the usual stochastic uncertainty, we present a Bayesian input-modeling technique that yields improved point and confidence-interval estimators for a selected posterior mean response. Exploiting prior information to specify the prior probabilities of the postulated input models and the associated prior input-parameter distributions, we use sample data to compute the posterior input-model and input-parameter distributions. Our Bayesian simulation replication algorithm involves: (i) estimating parameter uncertainty by randomly sampling the posterior input-parameter distributions; (ii) estimating stochastic uncertainty by running independent replications of the simulation using each set of input-model parameters sampled in (i); and (iii) estimating input-model uncertainty by weighting the responses generated in (ii) using the corresponding posterior input-model probabilities. Sampling effort is allocated among input models to minimize final point-estimator variance subject to a computing-budget constraint. A queueing simulation demonstrates the advantages of this approach.
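The three-step replication algorithm summarized in this abstract can be illustrated with a deliberately simplified sketch. Everything here is a hypothetical stand-in: the two candidate service-time models, their posterior probabilities, and the Gamma posterior for the service rate are assumed values, and the "simulation" is just the mean of sampled service times rather than the authors' queueing model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical posterior input-model probabilities for two candidate
# service-time distributions (values assumed for illustration).
post_model_prob = {"exponential": 0.7, "lognormal": 0.3}

def sample_rate_posterior():
    # Step (i): parameter uncertainty -- draw the service rate from an
    # assumed Gamma posterior (conjugate for exponential data).
    return rng.gamma(shape=20.0, scale=1.0 / 10.0)

def simulate_response(model, rate, n_customers=200):
    # Toy "simulation": average service time of n_customers.
    if model == "exponential":
        times = rng.exponential(1.0 / rate, size=n_customers)
    else:
        times = rng.lognormal(mean=-np.log(rate), sigma=0.25, size=n_customers)
    return times.mean()

n_param_draws, n_reps = 50, 10
model_means = {}
for model in post_model_prob:
    responses = []
    for _ in range(n_param_draws):        # (i) sample input parameters
        rate = sample_rate_posterior()
        for _ in range(n_reps):           # (ii) independent replications
            responses.append(simulate_response(model, rate))
    model_means[model] = np.mean(responses)

# Step (iii): weight per-model estimates by posterior input-model probabilities.
bma_estimate = sum(post_model_prob[m] * model_means[m] for m in post_model_prob)
```

The paper additionally allocates the replication budget across input models to minimize estimator variance; the sketch uses equal effort per model for simplicity.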
Enhancing the Predictive Performance of Bayesian Graphical Models
 Communications in Statistics – Theory and Methods
, 1995
Abstract

Cited by 7 (4 self)
Both knowledge-based systems and statistical models are typically concerned with making predictions about future observables. Here we focus on assessment of predictive performance and provide two techniques for improving the predictive performance of Bayesian graphical models. First, we present Bayesian model averaging, a technique for accounting for model uncertainty. Second, we describe a technique for eliciting a prior distribution for competing models from domain experts. We explore the predictive performance of both techniques in the context of a urological diagnostic problem.

KEYWORDS: Prediction; Bayesian graphical model; Bayesian network; Decomposable model; Model uncertainty; Elicitation.

1 Introduction
Both statistical methods and knowledge-based systems are typically concerned with combining information from various sources to make inferences about prospective measurements. Inevitably, to combine information, we must make modeling assumptions. It follows that we should car...
Extending conventional priors for testing general hypotheses
 Biometrika
, 2007
Abstract

Cited by 7 (3 self)
In this paper, we consider that observations Y come from a general normal linear model and that it is desired to test a simplifying (null) hypothesis about the parameters. We approach this problem from an objective Bayesian, model selection perspective. Crucial ingredients for this approach are ‘proper objective priors’ to be used for deriving the Bayes factors. Jeffreys-Zellner-Siow priors have been shown to have good properties for testing null hypotheses defined by specific values of the parameters in full-rank linear models. We extend these priors to deal with general hypotheses in general linear models, not necessarily of full rank. The resulting priors, which we call ‘conventional priors’, are expressed as a generalization of the recently introduced ‘partially informative distributions’. The corresponding Bayes factors are fully automatic, easy to compute and very reasonable. The methodology is illustrated for two popular problems: the change-point problem and the equality-of-treatment-effects problem. We compare the conventional priors derived for these problems with other objective Bayesian proposals such as the intrinsic priors. We conclude that both priors behave similarly, although interesting subtle differences arise. Finally, we accommodate the conventional priors to deal with non-nested model selection as well as multiple model comparison.
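The Bayes factors this paper automates can be shown in their simplest form. The sketch below replaces the paper's Jeffreys-Zellner-Siow-style conventional priors (which require integrating over a scale parameter) with a plain normal prior and known noise variance, so the two marginal likelihoods are available in closed form; the sample mean, sample size, and prior scale are made-up illustration values.

```python
import math

def normal_pdf(x, sd):
    # Density of N(0, sd^2) at x.
    return math.exp(-0.5 * (x / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

# Hypothetical data summary: sample mean 0.5 from n = 30 observations
# with known noise sd sigma = 1; under H1 the prior on mu is N(0, tau^2).
n, sigma, tau, ybar = 30, 1.0, 1.0, 0.5

# Marginal density of the sample mean under each hypothesis:
#   H0: mu = 0            ->  ybar ~ N(0, sigma^2 / n)
#   H1: mu ~ N(0, tau^2)  ->  ybar ~ N(0, sigma^2 / n + tau^2)
m0 = normal_pdf(ybar, sigma / math.sqrt(n))
m1 = normal_pdf(ybar, math.sqrt(sigma**2 / n + tau**2))

bf10 = m1 / m0  # Bayes factor for a nonzero mean; well above 1 here
```

With a JZS-type prior the closed form above is replaced by a one-dimensional integral over the prior's scale, which is what makes the paper's "fully automatic" priors convenient.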
Variable selection for multivariate logistic regression models
 Journal of Statistical Planning and Inference
, 2003
Abstract

Cited by 5 (0 self)
In this paper, we use multivariate logistic regression models to incorporate correlation among binary response data. Our objective is to develop a variable subset selection procedure to identify important covariates in predicting correlated binary responses using a Bayesian approach. In order to incorporate available prior information, we propose a class of informative prior distributions on the model parameters and on the model space. The propriety of the proposed informative prior is investigated in detail. Novel computational algorithms are also developed for sampling from the posterior distribution as well as for computing posterior model probabilities. Finally, a simulated data example and a real data example from a prostate cancer study are used to illustrate the proposed methodology.