Results 1 – 9 of 9
Accounting for Model Uncertainty in Survival Analysis Improves Predictive Performance
 In Bayesian Statistics 5
, 1995
Abstract

Cited by 39 (12 self)
Survival analysis is concerned with finding models to predict the survival of patients or to assess the efficacy of a clinical treatment. A key part of the model-building process is the selection of the predictor variables. It is standard to use a stepwise procedure guided by a series of significance tests to select a single model, and then to make inference conditionally on the selected model. However, this ignores model uncertainty, which can be substantial. We review the standard Bayesian model averaging solution to this problem and extend it to survival analysis, introducing partial Bayes factors to do so for the Cox proportional hazards model. In two examples, taking account of model uncertainty enhances predictive performance, to an extent that could be clinically useful.
1 Introduction
From 1974 to 1984 the Mayo Clinic conducted a double-blinded randomized clinical trial involving 312 patients to compare the drug DPCA with a placebo in the treatment of primary biliary cirrhosis...
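The model-averaging idea in this abstract reduces to a weighted sum: each candidate model's prediction is weighted by its posterior model probability. A minimal sketch, with entirely illustrative numbers (not taken from the paper):

```python
import numpy as np

# Bayesian model averaging (BMA): the posterior predictive probability of an
# event is the average of each model's prediction, weighted by that model's
# posterior probability. All numbers below are made up for illustration.

# Posterior model probabilities P(M_k | data), e.g. derived from Bayes factors
post_model_prob = np.array([0.50, 0.30, 0.20])

# Each model's predicted 5-year survival probability for one patient
pred_per_model = np.array([0.72, 0.60, 0.55])

# BMA prediction: sum over k of P(M_k | data) * P(survival | M_k, data)
bma_pred = float(post_model_prob @ pred_per_model)
print(bma_pred)  # 0.5*0.72 + 0.3*0.60 + 0.2*0.55 = 0.65
```

A single selected model would report 0.72 or 0.55 outright; averaging spreads the prediction across the plausible models instead of conditioning on one of them.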
Toward evidence-based medical statistics. 2: The Bayes factor
 Annals of Internal Medicine
, 1999
Abstract

Cited by 22 (0 self)
Bayesian inference is usually presented as a method for determining how scientific belief should be modified by data. Although Bayesian methodology has been one of the most active areas of statistical development in the past 20 years, medical researchers have been reluctant to embrace what they perceive as a subjective approach to data analysis. It is little understood that Bayesian methods have a data-based core, which can be used as a calculus of evidence. This core is the Bayes factor, which in its simplest form is also called a likelihood ratio. The minimum Bayes factor is objective and can be used in lieu of the P value as a measure of evidential strength. Unlike P values, Bayes factors have a sound theoretical foundation and an interpretation that allows their use in both inference and decision making. Bayes factors show that P values greatly overstate the evidence against the null hypothesis. Most important, Bayes factors require the addition of background knowledge to be transformed into inferences—probabilities that a given conclusion is right or wrong. They make the distinction clear between experimental evidence and inferential conclusions while providing a framework in which to combine prior with current evidence.
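The claim that P values overstate the evidence can be made concrete with Goodman's minimum Bayes factor for a two-sided test, exp(-z²/2), where z is the normal quantile corresponding to the P value. A minimal sketch:

```python
import math
from statistics import NormalDist

# Minimum Bayes factor (Goodman): the strongest possible evidence against the
# null that a given two-sided P value can represent, min BF = exp(-z^2 / 2).

def min_bayes_factor(p_value: float) -> float:
    # z statistic corresponding to a two-sided P value under a normal model
    z = NormalDist().inv_cdf(1 - p_value / 2)
    return math.exp(-z * z / 2)

bf = min_bayes_factor(0.05)
# bf is roughly 0.15: even at best, P = 0.05 makes the data only about 1/7 as
# probable under the null as under the alternative -- far weaker than the
# "1 in 20" reading the P value invites.
```

The quantities here follow Goodman's formula as stated in the paper's lineage; the exact numeric framing of "1/7" is an approximation.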
Bayesian belief network model for the safety assessment of nuclear computer-based systems. Second year report part 2, Esprit Long Term Research Project 20072DeVa
, 1998
Abstract

Cited by 10 (3 self)
The formalism of Bayesian Belief Networks (BBNs) is being increasingly applied to probabilistic modelling and decision problems in a widening variety of fields. This method provides the advantages of a formal probabilistic model, presented in an easily assimilated visual form, together with the ready availability of efficient computational methods and tools for exploring model consequences. Here we formulate one BBN model of a part of the safety assessment task for computer- and software-based nuclear systems important to safety. Our model is developed from the perspective of an independent safety assessor who is presented with the task of evaluating evidence from disparate sources: the requirement specification and verification documentation of the system licensee and of the system manufacturer; the previous reputation of the various participants in the design process; knowledge of commercial pressures; information about tools and resources used; and many other sources. Based on these multiple sources of...
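The core mechanism a BBN applies to such evidence is conditional-probability updating. A deliberately tiny two-node sketch, with hypothetical numbers that are not taken from the report:

```python
# Minimal two-node belief-network update, in the spirit of combining
# assessment evidence: a hidden "development process quality" node and an
# observed "documentation passes independent review" node. All probabilities
# are illustrative assumptions.

# Prior on the hidden node
p_good = 0.7

# Conditional probabilities of the evidence given each state of the parent
p_pass_given_good = 0.9
p_pass_given_poor = 0.3

# Posterior P(good | pass) by Bayes' rule
num = p_pass_given_good * p_good
den = num + p_pass_given_poor * (1 - p_good)
p_good_given_pass = num / den
print(p_good_given_pass)  # 0.63 / 0.72 = 0.875
```

A full BBN chains many such updates across a graph; tools for that propagation are what the abstract refers to as "efficient computational methods."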
Constructing Partial Prior Specifications for Models of Complex Physical Systems
, 1998
Abstract

Cited by 9 (2 self)
Many large scale problems, particularly in the physical sciences, are solved using complex high dimensional models whose outputs, for a given set of inputs, are expensive and time-consuming to evaluate. The complexity of such problems forces us to focus attention on those limited aspects of uncertainty which are directly relevant to the tasks for which the model will be used. We discuss methods for constructing the relevant partial prior specifications for these uncertainties, based on the prior covariance structure. Our approach combines two sources of prior knowledge. First, we elicit both qualitative and quantitative prior information based on expert prior judgements, using computer based elicitation tools for organising the complex collection of assessments in a systematic way. Secondly, we test and refine these judgements using detailed experiments based on versions of the model which are cheaper to evaluate. While the approach is quite general, we illustrate it in the context of matching hydrocarbon reservoir history.
KEYWORDS: Bayes linear methods; computer experiments; elicitation tools; history matching; variable selection; covariance specification
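The "Bayes linear" keyword points at the machinery that makes a purely covariance-based partial specification usable: the adjusted expectation E_D(B) = E(B) + Cov(B, D) Var(D)⁻¹ (d − E(D)), which needs only means and (co)variances, not full distributions. A sketch with illustrative numbers:

```python
import numpy as np

# Bayes linear adjustment: update a prior expectation for a quantity B from
# observed data d using only the prior second-order (covariance) structure.
# All numbers are illustrative, not from the paper.

E_B = np.array([1.0])             # prior expectation of B
E_D = np.array([2.0, 3.0])        # prior expectation of the data D
cov_BD = np.array([[0.5, 0.2]])   # Cov(B, D)
var_D = np.array([[1.0, 0.3],
                  [0.3, 2.0]])    # Var(D)
d_obs = np.array([2.5, 2.0])      # observed data

# Adjusted expectation: E_D(B) = E(B) + Cov(B, D) Var(D)^{-1} (d - E(D))
adj_E_B = E_B + cov_BD @ np.linalg.solve(var_D, d_obs - E_D)
```

Because only second-order judgements are required, this is exactly the kind of "limited aspect of uncertainty" the abstract says the expert need specify.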
Enhancing the Predictive Performance of Bayesian Graphical Models
 Communications in Statistics – Theory and Methods
, 1995
Abstract

Cited by 7 (4 self)
Both knowledge-based systems and statistical models are typically concerned with making predictions about future observables. Here we focus on assessment of predictive performance and provide two techniques for improving the predictive performance of Bayesian graphical models. First, we present Bayesian model averaging, a technique for accounting for model uncertainty. Second, we describe a technique for eliciting a prior distribution for competing models from domain experts. We explore the predictive performance of both techniques in the context of a urological diagnostic problem.
KEYWORDS: Prediction; Bayesian graphical model; Bayesian network; Decomposable model; Model uncertainty; Elicitation.
1 Introduction
Both statistical methods and knowledge-based systems are typically concerned with combining information from various sources to make inferences about prospective measurements. Inevitably, to combine information, we must make modeling assumptions. It follows that we should car...
The Elicitation of Probabilities: A Review of the Statistical Literature
, 2005
Abstract

Cited by 3 (0 self)
“We live in an uncertain world, and probability risk assessment deals as directly with that fact as anything we do. Uncertainty arises partly because we are fallible.
A New Parameterization for Pattern Mixture Models of Longitudinal Data With Informative Dropout
Abstract

Cited by 1 (0 self)
In this paper, we introduce a new parameterization of the pattern mixture model which can be used to represent various missing data mechanisms and construct sensitivity analyses. The reformulated model is cast in terms of changes in location and scale between patterns. We consider PMMs which are mixtures of multivariate normal distributions, where dropout defines pattern. For these models, it is relatively straightforward to separate the complete-data parameter space into its identified and nonidentified subsets, and to derive identifying restrictions corresponding to MCAR, MAR, and to any NI mechanism. Sensitivity analyses follow by varying the nonidentified parameters, thus allowing the analyst to examine all nonignorable specifications along a (plausible) continuum. This parameterization allows separate investigation of assumptions about nonidentifiable parameters in both the mean and variance. In the next section, we describe the pattern mixture model for mixtures of normal data and discuss several aspects of identifiability and implied missing data mechanisms. In Section 3 we describe the new parameterization and derive representations of MCAR, MAR, and NI mechanisms for the pattern mixture model. Section 4 contains a detailed analysis of a recent clinical trial of recombinant human growth hormone (rhGH) for increasing muscle strength in the elderly, including several sensitivity analyses. In the final section we discuss several important points related to modeling incomplete data, and give recommendations for further topics of investigation.
2 The pattern mixture model for incomplete repeated measures data
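The basic pattern mixture computation is simple to state: with dropout defining pattern, the marginal mean at each visit is the mixture of pattern-specific means weighted by pattern probabilities, with means after dropout supplied only via identifying restrictions. A sketch with invented numbers:

```python
import numpy as np

# Pattern mixture model, marginal mean. Rows are dropout patterns, columns
# are visits. Means for visits after a pattern's dropout are NOT identified
# by the data; here they are filled in by an assumed identifying restriction.
# All values are illustrative.

pattern_probs = np.array([0.6, 0.3, 0.1])     # P(pattern k), from observed dropout
pattern_means = np.array([
    [10.0, 12.0, 14.0],   # completers: all three visit means identified
    [10.5, 11.0, 13.0],   # dropout after visit 2: visit-3 mean via restriction
    [11.0, 12.5, 13.5],   # dropout after visit 1: visits 2-3 via restriction
])

# Marginal mean at each visit: sum over k of P(pattern k) * mu_k
marginal_mean = pattern_probs @ pattern_means
print(marginal_mean)  # [10.25, 11.75, 13.65]
```

A sensitivity analysis in the paper's sense amounts to varying the non-identified entries (the restriction-supplied means and variances) and re-reading the marginal summaries.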
Elicitation of Multivariate Prior Distributions: A nonparametric Bayesian approach
Abstract

Cited by 1 (0 self)
In the context of Bayesian statistical analysis, elicitation is the process of formulating a prior density f(·) about one or more uncertain quantities to represent a person's knowledge and beliefs. Several different methods of eliciting prior distributions for one unknown parameter have been proposed. However, there are relatively few methods for specifying a multivariate prior distribution, and most are applicable only to specific classes of problems and/or based on restrictive conditions, such as independence of variables. Moreover, many of these procedures require the elicitation of variances and correlations, and sometimes of hyperparameters, which are difficult for experts to specify in practice. Garthwaite, Kadane and O'Hagan (2005) discuss the different methods proposed in the literature and the difficulties of eliciting multivariate prior distributions. We describe a flexible method of eliciting multivariate prior distributions applicable to a wide class of practical problems. Our approach does not assume a parametric form for the unknown prior density f(·); instead we use nonparametric Bayesian inference, modelling f(·) by a Gaussian process prior distribution. The expert is then asked to specify certain summaries of his/her distribution, such as the mean, mode, marginal quantiles and a small number of joint probabilities. The analyst receives that information, treating it as a data set D with which to update his/her prior beliefs to obtain the posterior distribution for f(·). Theoretical properties of joint and marginal priors are derived and numerical illustrations to demonstrate our approach are given.
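The mechanics of "model f(·) with a Gaussian process and condition on expert summaries" can be sketched in a few lines of standard GP regression. The kernel, its hyperparameters, and the elicited values below are all illustrative assumptions, and a real implementation (as in the paper's approach) must additionally constrain f to be a nonnegative density integrating to one:

```python
import numpy as np

# Treat a few expert-assessed values of the density f at chosen points as
# (essentially noise-free) observations of a zero-mean Gaussian process, then
# read off the GP posterior mean of f on a grid.

def rbf(a, b, scale=1.0, length=1.0):
    # Squared-exponential covariance between point sets a and b
    d = a[:, None] - b[None, :]
    return scale * np.exp(-0.5 * (d / length) ** 2)

x_elicited = np.array([-2.0, 0.0, 2.0])    # points where the expert assessed f
f_elicited = np.array([0.05, 0.40, 0.05])  # the expert's density values there
jitter = 1e-6                              # tiny noise: assessments taken at face value

x_grid = np.linspace(-3.0, 3.0, 61)
K = rbf(x_elicited, x_elicited) + jitter * np.eye(len(x_elicited))
k_star = rbf(x_grid, x_elicited)

# GP posterior mean (zero prior mean): k_* K^{-1} f
f_post = k_star @ np.linalg.solve(K, f_elicited)
```

At the elicited points the posterior mean reproduces the expert's values; between them the kernel interpolates smoothly, which is what makes the few summaries the expert can actually provide go a long way.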
Designing ELICITOR: Software to graphically elicit expert priors for logistic regression models in ecology