Results 1-10 of 23
Assessment and Propagation of Model Uncertainty
, 1995
Cited by 108 (0 self)
In this paper I discuss a Bayesian approach to solving this problem that has long been available in principle but is only now becoming routinely feasible, by virtue of recent computational advances, and examine its implementation in examples that involve forecasting the price of oil and estimating the chance of catastrophic failure of the U.S. Space Shuttle.
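The approach alluded to here is commonly instantiated as Bayesian model averaging: predictions from candidate models are combined with posterior model weights. A minimal, hypothetical sketch (not the paper's implementation), assuming each model supplies a predictive mean and a log marginal likelihood:

```python
import numpy as np

def bma_predict(model_means, log_marginals):
    """Model-averaged point prediction under equal prior model probabilities.

    Weights each model's predictive mean proportionally to its marginal
    likelihood.  Minimal illustrative sketch, not the paper's code.
    """
    lm = np.asarray(log_marginals, dtype=float)
    w = np.exp(lm - lm.max())          # subtract max for numerical stability
    w /= w.sum()                       # normalised posterior model weights
    return float(w @ np.asarray(model_means, dtype=float))
```

With equal marginal likelihoods this reduces to a simple average; as one model dominates, its prediction dominates the mixture.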
Regression Analysis of Multiple Protein Structures
, 1998
Cited by 15 (1 self)
A general framework is presented for analyzing multiple protein structures. A family of related protein structures may be analyzed using statistical regression methods. The analysis requires alternating steps of finding correspondences among the protein structures and superimposing the corresponding landmarks. The superposition step may be performed using either affine or orthogonal transformations, thereby allowing protein structures to undergo either pure rotations or rotation plus shear operations. Regression analysis permits a separate weight for each position, allowing one to emphasize particular segments of a protein structure or to compensate for variances that differ at various positions in a structure. In addition, a novel method is introduced for finding an initial correspondence, based on matching discrete curvatures along the protein backbone. Another novel method is introduced for obtaining gap functions that adapt to the given data, thereby making dynamic programming meth...
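The orthogonal superposition step corresponds to the classical least-squares rotation problem, which has a closed-form SVD (Kabsch/Procrustes) solution. A sketch with illustrative names, assuming correspondences have already been found; this is not the authors' code:

```python
import numpy as np

def superimpose_orthogonal(X, Y):
    """Rotate and translate landmark set Y onto X by least squares.

    X, Y: (n, 3) arrays of corresponding landmarks.  Illustrative helper:
    the rotation is the standard SVD (Kabsch) solution.
    """
    Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
    U, _, Vt = np.linalg.svd(Yc.T @ Xc)     # cross-covariance of landmarks
    d = np.sign(np.linalg.det(U @ Vt))      # guard against a reflection
    R = U @ np.diag([1.0, 1.0, d]) @ Vt     # proper rotation, det = +1
    return Yc @ R + X.mean(axis=0)          # superimposed copy of Y
```

Replacing the orthogonality constraint with a general invertible matrix (fit by ordinary least squares) gives the affine variant mentioned in the abstract, which admits shear as well as rotation.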
A Case Study of Stochastic Optimization in Health Policy: Problem Formulation and Preliminary Results
 Journal of Global Optimization
, 2000
Cited by 9 (1 self)
Abstract. We use Bayesian decision theory to address a variable selection problem arising in attempts to indirectly measure the quality of hospital care, by comparing observed mortality rates to expected values based on patient sickness at admission. Our method weighs data collection costs against predictive accuracy to find an optimal subset of the available admission sickness variables. The approach involves maximizing expected utility across possible subsets, using Monte Carlo methods based on random division of the available data into N modeling and validation splits to approximate the expectation. After exploring the geometry of the solution space, we compare a variety of stochastic optimization methods — including genetic algorithms (GA), simulated annealing (SA), tabu search (TS), threshold acceptance (TA), and messy simulated annealing (MSA) — on their performance in finding good subsets of variables, and we clarify the role of N in the optimization. Preliminary results indicate that TS is somewhat better than TA and SA in this problem, with MSA and GA well behind the other three methods. Sensitivity analysis reveals broad stability of our conclusions.
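The SA comparator can be sketched generically as annealing over binary inclusion vectors. The utility function below is a stand-in for the paper's Monte Carlo expected-utility estimate, and all names are illustrative:

```python
import math
import random

def anneal_subset(utility, p, steps=5000, t0=1.0, cooling=0.995, seed=42):
    """Simulated-annealing search over subsets of p candidate variables.

    Generic sketch of the SA comparator described above; `utility` is any
    subset -> float map (e.g. predictive accuracy net of collection cost).
    """
    rng = random.Random(seed)
    current = [rng.random() < 0.5 for _ in range(p)]   # random starting subset
    cur_u = utility(current)
    best, best_u = current[:], cur_u
    t = t0
    for _ in range(steps):
        cand = current[:]
        j = rng.randrange(p)
        cand[j] = not cand[j]                          # flip one variable in/out
        u = utility(cand)
        # always accept improvements; accept worse moves with Boltzmann prob.
        if u >= cur_u or rng.random() < math.exp((u - cur_u) / t):
            current, cur_u = cand, u
            if cur_u > best_u:
                best, best_u = current[:], cur_u
        t *= cooling                                   # geometric cooling
    return best, best_u
```

Tabu search and threshold acceptance differ only in the acceptance rule and memory structure; the subset-flip neighbourhood is the same.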
A Distance-Based Regression Model for Prediction with Mixed Data
 Communications in Statistics A. Theory and Methods
, 1990
Cited by 7 (7 self)
A multiple regression method based on distance analysis and metric scaling is proposed and studied. This method allows us to predict a continuous response variable from several explanatory variables, is compatible with the general linear model, and is found to be useful when the predictor variables are both continuous and categorical. Real data examples are given to illustrate the results obtained. 1 Introduction Many authors have considered the problem in regression or multivariate analysis of having both qualitative and quantitative variables. Procedures have been developed for regression and association (Young et al. 1976; Daudin 1980; Roskam 1980; Lauritzen and Wermuth 1989), principal components (Young et al. 1978; Kiers 1989a, 1989b) and discriminant analysis (Krzanowski 1975, 1986; Knoke 1982). The methodologies are mainly based on optimal scaling, generalized correlation coefficients, the location model and distance-based analysis. Although statistical analysis on mixed data is ...
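One common reading of such a proposal: convert the inter-observation distance matrix into principal coordinates by classical metric scaling (double centring), then regress the response on those coordinates. A sketch under that reading, with illustrative names:

```python
import numpy as np

def distance_based_fit(D, y, k):
    """Fit y from an n-by-n distance matrix D: classical metric scaling to k
    principal coordinates, then ordinary least squares.  Names illustrative."""
    n = len(y)
    J = np.eye(n) - np.ones((n, n)) / n            # centring matrix
    G = -0.5 * J @ (D ** 2) @ J                    # doubly centred Gram matrix
    w, V = np.linalg.eigh(G)
    idx = np.argsort(w)[::-1][:k]                  # k largest eigenvalues
    Z = V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))  # principal coordinates
    Zd = np.column_stack([np.ones(n), Z])          # add intercept column
    beta, *_ = np.linalg.lstsq(Zd, y, rcond=None)
    return Zd @ beta                               # fitted values
```

The appeal for mixed data is that D can come from any suitable dissimilarity (e.g. one handling categorical and continuous variables together), while the fitting step stays within the general linear model.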
Bayesian and likelihood methods for fitting multilevel models with complex level-1 variation
, 2002
Scenario and Parametric Uncertainty in GESAMAC: A Methodological Study in Nuclear Waste Disposal Risk Assessment
, 1999
Cited by 3 (1 self)
We examine a conceptual framework for accounting for all sources of uncertainty in complex prediction problems, involving six ingredients: past data, future observables, and scenario, structural, parametric, and predictive uncertainty. We apply this framework to nuclear waste disposal using a computer simulation environment, GTMCHEM, which "deterministically" models the one-dimensional migration of radionuclides through the geosphere up to the biosphere. Focusing on scenario and parametric uncertainty, we show that mean predicted maximum doses to man due to I-129, and the uncertainty bands around those predictions, are larger when scenario uncertainty is properly assessed and propagated. We also illustrate the value of a new method for global sensitivity analysis of model output called extended FAST. Key words: Bayesian prediction, extended FAST, Level E/G test case, parametric uncertainty, scenario uncertainty, sensitivity analysis. 1 School of Mathematical Sciences, Unive...
Neural Networks and Logistic Regression
, 1996
Cited by 3 (0 self)
In this paper we investigated whether neural nets are worth considering as an alternative to logistic models in settings relevant for biomedical research. The first drawback of neural nets is that they give us no direct information on the value of a single covariate for the prediction. The examples in Section 8 illustrate that there are no simple strategies for interpreting the weights in this sense. The question remains whether, even in those applications which focus on the estimation of the regression function, it is justified to neglect the possible gain in scientific knowledge available from the identification of influential factors. If we restrict ourselves to estimation of the regression function, many biomedical applications involve fewer than 400 observations and/or more than five covariates. Larger samples are the exception, but the investigations in Section 10 reveal that neural networks need large samples to take advantage of their flexibility. Even if we have large samples, neural nets will be superior to the model selection procedures considered only if the true regression function cannot be approximated by a parsimonious member of the class (8.2). Hence the question remains whether more complex regression functions are needed in a particular biomedical application. In our opinion, for most applications such complex functions are not very plausible, because the covariates represent meaningful biological factors. It should be emphasized that the successful use of neural networks appears mainly in fields like pattern recognition, where covariates like the grey scale of a pixel are more or less meaningless with respect to their single values, and we can expect results only from combining the values of many pixels. As a final point we should mention that neural ...
The Use And Misuse Of Orthogonal Regression Estimation In Linear Errors-In-Variables Models
, 1994
Cited by 3 (0 self)
Orthogonal regression is one of the standard linear regression methods for correcting for the effects of measurement error in predictors. We argue that orthogonal regression is often misused in errors-in-variables linear regression because of a failure to account for equation errors. The typical result is to overcorrect for measurement error, i.e., to overestimate the slope, because equation error is ignored. The use of orthogonal regression must include a careful assessment of equation error, and not merely the usual (often informal) estimation of the ratio of measurement error variances. There are rarer instances, e.g., an example from geology discussed here, where the use of orthogonal regression without proper attention to modeling may lead to either overcorrection or undercorrection, depending on the relative sizes of the variances involved. Thus, our main point, which does not seem to be widely appreciated, is that orthogonal regression requires rather careful modeling of error. 1 INT...
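The estimator at issue is the orthogonal (Deming) slope for an assumed measurement-error variance ratio. A sketch, with the simulation deliberately containing no equation error (the one case where the estimator is consistent; with equation error present, this same formula overcorrects, as the abstract argues):

```python
import numpy as np

def deming_slope(x, y, lam=1.0):
    """Orthogonal-regression (Deming) slope estimate.

    lam is the assumed ratio Var(error in y) / Var(error in x); lam = 1 gives
    classical orthogonal regression.  Sketch only: it presumes there is NO
    equation error, which is exactly the assumption the abstract warns about.
    """
    sxx = np.var(x, ddof=1)
    syy = np.var(y, ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    disc = np.sqrt((syy - lam * sxx) ** 2 + 4.0 * lam * sxy ** 2)
    return (syy - lam * sxx + disc) / (2.0 * sxy)
```

For comparison, the naive OLS slope `sxy / sxx` is attenuated toward zero by the measurement error in x, which is what motivates the correction in the first place.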