Results 1–10 of 11
A framework for validation of computer models
, 2002
"... In this paper, we present a framework that enables computer model evaluation oriented towards answering the question: Does the computer model adequately represent reality? The proposed validation framework is a sixstep procedure based upon Bayesian statistical methodology. The Bayesian methodology ..."
Abstract

Cited by 43 (12 self)
In this paper, we present a framework that enables computer model evaluation oriented towards answering the question: Does the computer model adequately represent reality? The proposed validation framework is a six-step procedure based upon Bayesian statistical methodology. The Bayesian methodology is particularly suited to treating the major issues associated with the validation process: quantifying multiple sources of error and uncertainty in computer models; combining multiple sources of information; and updating validation assessments as new information is acquired. Moreover, it allows inferential statements to be made about predictive error associated with model predictions in untested situations. The framework is implemented in two test bed models (a vehicle crash model and a resistance spot weld model).
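A minimal sketch of the kind of statistical model such a framework rests on: field observations treated as computer-model output at a calibration parameter plus noise, with the posterior over the parameter updated on a grid as new field data arrive. The toy simulator, prior, and noise level are illustrative assumptions (the model-discrepancy term of the full framework is omitted for brevity); this is not the paper's implementation or its test beds.

    import numpy as np

    def model(x, theta):
        # stand-in for an expensive computer model run at input x, parameter theta
        return theta * np.sin(x)

    def grid_posterior(x_obs, y_obs, theta_grid, sigma=0.1, prior=None):
        # Gaussian likelihood for y_obs = model(x_obs, theta) + N(0, sigma^2) noise
        prior = np.full(theta_grid.shape, 1.0 / theta_grid.size) if prior is None else prior
        log_post = np.log(prior + 1e-300)
        for i, theta in enumerate(theta_grid):
            resid = y_obs - model(x_obs, theta)
            log_post[i] += -0.5 * np.sum(resid ** 2) / sigma ** 2
        post = np.exp(log_post - log_post.max())
        return post / post.sum()

    rng = np.random.default_rng(0)
    theta_grid = np.linspace(0.0, 2.0, 201)

    x1 = np.linspace(0.0, 3.0, 10)                         # first field experiment
    y1 = model(x1, 1.3) + rng.normal(0.0, 0.1, x1.size)
    post1 = grid_posterior(x1, y1, theta_grid)

    x2 = np.linspace(3.0, 6.0, 10)                         # later field experiment
    y2 = model(x2, 1.3) + rng.normal(0.0, 0.1, x2.size)
    post2 = grid_posterior(x2, y2, theta_grid, prior=post1)  # update as new data arrive

    print("posterior mean of theta:", float(theta_grid @ post2))

Reusing the first posterior as the prior for the second experiment illustrates, in miniature, "updating validation assessments as new information is acquired."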
Bayesian Validation of a Computer Model for Vehicle Collision
"... A key question in evaluation of computer models is Does the computer model adequately represent reality? A complete Bayesian approach to answering this question is developed for the challenging practical context in which the computer model (and reality) produce functional data. The methodology is pa ..."
Abstract

Cited by 4 (2 self)
A key question in evaluation of computer models is Does the computer model adequately represent reality? A complete Bayesian approach to answering this question is developed for the challenging practical context in which the computer model (and reality) produce functional data. The methodology is particularly suited to treating the major issues associated with the validation process: quantifying multiple sources of error and uncertainty in computer models; combining multiple sources of information; and being able to adapt to different – but related – scenarios through hierarchical modeling. It is also shown how one can formally test if the computer model reproduces reality. The approach is illustrated through study of a computer model developed to model vehicle crashworthiness.
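A crude, illustrative version of that adequacy check for functional output: compare replicate field curves against the model curve on a common time grid and ask whether zero discrepancy lies inside a rough pointwise band. This plug-in check is an assumption for illustration only, not the paper's hierarchical Bayesian test.

    import numpy as np

    def discrepancy_band(field_curves, model_curve):
        # field_curves: (n_replicates, n_time); model_curve: (n_time,)
        delta = field_curves - model_curve             # per-replicate discrepancy curves
        mean = delta.mean(axis=0)
        sem = delta.std(axis=0, ddof=1) / np.sqrt(delta.shape[0])
        return mean, mean - 2.0 * sem, mean + 2.0 * sem  # rough pointwise ~95% band

    t = np.linspace(0.0, 1.0, 50)
    model_curve = np.exp(-3.0 * t)                     # toy stand-in for a simulated crash pulse
    rng = np.random.default_rng(1)
    field = model_curve + 0.02 + rng.normal(0.0, 0.05, (8, t.size))  # 8 field replicates

    mean, lo, hi = discrepancy_band(field, model_curve)
    covers_zero = np.mean((lo <= 0.0) & (0.0 <= hi))
    print(f"fraction of time points where zero discrepancy is plausible: {covers_zero:.2f}")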
A Dynamic Modelling Strategy for Bayesian Computer Model Emulation
"... Abstract. Computer model evaluation studies build statistical models of deterministic simulationbased predictions of field data to then assess and criticize the computer model and suggest refinements. Computer models are often expensive computationally: statistical models that adequately emulate th ..."
Abstract

Cited by 3 (0 self)
Computer model evaluation studies build statistical models of deterministic simulation-based predictions of field data in order to assess and criticize the computer model and suggest refinements. Computer models are often computationally expensive: statistical models that adequately emulate their key features can be far more efficient. Gaussian process models are often used as emulators, but the resulting computations do not scale well to higher-dimensional, time-dependent or functional outputs. For some such problems, especially time series outputs, building emulators using dynamic linear models provides a computationally attractive alternative as well as a flexible modelling approach capable of emulating a broad range of stochastic structures underlying the input-output simulations. We describe this here, combining Bayesian multivariate dynamic linear models with Gaussian process modelling in an effective manner, and illustrate the approach with data from a hydrological simulation model. The general strategy will be useful for other computer model evaluation studies with time series or functional outputs.
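For orientation, a minimal local-level dynamic linear model filtered over the time dimension of a single simulator output series. The paper's method couples such DLMs with Gaussian-process structure across simulator inputs; that coupling is omitted here, and the state and observation variances are toy values.

    import numpy as np

    def local_level_filter(y, m0=0.0, C0=1.0, W=0.01, V=0.1):
        # Local-level DLM: state follows a random walk with variance W,
        # observations are state plus Gaussian noise with variance V.
        m, C = m0, C0
        means, variances = [], []
        for obs in y:
            a, R = m, C + W            # one-step state prediction
            Q = R + V                  # one-step forecast variance
            A = R / Q                  # adaptive (Kalman) gain
            m = a + A * (obs - a)      # posterior state mean
            C = R - A * A * Q          # posterior state variance
            means.append(m)
            variances.append(C)
        return np.array(means), np.array(variances)

    t = np.linspace(0.0, 10.0, 200)
    y = np.sin(t) + np.random.default_rng(2).normal(0.0, 0.3, t.size)  # noisy simulator series
    m, C = local_level_filter(y)
    print("filtered estimate at final time:", m[-1])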
Sensitivity analysis in linear and nonlinear models: A review
"... Question: How do the inputs affect the outputs? ..."
Bayesian Validation of a Computer Model for Vehicle Collision (www.niss.org)
, 2005
"... A key question in evaluation of computer models is Does the computer model adequately represent reality? A complete Bayesian approach to answering this question is developed for the challenging practical context in which the computer model (and reality) produce functional data. The methodology is pa ..."
Abstract
A key question in evaluation of computer models is Does the computer model adequately represent reality? A complete Bayesian approach to answering this question is developed for the challenging practical context in which the computer model (and reality) produce functional data. The methodology is particularly suited to treating the major issues associated with the validation process: quantifying multiple sources of error and uncertainty in computer models; combining multiple sources of information; and being able to adapt to different – but related – scenarios through hierarchical modeling. It is also shown how one can formally test if the computer model reproduces reality. The approach is illustrated through study of a computer model developed to model vehicle crashworthiness.
A Framework for Validation of Computer Models ∗ (www.niss.org)
, 2005
"... In this paper, we present a framework that enables computer model evaluation oriented towards answering the question: Does the computer model adequately represent reality? The proposed validation framework is a sixstep procedure based upon a mix of Bayesian statistical methodology and likelihood me ..."
Abstract
In this paper, we present a framework that enables computer model evaluation oriented towards answering the question: Does the computer model adequately represent reality? The proposed validation framework is a six-step procedure based upon a mix of Bayesian statistical methodology and likelihood methodology. The methodology is particularly suited to treating the major issues associated with the validation process: quantifying multiple sources of error and uncertainty in computer models; combining multiple sources of information; and updating validation assessments as new information is acquired. Moreover, it allows inferential statements to be made about predictive error associated with model predictions in untested situations. The framework is illustrated on two test bed models (a pedagogic example and a resistance spot weld model) that provide context for each of the six steps in the proposed validation process. ∗ David Higdon and Marc Kennedy were central to the development of an earlier version of this framework.
Bayesian Analysis of Computer Code Outputs: A Tutorial
, 2004
"... The Bayesian approach to quantifying, analysing and reducing uncertainty in the application of complex process models is attracting increasing attention amongst users of such models. The range and power of the Bayesian methods is growing and there is already a sizeable literature on these methods. H ..."
Abstract
The Bayesian approach to quantifying, analysing and reducing uncertainty in the application of complex process models is attracting increasing attention amongst users of such models. The range and power of Bayesian methods are growing and there is already a sizeable literature on them. However, most of it is in specialist statistical journals. The purpose of this tutorial is to introduce the more general reader to the Bayesian approach.
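A minimal example of the basic task such tutorials address: quantify the output uncertainty induced by uncertain inputs, here by brute-force Monte Carlo through a cheap stand-in model. The model and input distributions are assumptions for illustration; for expensive simulators the Bayesian approach replaces the model with an emulator, which is not shown here.

    import numpy as np

    def process_model(k, c):
        # toy stand-in for a complex process model
        return k * np.exp(-c)

    rng = np.random.default_rng(4)
    k = rng.normal(10.0, 1.0, 100_000)        # uncertain input 1
    c = rng.uniform(0.5, 1.5, 100_000)        # uncertain input 2
    out = process_model(k, c)

    print("mean output:", out.mean())
    print("95% interval:", np.percentile(out, [2.5, 97.5]))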
A MATHEMATICAL AND COMPUTATIONAL FRAMEWORK FOR MULTIFIDELITY DESIGN AND ANALYSIS WITH COMPUTER MODELS
 INTERNATIONAL JOURNAL FOR UNCERTAINTY QUANTIFICATION
, 2013
"... A multifidelity approach to design and analysis for complex systems seeks to exploit optimally all available models and data. Existing multifidelity approaches generally attempt to calibrate lowfidelity models or replace lowfidelity analysis results using data from higher fidelity analyses. This p ..."
Abstract
A multifidelity approach to design and analysis for complex systems seeks to optimally exploit all available models and data. Existing multifidelity approaches generally attempt to calibrate low-fidelity models or replace low-fidelity analysis results using data from higher-fidelity analyses. This paper proposes a fundamentally different approach that uses the tools of estimation theory to fuse together information from multifidelity analyses, resulting in a Bayesian-based approach to mitigating risk in complex system design and analysis. This approach is combined with maximum entropy characterizations of model discrepancy to represent epistemic uncertainties due to modeling limitations and model assumptions. Mathematical interrogation of the uncertainty in system output quantities of interest is achieved via a variance-based global sensitivity analysis, which identifies the primary contributors to output uncertainty and thus provides guidance for adaptation of model fidelity. The methodology is applied to multidisciplinary design optimization and demonstrated on a wing-sizing problem for a high-altitude, long-endurance vehicle.
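The core estimation-theoretic fusion idea can be sketched very simply: combine a low-fidelity and a high-fidelity estimate of the same output quantity by precision (inverse-variance) weighting rather than discarding the cheaper result. The numbers are illustrative; the paper builds maximum-entropy discrepancy models and variance-based sensitivity analysis on top of this basic ingredient.

    import numpy as np

    def fuse(means, variances):
        # Precision-weighted fusion of independent estimates of one quantity.
        w = 1.0 / np.asarray(variances)
        fused_var = 1.0 / w.sum()
        fused_mean = fused_var * np.sum(w * np.asarray(means))
        return fused_mean, fused_var

    lo_fi = (102.0, 25.0)    # (estimate, variance) from a cheap analysis
    hi_fi = (98.0, 4.0)      # (estimate, variance) from a few expensive runs
    mean, var = fuse([lo_fi[0], hi_fi[0]], [lo_fi[1], hi_fi[1]])
    print(f"fused estimate: {mean:.2f} +/- {np.sqrt(var):.2f} (1 sigma)")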
greatly simplifies and improves our inf–sup lower bound construction (offline) and evaluation (online) – a critical ingredient …
, 2006
"... particular (through deflation) for parameter values corresponding to nearly singular solution behavior. We apply the method to two illustrative problems: a coercive Laplacian heat conduction problem – which becomes singular as the heat transfer coefficient tends to zero; and a noncoercive Helmholtz ..."
Abstract
… in particular (through deflation) for parameter values corresponding to nearly singular solution behavior. We apply the method to two illustrative problems: a coercive Laplacian heat conduction problem – which becomes singular as the heat transfer coefficient tends to zero; and a non-coercive Helmholtz acoustics problem – which becomes singular as we approach resonance. In both cases, we observe very economical and sharp construction of the requisite natural-norm inf–sup lower bound; rapid convergence of the reduced basis approximation; reasonable effectivities (even for near-singular behavior) for our deflated output error estimators; and significant – several orders of magnitude – (online) computational savings relative to standard finite element procedures.
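For context, the standard quantities behind such bounds, written here in generic form rather than the paper's natural-norm, deflated construction; beta_LB denotes any computable lower bound of the kind the paper builds offline and evaluates online:

    \beta(\mu) = \inf_{w \in X} \sup_{v \in X} \frac{a(w, v; \mu)}{\|w\|_X \, \|v\|_X},
    \qquad
    \|u(\mu) - u_N(\mu)\|_X \le \frac{\|r(\,\cdot\,; \mu)\|_{X'}}{\beta_{\mathrm{LB}}(\mu)},
    \qquad 0 < \beta_{\mathrm{LB}}(\mu) \le \beta(\mu),

where u_N(\mu) is the reduced basis approximation and r(v; \mu) = f(v; \mu) - a(u_N(\mu), v; \mu) is its residual. A sharper (larger) lower bound \beta_{\mathrm{LB}} therefore yields tighter, more effective error estimators.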