Results 1–10 of 202
2005: Uncertainty in predictions of the climate response to rising levels of greenhouse gases
 Nature
Abstract

Cited by 175 (9 self)
The range of possibilities for future climate evolution [13] needs to be taken into account when planning climate change mitigation and adaptation strategies. This requires ensembles of multi-decadal simulations to assess both chaotic climate variability and model response uncertainty. As a first step towards a probabilistic climate prediction system we have carried out a grand ensemble (an ensemble of ensembles) exploring uncertainty in a state-of-the-art model. Uncertainty in model response is investigated using a perturbed-physics ensemble [4] in which model parameters are set to alternative values considered plausible by experts in the relevant parameterization schemes [9]. Two or three values are taken for each parameter (see Methods); simulations may have several parameters perturbed from their standard model values simultaneously. For each combination of parameter values (referred to here as a 'model version') an initial-condition ensemble [22] is used, creating an ensemble of ensembles. Each individual member of this grand ensemble (referred to here as a 'simulation') explores the response to changing boundary conditions [22] by including a period with doubled CO2 concentrations. The general circulation model (GCM) is a version of the Met Office Unified Model consisting of the atmospheric model HadAM3 [23], at standard resolution [9] but with increased numerical stability, coupled to a mixed-layer ocean. This allows us to explore the effects of a wide range of uncertainties in the way the atmosphere is represented, while avoiding a long spin-up for each model version. Each simulation involves three 15-year phases: (1) calibration, to deduce the ocean heat-flux convergence field used in the subsequent phases; (2) control, used to quantify the relevance of the particular model version and heat-flux convergence field; and (3) doubled CO2. The 2,578 simulations contain 2,017 unique simulations (duplicates are used to verify the experimental design; see Methods).
The frequency distribution of the simulated climate sensitivities (see Methods) for the remaining model versions is shown in the accompanying figure.
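The grand-ensemble design the abstract describes (two or three expert-chosen values per parameter, combined into "model versions", each run as a small initial-condition ensemble) can be sketched schematically. The parameter names, values and ensemble size below are hypothetical stand-ins, not the experiment's actual settings.

```python
from itertools import product

# Hypothetical perturbed parameters: each takes the 2-3 values judged
# plausible by experts (names and numbers here are made up).
perturbations = {
    "entrainment_coef": [0.6, 3.0, 9.0],
    "ice_fall_speed": [0.5, 1.0],
    "critical_rh": [0.7, 0.9],
}

# A "model version" is one combination of parameter values.
model_versions = [dict(zip(perturbations, values))
                  for values in product(*perturbations.values())]

# Running each model version as an initial-condition ensemble gives the
# grand ensemble (an ensemble of ensembles); members here are just labels.
members_per_version = 4
grand_ensemble = [(version, member)
                  for version in model_versions
                  for member in range(members_per_version)]

print(len(model_versions), len(grand_ensemble))  # 12 model versions, 48 simulations
```

The combinatorial growth is the point: even a handful of perturbed parameters quickly yields thousands of simulations, which is why the paper needed a distributed-computing experiment.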
Practical Bayesian optimization of machine learning algorithms
, 2012
Abstract

Cited by 130 (16 self)
In this section we specify additional details of our Bayesian optimization algorithm which, for brevity, were omitted from the paper. For more detail, the code used in this work is made publicly available at
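The paper's own code is not reproduced here, but the core loop of Gaussian-process-based Bayesian optimization with an expected-improvement acquisition can be sketched minimally. The toy objective, RBF kernel length-scale, grid acquisition and evaluation budget are all illustrative assumptions, not the paper's setup.

```python
import math

import numpy as np

def objective(x):
    return np.sin(3 * x) + 0.5 * x          # toy function to minimize on [0, 1]

def rbf(a, b, ls=0.2):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def gp_predict(X, y, Xs, noise=1e-6):
    """Posterior mean and sd of a zero-mean GP with unit prior variance."""
    K_inv = np.linalg.inv(rbf(X, X) + noise * np.eye(len(X)))
    Ks = rbf(X, Xs)
    mu = Ks.T @ K_inv @ y
    var = 1.0 - np.sum(Ks * (K_inv @ Ks), axis=0)
    return mu, np.sqrt(np.clip(var, 1e-12, None))

def expected_improvement(mu, sigma, best):
    z = (best - mu) / sigma
    cdf = 0.5 * (1.0 + np.array([math.erf(v / math.sqrt(2)) for v in z]))
    pdf = np.exp(-0.5 * z ** 2) / math.sqrt(2 * math.pi)
    return (best - mu) * cdf + sigma * pdf

grid = np.linspace(0.0, 1.0, 201)
X = np.array([0.25, 0.5, 0.9])               # initial design
y = objective(X)
for _ in range(5):                            # sequential acquisition loop
    mu, sigma = gp_predict(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, y.min()))]
    X, y = np.append(X, x_next), np.append(y, objective(x_next))

print(f"best x = {X[np.argmin(y)]:.2f}, best f = {y.min():.3f}")
```

A real implementation (as in the paper) would also integrate out the GP hyperparameters rather than fixing the length-scale.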
Probabilistic inference for future climate using an ensemble of climate model evaluations
 Climatic Change
, 2007
Abstract

Cited by 63 (15 self)
This paper describes an approach to computing probabilistic assessments of future climate, using a climate model. It clarifies the nature of probability in this context, and illustrates the kinds of judgements that must be made in order for such a prediction to be consistent with the probability calculus. The climate model is seen as a tool for making probabilistic statements about climate itself, necessarily involving an assessment of the model’s imperfections. A climate event, such as a 2 °C increase in global mean temperature, is identified with a region of ‘climate-space’, and the ensemble of model evaluations is used within a numerical integration designed to estimate the probability assigned to that region.
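The integration step described above can be caricatured as a weighted Monte Carlo estimate over ensemble outputs: the event is a region of climate-space, and its probability is the weight assigned to ensemble members falling inside it. The ensemble values and equal weights below are illustrative stand-ins, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in ensemble of model evaluations: one simulated global-mean warming
# (deg C) per model version, with equal weights for illustration.
warming = rng.normal(2.5, 1.0, size=500)
weights = np.full(warming.size, 1.0 / warming.size)

# The event "at least a 2 C warming" is a region of climate-space; its
# probability is estimated by the weighted fraction of evaluations inside it.
p_event = float(weights[warming >= 2.0].sum())
print(f"P(warming >= 2 C) ~ {p_event:.2f}")
```

In the paper the weights themselves encode judgements about model imperfection, which is where most of the statistical work lies.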
Probabilistic formulations for transferring inferences from mathematical models to physical systems
 SIAM Journal on Scientific Computing. Forthcoming
, 2004
Abstract

Cited by 50 (20 self)
Abstract. We outline a probabilistic framework for linking mathematical models to the physical systems that they represent, taking account of all sources of uncertainty including model and simulator imperfections. This framework is a necessary precondition for making probabilistic statements about the system on the basis of evaluations of computer simulators. We distinguish simulators according to their quality and the nature of their inputs. Where necessary, we introduce further hypothetical simulators as modelling constructs to account for imperfections in the available simulators, and to unify the available simulators with the underlying system.

Key words. Direct simulator, indirect simulator, top simulator, measurable inputs, tuning inputs, Bayesian inference, calibration, history matching, calibrated prediction, uncertainty analysis

AMS subject classifications. 62F15 (Bayesian inference); 62P35 (Applications to physics); 62M20 (Prediction); 62M40 (Random fields)

1. Introduction. In a computer experiment (sometimes referred to as an in silico experiment) we make inferences about a physical system using a computer simulator of that system. Such computer experiments may be used to investigate problems for which it would be difficult to carry out the corresponding physical experiments.
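One concrete instance of linking a simulator to a system while acknowledging imperfections is history matching with an additive model discrepancy, which appears among the key words above. The simulator, the discrepancy form and the implausibility threshold in this sketch are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

# Hypothetical direct simulator with a measurable input x and a tuning input theta.
def simulator(x, theta):
    return theta * np.sin(x)

# The system is linked to the simulator through an additive discrepancy term
# (an illustrative choice): y(x) = f(x, theta*) + delta(x), with delta(x) = 0.1 x.
theta_star = 1.3
x_obs = np.linspace(0.0, 3.0, 10)
y_obs = simulator(x_obs, theta_star) + 0.1 * x_obs

# History matching: rule out tuning values whose worst-case standardized
# mismatch (implausibility) exceeds 3, with the variance budget covering
# observation error and discrepancy.
obs_sd, disc_sd = 0.05, 0.1 * x_obs
total_sd = np.sqrt(obs_sd**2 + disc_sd**2)

thetas = np.linspace(0.5, 2.0, 151)
implausibility = np.array([np.max(np.abs(y_obs - simulator(x_obs, t)) / total_sd)
                           for t in thetas])
not_ruled_out = thetas[implausibility < 3.0]
print(f"not ruled out: [{not_ruled_out.min():.2f}, {not_ruled_out.max():.2f}]")
```

The true tuning value survives because the variance budget accounts for the discrepancy; omitting that term would rule it out, which is the failure mode the framework guards against.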
Lightweight emulators for multivariate deterministic functions
 Forthcoming in the Journal of Computational and Graphical Statistics
, 2007
Abstract

Cited by 42 (9 self)
An emulator is a statistical model of a deterministic function, to be used where the function itself is too expensive to evaluate within the loop of an inferential calculation. Typically, emulators are deployed when dealing with complex functions that have large and heterogeneous input and output spaces: environmental models, for example. In this challenging situation we should be sceptical about our statistical models, no matter how sophisticated, and adopt approaches that prioritise interpretative and diagnostic information, and the flexibility to respond. This paper presents one such approach, candidly rejecting the standard Smooth Gaussian Process approach in favour of a fully Bayesian treatment of multivariate regression which, by permitting sequential updating, allows for very detailed predictive diagnostics. It is argued directly and by illustration that the incoherence of such a treatment (which does not impose continuity on the model outputs) is more than compensated for by the wealth of available information, and the possibilities for generalisation.
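A minimal sketch of the sequential-updating idea, using conjugate Bayesian linear regression with a fixed basis as the emulator; the basis, priors and noise level are illustrative choices, not the paper's formulation.

```python
import numpy as np

def basis(x):
    return np.array([1.0, x, x**2])          # quadratic regressors (an assumption)

prior_mean = np.zeros(3)
prior_cov = 10.0 * np.eye(3)
noise_var = 1e-4                              # small nugget for a deterministic code

def update(mean, cov, x, y):
    """Conjugate update after observing one simulator run (x, y)."""
    h = basis(x)
    s = h @ cov @ h + noise_var
    gain = cov @ h / s
    new_mean = mean + gain * (y - h @ mean)
    new_cov = cov - np.outer(gain, h @ cov)
    return new_mean, new_cov

# Sequentially assimilate runs of a toy simulator f(x) = 2 + 3x, checking the
# one-step-ahead prediction before each update (a simple predictive diagnostic).
mean, cov = prior_mean, prior_cov
for x in [0.0, 0.5, 1.0, 1.5]:
    y = 2.0 + 3.0 * x
    prediction_before_update = basis(x) @ mean
    mean, cov = update(mean, cov, x, y)

print(np.round(mean, 2))
```

The sequential form is what enables the detailed diagnostics the abstract mentions: each run is predicted before it is assimilated, so surprises are visible one run at a time.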
The Knowledge-Gradient Policy for Correlated Normal Beliefs
Abstract

Cited by 41 (20 self)
We consider a Bayesian ranking and selection problem with independent normal rewards and a correlated multivariate normal belief on the mean values of these rewards. Because this formulation of the ranking and selection problem models dependence between alternatives’ mean values, algorithms may utilize this dependence to perform efficiently even when the number of alternatives is very large. We propose a fully sequential sampling policy called the knowledge-gradient policy, which is provably optimal in some special cases and has bounded suboptimality in all others. We then demonstrate how this policy may be applied to efficiently maximize a continuous function on a continuous domain while constrained to a fixed number of noisy measurements.
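The correlated-belief mechanism the policy exploits (measuring one alternative updates beliefs about all of them) can be sketched as a rank-one multivariate normal update; the numbers below are illustrative, and computing the knowledge-gradient factor itself is omitted.

```python
import numpy as np

# Correlated prior belief on three alternatives' mean rewards (made-up values).
mu = np.array([0.0, 0.0, 0.0])
Sigma = np.array([[1.0, 0.8, 0.1],
                  [0.8, 1.0, 0.1],
                  [0.1, 0.1, 1.0]])
noise_var = 0.5

def measure(mu, Sigma, i, y):
    """Bayesian update of (mu, Sigma) after a noisy reward y from alternative i."""
    s = Sigma[i, i] + noise_var
    gain = Sigma[:, i] / s
    return mu + gain * (y - mu[i]), Sigma - np.outer(gain, Sigma[i])

# One measurement of alternative 0 also shifts the belief about alternative 1,
# because the prior correlates their means (0.8 covariance).
mu1, Sigma1 = measure(mu, Sigma, 0, 1.2)
print(np.round(mu1, 3))
```

The knowledge-gradient policy then chooses the next alternative to measure so as to maximize the expected improvement in the maximum of these posterior means.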
Reified Bayesian modelling and inference for physical systems (with discussion)
 Journal of Statistical Planning and Inference
, 2009
Abstract

Cited by 38 (18 self)
We describe an approach, termed reified analysis, for linking the behaviour of mathematical models with inferences about the physical systems which the models represent. We describe the logical basis for the approach, based on coherent assessment of the implications of deficiencies in the mathematical model. We show how the statistical analysis may be carried out by specifying stochastic relationships between the model that we have, improved versions of the model that we might construct, and the system itself. We illustrate our approach with an example concerning the potential shutdown of the Thermohaline Circulation in the Atlantic Ocean.
Measures of agreement between computation and experiment: Validation metrics
, 2006
Abstract

Cited by 36 (2 self)
With the increasing role of computational modeling in engineering design, performance estimation, and safety assessment, improved methods are needed for comparing computational results and experimental measurements. Traditional methods of graphically comparing computational and experimental results, though valuable, are essentially qualitative. Computable measures are needed that can quantitatively compare computational and experimental results over a range of input, or control, variables to sharpen assessment of computational accuracy. This type of measure has been recently referred to as a validation metric. We discuss various features that we believe should be incorporated in a validation metric, as well as features that we believe should be excluded. We develop a new validation metric that is based on the statistical concept of confidence intervals. Using this fundamental concept, we construct two specific metrics: one that requires interpolation of experimental data and one that requires regression (curve fitting) of experimental data. We apply the metrics to three example problems: thermal decomposition of a polyurethane foam, a turbulent buoyant plume of helium, and compressibility effects on the growth rate of a turbulent free-shear layer. We discuss how the present metrics are easily interpretable for assessing computational model accuracy, as well as the impact of experimental measurement uncertainty on the accuracy assessment.
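The confidence-interval idea can be illustrated at a single control-variable setting: estimate the model error as prediction minus experimental mean, and report a confidence interval on that error from the measurement scatter. The prediction, measurements and t-quantile below are made-up numbers, not the paper's examples.

```python
import numpy as np

# Model prediction and repeated experimental measurements at one input
# setting (illustrative values).
model_pred = 10.0
measurements = np.array([9.2, 9.8, 10.6, 9.5, 10.1])

n = measurements.size
mean = measurements.mean()
sem = measurements.std(ddof=1) / np.sqrt(n)   # standard error of the mean
t975 = 2.776                                   # 97.5% t quantile, n - 1 = 4 dof

error = model_pred - mean                      # estimated model error
ci = (error - t975 * sem, error + t975 * sem)  # 95% CI on the true error
print(f"estimated error {error:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
```

An interval containing zero, as here, means the data cannot distinguish the model error from measurement scatter at this confidence level; the paper's metrics extend this idea across a range of the control variable.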
A stochastic collocation approach to Bayesian inference in inverse problems
 Communications in computational physics 6
, 2009
Abstract

Cited by 32 (5 self)
Abstract. We present an efficient numerical strategy for the Bayesian solution of inverse problems. Stochastic collocation methods, based on generalized polynomial chaos (gPC), are used to construct a polynomial approximation of the forward solution over the support of the prior distribution. This approximation then defines a surrogate posterior probability density that can be evaluated repeatedly at minimal computational cost. The ability to simulate a large number of samples from the posterior distribution results in very accurate estimates of the inverse solution and its associated uncertainty. Combined with high accuracy of the gPC-based forward solver, the new algorithm can provide great efficiency in practical applications. A rigorous error analysis of the algorithm is conducted, where we establish convergence of the approximate posterior to the true posterior and obtain an estimate of the convergence rate. It is proved that fast (exponential) convergence of the gPC forward solution yields similarly fast (exponential) convergence of the posterior. The numerical strategy and the predicted convergence rates are then demonstrated on nonlinear inverse problems of
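The surrogate-posterior idea can be sketched in one dimension with an ordinary polynomial fit at Chebyshev nodes standing in for the gPC expansion; the forward model, data, noise level and uniform prior are illustrative assumptions.

```python
import numpy as np

# "Expensive" forward model (illustrative); the surrogate is a polynomial
# fit at collocation nodes, a stand-in for a gPC expansion.
def forward(z):
    return np.exp(-z) * np.cos(z)

nodes = np.cos(np.pi * (2 * np.arange(9) + 1) / 18)   # 9 Chebyshev nodes on [-1, 1]
coeffs = np.polyfit(nodes, forward(nodes), deg=8)     # only 9 forward-model calls

def surrogate(z):
    return np.polyval(coeffs, z)

# One noisy observation and a uniform prior on [-1, 1]; the surrogate
# posterior can now be evaluated densely at negligible cost.
data, sigma = 0.6, 0.05

def log_post(z):
    return -0.5 * ((data - surrogate(z)) / sigma) ** 2

zs = np.linspace(-1.0, 1.0, 2001)
post = np.exp(log_post(zs))
post /= post.sum() * (zs[1] - zs[0])      # normalize on the grid
z_map = zs[np.argmax(post)]
print(f"surrogate MAP estimate: {z_map:.3f}")
```

The expensive model is called only at the collocation nodes; every posterior evaluation afterwards hits the cheap polynomial, which is what makes dense sampling (or MCMC) affordable.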
D (2006) Model error in weather and climate forecasting. In: Palmer T, Hagedorn R (eds) Predictability of weather and climate. Cambridge University Press, Cambridge
 Anderson JL (2001) An ensemble adjustment Kalman filter for data assimilation. Mon Weather
 Cliffs, NJ Bengtsson T, Snyder C, Nychka D
, 1999
Abstract

Cited by 25 (5 self)
“As if someone were to buy several copies of the morning newspaper to assure himself that what it said was true” (Ludwig Wittgenstein)