Results 1–10 of 10
Computer Model Calibration using High Dimensional Output
Abstract
Cited by 24 (2 self)
This work focuses on combining observations from field experiments with detailed computer simulations of a physical process to carry out statistical inference. Of particular interest here is determining uncertainty in resulting predictions. This typically involves calibration of parameters in the computer simulator as well as accounting for inadequate physics in the simulator. We consider applications in characterizing material properties for which the field data and the simulator output are highly multivariate. For example, the experimental data and simulation output may be an image or may describe the shape of a physical object. We make use of the basic framework of Kennedy and O’Hagan (2001). However, the size and multivariate nature of the data lead to computational challenges in implementing the framework. To overcome these challenges, we make use of basis representations (e.g. principal components) to reduce the dimensionality of the problem and speed up the computations required for exploring the posterior distribution. This methodology is applied to applications, both ongoing and historical, at Los Alamos National Laboratory.
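The basis-representation step described in this abstract can be sketched in a few lines. The following is an illustrative toy with made-up shapes and a synthetic low-rank "simulation" matrix, not the paper's implementation:

```python
import numpy as np

# Toy sketch of reducing high-dimensional simulator output with a
# principal-component basis. Shapes and data are hypothetical.
rng = np.random.default_rng(0)

m, n = 50, 1000                       # m simulation runs, n outputs per run
sims = rng.standard_normal((m, 5)) @ rng.standard_normal((5, n))  # rank-5 signal
sims += 0.01 * rng.standard_normal((m, n))                        # small noise

mean = sims.mean(axis=0)
U, s, Vt = np.linalg.svd(sims - mean, full_matrices=False)

k = 5                                 # retained principal components
basis = Vt[:k]                        # (k, n) component basis
weights = (sims - mean) @ basis.T     # (m, k) low-dimensional coordinates

# Posterior exploration can now work on the k weights per run instead of
# the n raw outputs; reconstruction error stays small when the output
# really is low-rank.
recon = weights @ basis + mean
print(weights.shape, np.max(np.abs(recon - sims)))
```

With n in the thousands and k around five, the relevant covariance computations shrink from n-by-n to k-by-k, which is broadly the kind of speed-up the abstract refers to.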
Curse-of-dimensionality revisited: Collapse of the particle filter in very large scale systems.
Abstract
Cited by 14 (0 self)
It has been widely realized that Monte Carlo methods (approximation via a sample ensemble) may fail in large scale systems. This work offers some theoretical insight into this phenomenon in the context of the particle filter. We demonstrate that the maximum of the weights associated with the sample ensemble converges to one as both the sample size and the system dimension tend to infinity. Specifically, under fairly weak assumptions, if the ensemble size grows subexponentially in the cube root of the system dimension, the convergence holds for a single update step in state-space models with independent and identically distributed kernels. Further, in an important special case, more refined arguments show (and our simulations suggest) that the convergence to unity occurs unless the ensemble grows superexponentially in the system dimension. The weight singularity is also established in models with more general multivariate likelihoods, e.g. Gaussian and Cauchy. Although presented in the context of atmospheric data assimilation for numerical weather prediction, our results are generally valid for high-dimensional particle filters.
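The weight-collapse phenomenon can be reproduced in a toy experiment. Below, particles and an observation are drawn i.i.d. standard normal in d dimensions with an independent Gaussian likelihood per coordinate; this is a hypothetical setup in the spirit of the abstract, not the paper's exact model:

```python
import numpy as np

rng = np.random.default_rng(1)

def max_weight(n_particles, dim):
    """Maximum normalized particle weight after one update step."""
    particles = rng.standard_normal((n_particles, dim))
    obs = rng.standard_normal(dim)
    # Log-likelihood of obs under N(particle, I), summed over coordinates.
    loglik = -0.5 * np.sum((obs - particles) ** 2, axis=1)
    loglik -= loglik.max()            # stabilize before exponentiating
    w = np.exp(loglik)
    return (w / w.sum()).max()

# With a fixed ensemble size, the largest weight tends toward 1 as the
# dimension grows: one particle absorbs essentially all the mass.
for d in (1, 10, 100, 1000):
    print(d, round(max_weight(100, d), 3))
```

The spread of the log-weights grows like the square root of the dimension, so the best particle's weight dominates exponentially, which is the degeneracy the paper quantifies.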
Data assimilation: Mathematical and statistical perspectives (2008)
Internat. J. Numer. Methods Fluids
Abstract
Cited by 11 (5 self)
The bulk of this paper contains a concise mathematical overview of the subject of data assimilation, highlighting three primary ideas: (i) the standard optimization approaches of 3DVAR, 4DVAR and weak constraint 4DVAR are described and their interrelations explained; (ii) statistical analogues of these approaches are then introduced, leading to filtering (generalizing 3DVAR) and a form of smoothing (generalizing 4DVAR and weak constraint 4DVAR) and the optimization methods are shown to be maximum a posteriori estimators for the probability distributions implied by these statistical approaches; and (iii) by taking a general dynamical systems perspective on the subject it is shown that the incorporation of Lagrangian data can be handled by a straightforward extension of the preceding concepts. We argue that the smoothing approach to data assimilation, based on statistical analogues of 4DVAR and weak constraint 4DVAR, provides the optimal solution to the assimilation of space–time distributed data into a model. The optimal solution obtained is a probability distribution on the relevant class of functions (initial conditions or time-dependent solutions). The approach is a useful one in the first instance because it clarifies the notion of what is the optimal solution, thereby providing a benchmark against which existing approaches can be evaluated. In the longer term it also provides the potential for new methods to create ensembles of solutions to the model, incorporating the available data in an optimal …
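Point (ii) of this abstract, the optimization methods as MAP estimators, can be made concrete for 3DVAR, whose quadratic cost has a closed-form minimizer. The matrices below are arbitrary toy choices for illustration:

```python
import numpy as np

# 3DVAR minimizes the quadratic cost
#   J(x) = 1/2 (x - xb)' B^{-1} (x - xb) + 1/2 (y - Hx)' R^{-1} (y - Hx),
# the negative log-posterior under Gaussian background and observation
# errors. Its minimizer has the closed form
#   xa = xb + K (y - H xb),  K = B H' (H B H' + R)^{-1}.
rng = np.random.default_rng(2)

n, p = 4, 2                        # state and observation dimensions
B = 0.5 * np.eye(n)                # background-error covariance (toy)
R = 0.1 * np.eye(p)                # observation-error covariance (toy)
H = rng.standard_normal((p, n))    # observation operator (toy)
xb = rng.standard_normal(n)        # background (prior mean)
y = rng.standard_normal(p)         # observations

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
xa = xb + K @ (y - H @ xb)

# At the minimizer, the gradient of J vanishes:
grad = np.linalg.inv(B) @ (xa - xb) - H.T @ np.linalg.inv(R) @ (y - H @ xa)
print(np.allclose(grad, 0))
```

The same gain K appears in the Kalman filter analysis step, which is one way to see the filtering/3DVAR correspondence the abstract describes.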
Bayesian inverse problems for functions and applications to fluid mechanics. Inverse Problems 25, 2009
Abstract
Cited by 2 (1 self)
A Statistical Perspective on Data Assimilation in Numerical Models
Abstract
Cited by 1 (0 self)
Introduction The phrase data assimilation is used in a variety of contexts in ocean and atmospheric studies, and the geosciences in general. Broadly speaking, data assimilation involves combining observational data with numerical results to yield a better prediction. It often has the sound of a “black box procedure”, where one puts in data at one end and it becomes “assimilated” as it comes out of the other. However, this is certainly not the case. One of our goals in this chapter will be to discuss the nature of data assimilation, in particular to define it in a general sense, and to draw out many of the ideas in assimilation methods that are familiar to statisticians, but phrased in different ways. In this way we hope to fulfill one of the objectives of this volume: to bridge gaps and to find common ground between methods in statistics and the geosciences. Before the advent of extensive observational networks and technological advances that yield large amounts of …
Approximating Posterior Distributions in Ensemble Forecasting
Abstract
Cited by 1 (1 self)
Ensemble forecasting is used in numerical weather prediction to give an improved estimate of the atmospheric state and to improve measures of forecast accuracy. While the method is effective, there are some fundamental issues in interpreting the ensemble as a statistically valid representation of uncertainty in the state of the atmosphere. Coupled with this interpretation is the difficulty in the specification of large, complex covariance matrices used to combine a numerical forecast with observed data. We approach this problem by representing the prior distribution as a mixture of Gaussian distributions, then generate an ensemble from the posterior and use this ensemble to construct a kernel approximation to the posterior distribution.
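A one-dimensional caricature of the construction in this abstract — a Gaussian-mixture prior updated by a Gaussian observation, an ensemble drawn from the resulting posterior, and a Gaussian-kernel approximation built from it — might look as follows; all numbers are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)

# Gaussian-mixture prior. Equal component spreads are chosen so the
# normalization constants cancel in the evidence ratio below.
means = np.array([-2.0, 3.0])
sds = np.array([1.0, 1.0])
wts = np.array([0.5, 0.5])
y, obs_sd = 1.0, 1.0               # observation and its error std (toy)

# Conjugate update of each component, and reweighting by the evidence.
post_var = 1.0 / (1.0 / sds**2 + 1.0 / obs_sd**2)
post_mean = post_var * (means / sds**2 + y / obs_sd**2)
evid = np.exp(-0.5 * (y - means) ** 2 / (sds**2 + obs_sd**2))
post_wts = wts * evid / np.sum(wts * evid)

# Draw a posterior ensemble, then form a kernel approximation from it.
comp = rng.choice(2, size=500, p=post_wts)
ens = rng.normal(post_mean[comp], np.sqrt(post_var[comp]))
h = 1.06 * ens.std() * ens.size ** (-0.2)   # Silverman's bandwidth rule

def kde(x):
    """Gaussian-kernel density estimate of the posterior at x."""
    return np.mean(np.exp(-0.5 * ((x - ens) / h) ** 2)) / (h * np.sqrt(2 * np.pi))

print(post_wts, round(float(kde(2.0)), 3))
```

In the paper's setting the state is high-dimensional and the covariances are the hard part; this sketch only shows the mixture-update-then-kernel pipeline in miniature.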
Curse-of-dimensionality revisited: Collapse of the particle filter in very large scale systems
IMS Collections, 805
Inverse problems: A Bayesian perspective
doi:10.1017/S0962492910000061
Abstract
The subject of inverse problems in differential equations is of enormous practical importance, and has also generated substantial mathematical and computational innovation. Typically some form of regularization is required to ameliorate ill-posed behaviour. In this article we review the Bayesian approach to regularization, developing a function space viewpoint on the subject. This approach allows for a full characterization of all possible solutions, and their relative probabilities, whilst simultaneously forcing significant modelling issues to be addressed in a clear and precise fashion. Although expensive to implement, this approach is starting to lie within the range of the available computational resources in many application areas. It also allows for the quantification of uncertainty and risk, something which is increasingly demanded by these applications. Furthermore, the approach is conceptually important for the understanding of simpler, computationally expedient approaches to inverse problems. We demonstrate that, when formulated in a Bayesian fashion, a wide range …
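The link between classical regularization and the Bayesian viewpoint can already be seen in the linear-Gaussian case, where Tikhonov regularization coincides with the MAP (and posterior mean) estimate under a Gaussian prior. A minimal sketch with an arbitrary toy operator:

```python
import numpy as np

# For a linear ill-posed problem y = A u + noise (unit-variance noise),
# Tikhonov regularization equals the Bayesian MAP estimate under the
# Gaussian prior u ~ N(0, (1/lam) I). The operator and numbers below are
# arbitrary toy choices, not from the article.
rng = np.random.default_rng(5)

A = rng.standard_normal((5, 20))   # wide operator: underdetermined, ill-posed
y = rng.standard_normal(5)
lam = 0.1                          # regularization weight = prior precision

# Tikhonov: argmin_u ||A u - y||^2 + lam ||u||^2
u_tik = np.linalg.solve(A.T @ A + lam * np.eye(20), A.T @ y)

# Bayesian posterior mean, computed in observation space via the
# push-through identity (A'A + lam I)^{-1} A' = A' (A A' + lam I)^{-1}.
u_map = A.T @ np.linalg.solve(A @ A.T + lam * np.eye(5), y)

print(np.allclose(u_tik, u_map))
```

The article's function-space viewpoint generalizes this finite-dimensional identity; the sketch only shows the algebra that motivates it.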
Uncertainty in numerical weather forecasting via ensembles of initializations. Yulia Gel, 2002
Abstract
Due to the fact that the atmosphere is a chaotic system, any small error in the initial conditions may lead to increasing errors in the forecast, which eventually leads to a partial or total loss of predictive information. However, it may be possible to obtain some information on the predictability by analyzing a set of ensembles of initializations (initial conditions) which are within a certain neighborhood of the true estimate of the atmosphere. Ensemble forecasting is useful for improving average forecasting accuracy, estimating the range and distribution of possible outcomes, and even …
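The sensitivity to initial conditions that motivates ensembles of initializations is easy to demonstrate on a small chaotic system. The sketch below uses the Lorenz-63 model with its classical parameters — a standard stand-in, not the forecasting system discussed here:

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system (vectorized over members)."""
    x, y, z = state[..., 0], state[..., 1], state[..., 2]
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.stack([dx, dy, dz], axis=-1)

rng = np.random.default_rng(4)
truth = np.array([1.0, 1.0, 1.0])
ensemble = truth + 1e-4 * rng.standard_normal((20, 3))  # 20 perturbed members

spread0 = ensemble.std(axis=0).max()
for _ in range(1500):                 # integrate 15 model time units
    ensemble = lorenz_step(ensemble)
spread1 = ensemble.std(axis=0).max()

# Tiny initial perturbations grow by orders of magnitude; tracking this
# spread is what makes the ensemble informative about predictability.
print(spread0, spread1)
```

Forward Euler is the crudest possible integrator and is used here only to keep the sketch short; the qualitative divergence of nearby initial conditions is the point.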
Computer Model Calibration Using High Dimensional Output
Abstract
This work focuses on combining observations from field experiments with detailed computer simulations of a physical process to carry out statistical inference. Of particular interest here is determining uncertainty in resulting predictions. This typically involves calibration of parameters in the computer simulator as well as accounting for inadequate physics in the simulator. The problem is complicated by the fact that simulation code is sufficiently demanding that only a limited number of simulations can be carried out. We consider applications in characterizing material properties for which the field data and the simulator output are highly multivariate. For example, the experimental data and simulation output may be an image or may describe the shape of a physical object. We make use of the basic framework of Kennedy and O’Hagan (2001). However, the size and multivariate nature of the data lead to computational challenges in implementing the framework. To overcome these challenges, we make use of basis representations (e.g. principal components) to reduce the dimensionality of the problem and speed up the computations required for exploring the posterior distribution. This methodology is applied to applications, both ongoing and historical, at Los Alamos National Laboratory.