Results 1 – 6 of 6
Computer Model Calibration using High Dimensional Output
Abstract

Cited by 22 (2 self)
This work focuses on combining observations from field experiments with detailed computer simulations of a physical process to carry out statistical inference. Of particular interest here is determining uncertainty in resulting predictions. This typically involves calibration of parameters in the computer simulator as well as accounting for inadequate physics in the simulator. We consider applications in characterizing material properties for which the field data and the simulator output are highly multivariate. For example, the experimental data and simulation output may be an image or may describe the shape of a physical object. We make use of the basic framework of Kennedy and O’Hagan (2001). However, the size and multivariate nature of the data lead to computational challenges in implementing the framework. To overcome these challenges, we make use of basis representations (e.g. principal components) to reduce the dimensionality of the problem and speed up the computations required for exploring the posterior distribution. This methodology is applied to problems, both ongoing and historical, at Los Alamos National Laboratory.
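The basis-representation idea in this abstract can be illustrated with a small sketch: summarize each high-dimensional simulator run by a handful of principal-component coefficients, so that calibration works with a few numbers per run instead of the full output. Everything below (sizes, the synthetic "simulator" outputs) is invented for illustration and is not the paper's actual method or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 50 simulation runs, each producing a 1000-dimensional
# output (e.g. a flattened image). The outputs are built from 3 hidden modes
# plus noise, so a small basis should capture most of the structure.
m, d = 50, 1000
latent = rng.standard_normal((m, 3))
modes = rng.standard_normal((3, d))
outputs = latent @ modes + 0.05 * rng.standard_normal((m, d))

# Center the runs and extract principal components via the SVD.
mean = outputs.mean(axis=0)
centered = outputs - mean
U, s, Vt = np.linalg.svd(centered, full_matrices=False)

k = 3                        # keep the first k components as the basis
basis = Vt[:k]               # (k, d) orthonormal basis vectors
coeffs = centered @ basis.T  # each run is now summarized by k numbers

# Relative reconstruction error shows how much the basis captures.
recon = coeffs @ basis + mean
rel_err = np.linalg.norm(outputs - recon) / np.linalg.norm(outputs)
print(coeffs.shape, rel_err)
```

Posterior exploration then operates on the k-dimensional coefficients, which is the source of the computational speed-up the abstract describes.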
Curse-of-dimensionality revisited: Collapse of the particle filter in very large scale systems
Abstract

Cited by 12 (0 self)
It has been widely realized that Monte Carlo methods (approximation via a sample ensemble) may fail in large scale systems. This work offers some theoretical insight into this phenomenon in the context of the particle filter. We demonstrate that the maximum of the weights associated with the sample ensemble converges to one as both the sample size and the system dimension tend to infinity. Specifically, under fairly weak assumptions, if the ensemble size grows subexponentially in the cube root of the system dimension, the convergence holds for a single update step in state-space models with independent and identically distributed kernels. Further, in an important special case, more refined arguments show (and our simulations suggest) that the convergence to unity occurs unless the ensemble grows superexponentially in the system dimension. The weight singularity is also established in models with more general multivariate likelihoods, e.g. Gaussian and Cauchy. Although presented in the context of atmospheric data assimilation for numerical weather prediction, our results are generally valid for high-dimensional particle filters.
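The weight-collapse phenomenon the abstract analyzes is easy to reproduce empirically: with an i.i.d. Gaussian likelihood in d dimensions and a fixed ensemble size, the largest normalized particle weight drifts toward 1 as d grows. The simulation below is a minimal sketch under invented assumptions (prior and observation model are standard Gaussians, observation at the origin), not the paper's actual experiments.

```python
import numpy as np

rng = np.random.default_rng(1)

def max_weight(n_particles, dim):
    """Largest normalized importance weight for one particle-filter update."""
    # Particles drawn from the prior; isotropic Gaussian likelihood centered
    # at an observation placed at the origin.
    particles = rng.standard_normal((n_particles, dim))
    loglik = -0.5 * np.sum(particles**2, axis=1)
    loglik -= loglik.max()          # stabilize before exponentiating
    w = np.exp(loglik)
    w /= w.sum()
    return w.max()

# As the dimension grows with the ensemble size fixed, a single particle
# absorbs essentially all of the weight.
for dim in (1, 10, 100, 1000):
    print(dim, max_weight(100, dim))
```

In low dimensions the 100 weights are roughly comparable; by dim = 1000 the maximum weight is close to 1, i.e. the ensemble has effectively collapsed to a single particle.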
Data assimilation: Mathematical and statistical perspectives
International Journal for Numerical Methods in Fluids, 2007
Abstract

Cited by 9 (4 self)
The bulk of this paper contains a concise mathematical overview of the subject of data assimilation, highlighting three primary ideas: (i) the standard optimization approaches of 3DVAR, 4DVAR and weak constraint 4DVAR are described and their interrelations explained; (ii) statistical analogues of these approaches are then introduced, leading to filtering (generalizing 3DVAR) and a form of smoothing (generalizing 4DVAR and weak constraint 4DVAR) and the optimization methods are shown to be maximum a posteriori estimators for the probability distributions implied by these statistical approaches; (iii) by taking a general dynamical systems perspective on the subject it is shown that the incorporation of Lagrangian data can be handled by a straightforward extension of the preceding concepts. We argue that the smoothing approach to data assimilation, based on statistical analogues of 4DVAR and weak constraint 4DVAR, provides the optimal solution to the assimilation of space-time distributed data into a model. The optimal solution obtained is a probability distribution on the relevant class of functions (initial conditions, or time-dependent solutions). The approach is a useful one in the first instance because it clarifies the notion of what is the optimal solution, thereby providing a benchmark against which existing approaches can be evaluated. In the longer term it also provides the potential for new methods to create ensembles of solutions to the model, incorporating the available data in an optimal fashion. Two examples are given illustrating this approach to data assimilation, both in the context of Lagrangian data, one based on statistical 4DVAR and the other on weak constraint statistical 4DVAR. The former is compared with the ensemble Kalman filter which is thereby shown to be inaccurate in a variety of scenarios.
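Point (ii) of the abstract — that the variational estimate is a maximum a posteriori estimator — can be seen in a two-dimensional toy example: the minimizer of the standard 3DVAR cost J(x) = ½(x − x_b)ᵀB⁻¹(x − x_b) + ½(y − Hx)ᵀR⁻¹(y − Hx) coincides with the posterior mode of the corresponding Gaussian model, and is given in closed form by the Kalman gain. All matrices and values below are made up for illustration.

```python
import numpy as np

# Invented 2-state, 1-observation setup.
B = np.array([[2.0, 0.5],
              [0.5, 1.0]])        # background (prior) covariance
R = np.array([[0.5]])             # observation-error covariance
H = np.array([[1.0, 0.0]])        # observe the first state component
xb = np.array([1.0, 2.0])         # background state
y = np.array([2.0])               # observation

def J(x):
    """3DVAR cost: background misfit plus observation misfit."""
    dx = x - xb
    dy = y - H @ x
    return 0.5 * dx @ np.linalg.solve(B, dx) + 0.5 * dy @ np.linalg.solve(R, dy)

# Closed-form minimizer of J via the Kalman gain (also the Gaussian MAP).
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
x_map = xb + K @ (y - H @ xb)

# Sanity check: the analysis beats small perturbations of itself.
for delta in np.eye(2) * 1e-3:
    assert J(x_map) <= J(x_map + delta)
    assert J(x_map) <= J(x_map - delta)
print(x_map)
```

Since J is a strictly convex quadratic, the Kalman-gain formula is its unique minimizer, which is exactly the MAP/filtering correspondence the paper develops in far greater generality.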
A Statistical Perspective on Data Assimilation in Numerical Models
Abstract

Cited by 1 (0 self)
Introduction. The phrase data assimilation is used in a variety of contexts in ocean and atmospheric studies, and the geosciences in general. Broadly speaking, data assimilation involves combining observational data with numerical results to yield a better prediction. It often has the sound of a "black box procedure", where one puts in data at one end and it becomes "assimilated" as it comes out of the other. However, this is certainly not the case. One of our goals in this chapter will be to discuss the nature of data assimilation, in particular to define it in a general sense, and to draw out many of the ideas in assimilation methods that are familiar to statisticians, but phrased in different ways. In this way we hope to fulfill one of the objectives of this volume: to bridge gaps and to find common ground between methods in statistics and the geosciences. Before the advent of extensive observational networks and technological advances that yield large amounts of ...
Approximating Posterior Distributions in Ensemble Forecasting
Abstract

Cited by 1 (1 self)
Ensemble forecasting is used in numerical weather prediction to give an improved estimate of the atmospheric state and to improve measures of forecast accuracy. While the method is effective, there are some fundamental issues in interpreting the ensemble as a statistically valid representation of uncertainty in the state of the atmosphere. Coupled with this interpretation is the difficulty in the specification of large, complex covariance matrices used to combine a numerical forecast with observed data. We approach this problem by representing the prior distribution as a mixture of Gaussian distributions, then generate an ensemble from the posterior and use this ensemble to construct a kernel approximation to the posterior distribution.
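The pipeline this abstract describes — Gaussian-mixture prior, ensemble drawn from the posterior, kernel approximation built from that ensemble — can be sketched in one dimension with conjugate updates. The mixture components, observation, and bandwidth rule below are all invented for illustration; the paper's actual construction is for large atmospheric state vectors.

```python
import numpy as np

rng = np.random.default_rng(2)

# Prior: a two-component Gaussian mixture (parameters made up).
means = np.array([-2.0, 2.0])
sds = np.array([1.0, 1.0])
weights = np.array([0.5, 0.5])

# One noisy observation of the state.
y, sd_obs = 1.5, 1.0

# Conjugate update of each component gives a posterior mixture:
# component variances, means, and re-weighted mixture probabilities.
post_var = 1.0 / (1.0 / sds**2 + 1.0 / sd_obs**2)
post_means = post_var * (means / sds**2 + y / sd_obs**2)
lik = np.exp(-0.5 * (y - means) ** 2 / (sds**2 + sd_obs**2))
post_weights = weights * lik
post_weights /= post_weights.sum()

# Draw a posterior ensemble: pick components, then sample within them.
n = 500
comp = rng.choice(2, size=n, p=post_weights)
ensemble = rng.normal(post_means[comp], np.sqrt(post_var[comp]))

# Gaussian kernel approximation to the posterior density from the ensemble,
# using Silverman's rule of thumb for the bandwidth.
h = 1.06 * ensemble.std() * n ** (-0.2)
def kde(x):
    return np.mean(np.exp(-0.5 * ((x - ensemble) / h) ** 2)) / (h * np.sqrt(2 * np.pi))

print(ensemble.mean(), kde(post_means[1]))
```

The observation at y = 1.5 shifts nearly all of the mixture weight onto the component near +2, and the kernel estimate smooths the finite ensemble back into a continuous density.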
IMS Collections, 805
Abstract
Curse-of-dimensionality revisited: Collapse of the particle filter in very large scale systems