Results 11–20 of 50
Bayes linear calibrated prediction for complex systems
 Journal of the American Statistical Association
Abstract

Cited by 11 (8 self)
A calibration-based approach is developed for predicting the behaviour of a physical system which is modelled by a computer simulator. The approach is based on Bayes linear adjustment using both system observations and evaluations of the simulator at parameterisations which appear to give good matches to those observations. This approach can be applied to complex high-dimensional systems with expensive simulators, where a fully Bayesian approach would be impractical. It is illustrated with an example concerning the collapse of the Thermohaline Circulation (THC) in the Atlantic.
Smoothing Spline Analysis Of Variance For Polychotomous Response Data
, 1998
Abstract

Cited by 9 (1 self)
We consider the penalized likelihood method with smoothing spline ANOVA for estimating nonparametric functions from data involving a polychotomous response. The fitting procedure involves minimizing the penalized likelihood in a Reproducing Kernel Hilbert Space. A one-step block SOR-Newton-Raphson algorithm is used to solve the minimization problem. Generalized Cross-Validation or unbiased risk estimation is used to empirically assess the amount of smoothing (which controls the bias and variance trade-off) at each one-step block SOR-Newton-Raphson iteration. Under some regular smoothness conditions, the one-step block SOR-Newton-Raphson algorithm produces a sequence which converges to the minimizer of the penalized likelihood for fixed smoothing parameters. Monte Carlo simulations are conducted to examine the performance of the algorithm. The method is applied to polychotomous data from the Wisconsin Epidemiological Study of Diabetic Retinopathy to estimate the risks of cause-specific mortality given several potential risk factors at the start of the study. Strategies to obtain smoothing spline estimates for large data sets with a polychotomous response are also proposed in this thesis. Simulation studies are conducted to check the performance of the proposed method.
Algorithmic construction of optimal symmetric Latin hypercube designs
 JOURNAL OF STATISTICAL PLANNING AND INFERENCES
, 2000
Abstract

Cited by 9 (0 self)
We propose symmetric Latin hypercubes for designs of computer experiments. The goal is to offer a compromise between computing effort and design optimality. The proposed class of designs has some advantages over the regular Latin hypercube design with respect to criteria such as entropy and the minimum intersite distance. An exchange algorithm is proposed for constructing optimal symmetric Latin hypercube designs. This algorithm is compared with two existing algorithms by Park (1994) and Morris and Mitchell (1995). Some examples, including a real case study in the automotive industry, are used to illustrate the performance of the new designs and the algorithms.
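The symmetry that defines this class of designs is easy to state: run i and its mirror run n+1−i have levels that sum to n+1 in every column, so the design is invariant under reflection about the centre. The sketch below generates a random symmetric Latin hypercube satisfying that property; it is illustrative only (names and structure are mine) and does not implement the paper's exchange algorithm for optimizing entropy or minimum intersite distance.

```python
import random

def symmetric_lhd(n, d, seed=0):
    """Random n-run, d-column symmetric Latin hypercube on levels 1..n:
    in every column, the levels of runs i and n+1-i sum to n+1.
    Illustrative sketch only, not an optimality-seeking exchange algorithm."""
    rng = random.Random(seed)
    half = n // 2
    design = [[0] * d for _ in range(n)]
    for j in range(d):
        levels = list(range(1, n + 1))
        if n % 2 == 1:                      # odd n: centre run takes the centre level
            mid = (n + 1) // 2
            design[half][j] = mid
            levels.remove(mid)
        # pair each remaining level with its mirror n+1-level
        pairs = [(a, n + 1 - a) for a in levels if a < n + 1 - a]
        rng.shuffle(pairs)
        for i, (a, b) in enumerate(pairs):
            if rng.random() < 0.5:          # randomly orient each mirror pair
                a, b = b, a
            design[i][j] = a
            design[n - 1 - i][j] = b
    return design
```

Each column is a permutation of 1..n (so the result is a Latin hypercube), and the mirror constraint holds by construction; an exchange algorithm would then swap pairs to improve a space-filling criterion while preserving both properties.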
Emergency control and its strategies
Abstract

Cited by 8 (7 self)
The objective of this paper is to discuss research trends in the context of power system emergency control. First, different possible strategies are discussed for the design of emergency control schemes. Then some new research directions are presented. The paper does not restrict its scope to a particular type of stability problem. Rather, it aims at providing a global view of emergency control and discusses the potential impact of new approaches.
Assessing Linearity in High Dimensions
, 2000
Abstract

Cited by 7 (3 self)
This paper presents a quasi-regression method for determining the degree of linearity in a function, where the cost grows only as nd. A bias-corrected version of quasi-regression is able to estimate the degree of linearity with a sample size of order d
A multi-points criterion for deterministic parallel global optimization based on Gaussian processes
 Journal of Global Optimization, in revision
, 2009
Abstract

Cited by 7 (1 self)
The optimization of expensive-to-evaluate functions generally relies on metamodel-based exploration strategies. Many deterministic global optimization algorithms used in the field of computer experiments are based on Kriging (Gaussian process regression). Starting with a spatial predictor including a measure of uncertainty, they proceed by iteratively choosing the point maximizing a criterion which is a compromise between predicted performance and uncertainty. Distributing the evaluation of such numerically expensive objective functions on many processors is an appealing idea. Here we investigate a multi-points optimization criterion, the multi-points expected improvement (qEI), aimed at choosing several points at the same time. An analytical expression of the qEI is given when q = 2, and a consistent statistical estimate is given for the general case. We then propose two classes of heuristic strategies meant to approximately optimize the qEI, and apply them to Gaussian processes and to the classical Branin-Hoo test function. It is finally demonstrated within the covered example that the latter strategies perform as well as the best Latin hypercubes and uniform designs ever found by simulation (2000 designs drawn at random for every q ∈ [1, 10]).
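For readers unfamiliar with the single-point criterion that qEI generalizes: under a Gaussian predictive distribution Y ~ N(μ, σ²) with current best observed value f_min, the classical expected improvement has the closed form EI = (f_min − μ)Φ(z) + σφ(z), where z = (f_min − μ)/σ and φ, Φ are the standard normal pdf and cdf. The sketch below implements this standard q = 1 case (for minimization); it is not the paper's q-point criterion or its estimator.

```python
import math

def expected_improvement(mu, sigma, f_min):
    """One-point expected improvement EI(x) = E[max(f_min - Y, 0)]
    for Y ~ N(mu, sigma^2); the q = 1 case of the qEI criterion."""
    if sigma <= 0.0:
        return max(f_min - mu, 0.0)         # degenerate (noise-free known) case
    z = (f_min - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # standard normal pdf
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # standard normal cdf
    return (f_min - mu) * cdf + sigma * pdf
```

Maximizing EI over candidate points balances exploitation (low predicted mean μ) against exploration (high predictive uncertainty σ), which is exactly the compromise described above.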
Uncertainty Quantification In Large Computational Engineering Models
 In Proceedings of the 42nd AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference, number AIAA-2001-1455
, 2001
Abstract

Cited by 6 (1 self)
While a wealth of experience in the development of uncertainty quantification methods and software tools exists at present, a cohesive software package utilizing massively parallel computing resources does not. The thrust of the work to be discussed herein is the development of such a toolkit, which has leveraged existing software frameworks (e.g., DAKOTA (Design Analysis Kit for OpTimizAtion)) where possible, and has undertaken additional development efforts when necessary. The contributions of this paper are twofold. First, the design and structure of the toolkit from a software perspective will be discussed, detailing some of its distinguishing features. Second, the toolkit's capabilities will be demonstrated by applying a subset of its available uncertainty quantification techniques to an example problem involving multiple engineering disciplines: nonlinear solid mechanics and soil mechanics. This example problem will demonstrate the toolkit's suitability for quantifying uncertainty in engineering applications of interest modeled using very large computational system models.
Estimating Functions Evaluated by Simulation: a Bayesian/Analytic Approach
, 1997
Abstract

Cited by 4 (1 self)
Consider a function f : B → ℝ, where B is a compact subset of ℝ^m, and consider a "simulation" used to estimate f(x), x ∈ B, with the following properties: the simulation can switch from one x ∈ B to another in zero time, and a simulation at x lasting t units of time yields a random variable with mean f(x) and variance v(x)/t. With such a simulation we can divide T units of time into as many separate simulations as we like. Therefore, in principle we can design an "experiment" that spends τ(A) units of time simulating points in each A ∈ 𝔅, where 𝔅 is the Borel σ-field on B and τ is an arbitrary finite measure on (B, 𝔅). We call a design specified by a measure τ a "generalized design". We propose an approximation for f based on the data from a generalized design. When τ is discrete, the approximation, f̂, reduces to a Kriging-like estimator. We study discrete designs in detail, including asymptotics (as the length of the simulation increases) and a numerical procedure for finding ...
Sliced space-filling designs
 Biometrika
Abstract

Cited by 4 (4 self)
We propose an approach to constructing a new type of design, a sliced space-filling design, intended for computer experiments with qualitative and quantitative factors. The approach starts with constructing a Latin hypercube design based on a special orthogonal array for the quantitative factors and then partitions the design into groups corresponding to different level combinations of the qualitative factors. The points in each group have good space-filling properties. Some illustrative examples are given.
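A minimal way to see the sliced structure: build an n = m·s-run Latin hypercube whose s slices (one per qualitative level combination) each collapse to an m-run Latin hypercube when fine levels are merged via ⌈level/s⌉. The sketch below achieves this by giving each slice exactly one fine level from every coarse bin; it uses random assignment rather than the special orthogonal arrays of the paper, and all names are illustrative.

```python
import random

def sliced_lhd(m, s, d, seed=0):
    """n = m*s runs in d columns, split into s slices of m runs.
    The full design is a Latin hypercube on levels 1..n, and each slice
    collapses to a Latin hypercube on 1..m under level -> ceil(level/s).
    Illustrative random construction, not an orthogonal-array-based one."""
    rng = random.Random(seed)
    slices = [[[0] * d for _ in range(m)] for _ in range(s)]
    for j in range(d):
        for k in range(m):                   # coarse bin k holds fine levels k*s+1..(k+1)*s
            fine = list(range(k * s + 1, (k + 1) * s + 1))
            rng.shuffle(fine)
            for t in range(s):               # one fine level from this bin per slice
                slices[t][k][j] = fine[t]
        for t in range(s):                   # shuffle run order within each slice column
            col = [slices[t][k][j] for k in range(m)]
            rng.shuffle(col)
            for k in range(m):
                slices[t][k][j] = col[k]
    return slices
```

Because every bin contributes exactly one level to each slice, each slice sees every coarse level once per column (good space-filling within groups), while the union of slices covers all n fine levels (good space-filling overall).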