Results 1–10 of 35
Computer Experiments
, 1996
Abstract

Cited by 67 (5 self)
Introduction Deterministic computer simulations of physical phenomena are becoming widely used in science and engineering. Computers are used to describe the flow of air over an airplane wing, combustion of gases in a flame, behavior of a metal structure under stress, safety of a nuclear reactor, and so on. Some of the most widely used computer models, and the ones that lead us to work in this area, arise in the design of the semiconductors used in the computers themselves. A process simulator starts with a data structure representing an unprocessed piece of silicon and simulates steps such as oxidation, etching, and ion implantation that produce a semiconductor device such as a transistor. A device simulator takes a description of such a device and simulates the flow of current through it under varying conditions to determine properties of the device such as its switching speed and the critical voltage at which it switches. A circuit simulator takes a list of devices and the …
Latin Hypercube Sampling and the Propagation of Uncertainty in Analyses of Complex Systems
, 2002
Uncertainty Analysis and other Inference Tools for Complex Computer Codes
, 1998
Abstract

Cited by 20 (6 self)
This paper builds on work by Haylock and O'Hagan which developed a Bayesian approach to uncertainty analysis. The generic problem is to make posterior inference about the output of a complex computer code, and the specific problem of uncertainty analysis is to make inference when the "true" values of the input parameters are unknown. Given the distribution of the input parameters (often a subjective distribution derived from expert opinion), we wish to make inference about the implied distribution of the output. The computer code is sufficiently complex that the time to compute the output for any input configuration is substantial. The Bayesian approach was shown to improve dramatically on the classical approach, which is based on drawing a sample of values of the input parameters and thereby obtaining a sample from the output distribution. We review the basic Bayesian approach to the generic problem of inference for complex computer codes, and present some recent advances: inference about the distribution and quantile functions of the uncertainty distribution, calibration of models, and the use of runs of the computer code at different levels of complexity to make efficient use of the quicker, cruder versions of the code. The emphasis is on practical applications. Keywords: computational experiment; simulation; Gaussian process; sensitivity analysis; uncertainty distribution; calibration; multilevel codes; model inadequacy.
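The "classical approach" this abstract contrasts with is plain Monte Carlo propagation: sample the inputs from their assumed distributions and run the code on each draw. A minimal sketch of that baseline follows; the simulator function and the input distributions are hypothetical stand-ins, not from the paper.

```python
import math
import random

def toy_simulator(x1, x2):
    # Hypothetical stand-in for an expensive computer code.
    return math.sin(x1) + x2 ** 2

rng = random.Random(0)

# Classical (Monte Carlo) uncertainty analysis: sample the inputs
# from their assumed distributions and push each draw through the
# code, yielding a sample from the output distribution.
outputs = []
for _ in range(1000):
    x1 = rng.gauss(0.0, 1.0)    # assumed distribution for input 1
    x2 = rng.uniform(0.0, 1.0)  # assumed distribution for input 2
    outputs.append(toy_simulator(x1, x2))

mean = sum(outputs) / len(outputs)
var = sum((y - mean) ** 2 for y in outputs) / (len(outputs) - 1)
print(f"output mean ~ {mean:.3f}, variance ~ {var:.3f}")
```

The Bayesian emulator approach aims to get comparable inference from far fewer code runs, which matters when a single run of the real code takes hours rather than microseconds.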
Sampling Strategies for Computer Experiments: Design and Analysis
, 2001
Abstract

Cited by 20 (2 self)
Computer-based simulation and analysis is used extensively in engineering for a variety of tasks. Despite the steady and continuing growth of computing power and speed, the computational cost of complex high-fidelity engineering analyses and simulations limits their use in important areas like design optimization and reliability analysis. Statistical approximation techniques such as design of experiments and response surface methodology are becoming widely used in engineering to minimize the computational expense of running such computer analyses and circumvent many of these limitations. In this paper, we compare and contrast five experimental design types and four approximation model types in terms of their capability to generate accurate approximations for two engineering applications with typical engineering behaviors and a wide range of nonlinearity. The first example involves the analysis of a two-member frame that has three input variables and three responses of interest. The second example simulates the rollover potential of a semi-tractor-trailer for different combinations of input variables and braking and steering levels. Detailed error analysis reveals that uniform designs provide good sampling for generating accurate approximations using different sample sizes, while kriging models provide accurate approximations that are robust for use with a variety of experimental designs and sample sizes.
Algorithmic construction of optimal symmetric Latin hypercube designs
 Journal of Statistical Planning and Inference
, 2000
Abstract

Cited by 9 (0 self)
We propose symmetric Latin hypercubes for the design of computer experiments. The goal is to offer a compromise between computing effort and design optimality. The proposed class of designs has some advantages over the regular Latin hypercube design with respect to criteria such as entropy and the minimum intersite distance. An exchange algorithm is proposed for constructing optimal symmetric Latin hypercube designs. This algorithm is compared with two existing algorithms by Park (1994) and Morris and Mitchell (1995). Some examples, including a real case study in the automotive industry, are used to illustrate the performance of the new designs and the algorithms.
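The defining property of a symmetric Latin hypercube is closure under the reflection level → n + 1 − level: every design point has a mirrored counterpart. The sketch below constructs one such design directly (this is not the paper's exchange algorithm; the pairing scheme and the even-n restriction are simplifying assumptions made here for illustration).

```python
import random

def symmetric_lhd(n, d, rng):
    """Random symmetric Latin hypercube with levels 1..n in each of
    d columns: row k and row n-1-k are reflections of each other.
    Sketch only; assumes n is even so rows can be mirror-paired."""
    assert n % 2 == 0, "this sketch mirror-pairs rows, so n must be even"
    half = n // 2
    cols = []
    for _ in range(d):
        # Pick one level from each mirror pair {i, n+1-i} for the top
        # half of the column, in random order...
        top = [rng.choice((i, n + 1 - i)) for i in range(1, half + 1)]
        rng.shuffle(top)
        # ...and fill the bottom half with the reflected levels so
        # that row k and row n-1-k always sum to n+1, column-wise.
        col = top + [n + 1 - v for v in reversed(top)]
        cols.append(col)
    # Transpose column storage into n design points of dimension d.
    return [tuple(c[i] for c in cols) for i in range(n)]

design = symmetric_lhd(8, 3, random.Random(0))
for row in design:
    print(row)
```

Each column is still a permutation of 1..n (one level from every mirror pair plus its reflection), so the result is a valid Latin hypercube; the symmetry is what shrinks the search space the exchange algorithm has to explore.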
State-of-the-Art Review: A User’s Guide to the Brave New World of Designing Simulation Experiments
 INFORMS Journal on Computing
, 2005
Abstract

Cited by 9 (2 self)
DOI 10.1287/ijoc.1050.0136 © 2005 INFORMS. Many simulation practitioners can get more from their analyses by using the statistical theory on design of experiments (DOE) developed specifically for exploring computer models. We discuss a toolkit of designs for simulation practitioners with limited DOE expertise who want to select a design and an appropriate analysis for their experiments. Furthermore, we provide a research agenda listing problems in the design of simulation experiments (as opposed to real-world experiments) that require more investigation. We consider three types of practical problems: (1) developing a basic understanding of a particular simulation model or system, (2) finding robust decisions or policies, as opposed to so-called optimal solutions, and (3) comparing the merits of various decisions or policies. Our discussion emphasizes aspects that are typical for simulation, such as having many more factors than in real-world experiments, and the sequential nature of the data collection. Because the same problem type may be addressed through different design types, we discuss quality attributes of designs, such as the ease of design construction, the flexibility for analysis, and efficiency considerations. Moreover, the selection of the design type depends on the metamodel (response surface) that the analysts tentatively assume; for …
Choosing the Sample Size of a Computer Experiment: A Practical Guide
, 2008
Abstract

Cited by 6 (0 self)
We produce reasons and evidence supporting the informal rule that the number of runs for an effective initial computer experiment should be about 10 times the input dimension. Our arguments quantify two key characteristics of computer codes that affect the sample size required for a desired level of accuracy when approximating the code via a Gaussian process (GP). The first characteristic is the total sensitivity of a code output variable to all input variables. The second corresponds to the way this total sensitivity is distributed across the input variables, specifically the possible presence of a few prominent input factors and many impotent ones (effect sparsity). Both measures relate directly to the correlation structure in the GP approximation of the code. In this way, the article moves towards a more formal treatment of sample size for a computer experiment. The evidence supporting these arguments stems primarily from a simulation study and from specific codes modeling climate and ligand activation of G-protein.
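The rule of thumb itself is just arithmetic; a one-line helper makes the convention explicit (the function name and the example dimension below are illustrative, not from the paper):

```python
def initial_design_size(input_dim, multiplier=10):
    """Informal n = 10 * d rule for sizing an *initial* computer
    experiment; the paper examines when this rule is adequate."""
    return multiplier * input_dim

# e.g. a hypothetical code with 8 uncertain inputs
print(initial_design_size(8))
```

Note this sizes only the initial design; follow-up runs can then be added adaptively where the GP approximation is still inaccurate.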
Optimal experimental design and some related control problems
, 2008
Abstract

Cited by 5 (0 self)
This paper traces the strong relations between experimental design and control, such as the use of optimal inputs to obtain precise parameter estimation in dynamical systems and the introduction of suitably designed perturbations in adaptive control. The mathematical background of optimal experimental design is briefly presented, and the role of experimental design in the asymptotic properties of estimators is emphasized. Although most of the paper concerns parametric models, some results are also presented for statistical learning and prediction with nonparametric models.
Orthogonal-Maximin Latin Hypercube Designs
Abstract

Cited by 4 (1 self)
A randomly generated Latin hypercube design (LHD) can be quite structured: the variables may be highly correlated or the design may not have good space-filling properties. There are procedures to find good LHDs by minimizing the pairwise correlations or maximizing the intersite distances. In this article we show that these two criteria need not agree with each other. In fact, maximization of intersite distances can result in LHDs where the variables are highly correlated, and vice versa. Therefore, we propose a multiobjective optimization approach to find good LHDs by combining correlation and distance performance measures. We also propose a new exchange algorithm for efficiently generating such designs. Several examples are presented to show that the new algorithm is fast and that the optimal designs are good in terms of both the correlation and distance criteria.
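The two criteria being traded off here are straightforward to compute for any design. The sketch below generates a plain random LHD and evaluates both the maximum absolute pairwise correlation and the minimum intersite (Euclidean) distance; it is only the evaluation of the criteria, not the authors' exchange algorithm, and the scaling to cell midpoints is one common convention among several.

```python
import math
import random

def random_lhd(n, d, rng):
    """Random Latin hypercube: each column is a random permutation
    of n levels, scaled to (0, 1) via cell midpoints."""
    cols = []
    for _ in range(d):
        perm = list(range(n))
        rng.shuffle(perm)
        cols.append([(p + 0.5) / n for p in perm])
    # Transpose into n design points of dimension d.
    return [tuple(c[i] for c in cols) for i in range(n)]

def max_abs_correlation(design):
    """Largest |pairwise correlation| over all column pairs --
    the orthogonality criterion."""
    n, d = len(design), len(design[0])
    worst = 0.0
    for j in range(d):
        for k in range(j + 1, d):
            xj = [p[j] for p in design]
            xk = [p[k] for p in design]
            mj, mk = sum(xj) / n, sum(xk) / n
            cov = sum((a - mj) * (b - mk) for a, b in zip(xj, xk))
            sj = math.sqrt(sum((a - mj) ** 2 for a in xj))
            sk = math.sqrt(sum((b - mk) ** 2 for b in xk))
            worst = max(worst, abs(cov / (sj * sk)))
    return worst

def min_intersite_distance(design):
    """Smallest Euclidean distance between any two design points --
    the maximin (space-filling) criterion."""
    return min(math.dist(p, q)
               for i, p in enumerate(design)
               for q in design[i + 1:])

design = random_lhd(10, 3, random.Random(1))
print(max_abs_correlation(design), min_intersite_distance(design))
```

An exchange algorithm would repeatedly swap two levels within a column and keep the swap when a weighted combination of these two measures improves, which is exactly where the multiobjective trade-off the abstract describes enters.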
Bayesian Calibration of Complex Computer Models
Abstract

Cited by 3 (1 self)
We consider prediction and uncertainty analysis for systems which are approximated using complex mathematical codes. Such models, implemented as computer codes, are often generic in the sense that by suitable choice of some of the model's input parameters the code can be used to predict behaviour of the system in a variety of specific applications. However, in any specific application the values of necessary parameters may be unknown. In this case, physical observations of the system in the specific context are used to learn about the unknown parameters. The process of fitting the model to the observed data by adjusting the parameters is known as calibration. Calibration is typically effected by ad hoc fitting, and after calibration the model is used, with the fitted input values, to predict future behaviour of the system. We present a Bayesian calibration technique which improves on this traditional approach in two respects. First, the predictions allow for remaining uncertainty over ...