Results 1–10 of 35
Bayesian Experimental Design: A Review
Statistical Science, 1995
"... This paper reviews the literature on Bayesian experimental design, both for linear and nonlinear models. A unified view of the topic is presented by putting experimental design in a decision theoretic framework. This framework justifies many optimality criteria, and opens new possibilities. Various ..."
Abstract

Cited by 171 (1 self)
 Add to MetaCart
This paper reviews the literature on Bayesian experimental design, both for linear and nonlinear models. A unified view of the topic is presented by putting experimental design in a decision-theoretic framework. This framework justifies many optimality criteria, and opens new possibilities. Various design criteria become part of a single, coherent approach.
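The decision-theoretic framing has a concrete special case worth sketching: for a linear model with Gaussian prior, expected information gain reduces to Bayesian D-optimality, i.e. maximizing the log-determinant of the posterior precision of the coefficients. A minimal sketch (the prior precision `R` and the toy one-factor designs below are illustrative assumptions, not from the paper):

```python
import numpy as np

def bayesian_d_criterion(X, sigma2=1.0, R=None):
    """Bayesian D-optimality: log det of the posterior precision of beta
    in the linear model y = X beta + eps, eps ~ N(0, sigma2 I),
    with Gaussian prior beta ~ N(0, R^{-1})."""
    p = X.shape[1]
    if R is None:
        R = np.eye(p)  # unit prior precision (an assumption for the demo)
    precision = X.T @ X / sigma2 + R
    return np.linalg.slogdet(precision)[1]

def design(points):
    """Design matrix with intercept for a single factor on [-1, 1]."""
    x = np.asarray(points)
    return np.column_stack([np.ones_like(x), x])

spread = design([-1.0, -1.0, 1.0, 1.0])   # points at the extremes
center = design([-0.1, 0.0, 0.0, 0.1])    # points bunched near the centre

better = bayesian_d_criterion(spread) > bayesian_d_criterion(center)  # True
```

As the decision-theoretic view predicts, the spread design carries more expected information about the slope than the bunched one.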
Computer Experiments
1996
"... Introduction Deterministic computer simulations of physical phenomena are becoming widely used in science and engineering. Computers are used to describe the flow of air over an airplane wing, combustion of gasses in a flame, behavior of a metal structure under stress, safety of a nuclear reactor, a ..."
Abstract

Cited by 67 (5 self)
 Add to MetaCart
Introduction: Deterministic computer simulations of physical phenomena are becoming widely used in science and engineering. Computers are used to describe the flow of air over an airplane wing, combustion of gases in a flame, behavior of a metal structure under stress, safety of a nuclear reactor, and so on. Some of the most widely used computer models, and the ones that lead us to work in this area, arise in the design of the semiconductors used in the computers themselves. A process simulator starts with a data structure representing an unprocessed piece of silicon and simulates the steps such as oxidation, etching and ion injection that produce a semiconductor device such as a transistor. A device simulator takes a description of such a device and simulates the flow of current through it under varying conditions to determine properties of the device such as its switching speed and the critical voltage at which it switches. A circuit simulator takes a list of devices and the …
Bayesian Calibration of Computer Models
Journal of the Royal Statistical Society, Series B, Methodological, 2000
"... this paper a Bayesian approach to the calibration of computer models. We represent the unknown inputs as a parameter vector `. Using the observed data we derive the posterior distribution of `, which in particular quantifies the `residual uncertainty' about ..."
Abstract

Cited by 62 (1 self)
 Add to MetaCart
This paper presents a Bayesian approach to the calibration of computer models. We represent the unknown inputs as a parameter vector θ. Using the observed data we derive the posterior distribution of θ, which in particular quantifies the 'residual uncertainty' about θ. …
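The calibration idea in this abstract can be sketched with a toy model: given field observations and a flat prior, the posterior over the unknown input θ is evaluated on a grid. The `simulator` stand-in and all numbers below are illustrative assumptions; the models the paper targets are far too expensive for this brute-force approach:

```python
import numpy as np

def simulator(theta, x):
    # Stand-in for an expensive computer model (an assumption for the demo)
    return theta * x

def calibration_posterior(theta_grid, x_obs, y_obs, noise_sd=0.1):
    """Posterior over the unknown input theta on a grid, under a flat
    prior and Gaussian observation error."""
    log_post = np.array([
        -0.5 * np.sum((y_obs - simulator(t, x_obs)) ** 2) / noise_sd ** 2
        for t in theta_grid
    ])
    log_post -= log_post.max()          # stabilise before exponentiating
    post = np.exp(log_post)
    return post / post.sum()

# Synthetic field data generated with true theta = 2
rng = np.random.default_rng(0)
x_obs = np.linspace(0, 1, 20)
y_obs = simulator(2.0, x_obs) + 0.1 * rng.standard_normal(20)

grid = np.linspace(0, 4, 401)
post = calibration_posterior(grid, x_obs, y_obs)
theta_hat = grid[np.argmax(post)]       # posterior mode, sits near 2
```

The spread of `post` around its mode is exactly the 'residual uncertainty' about θ that the abstract refers to.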
Bayesian Treed Gaussian Process Models with an Application to Computer Modeling
Journal of the American Statistical Association, 2007
"... This paper explores nonparametric and semiparametric nonstationary modeling methodologies that couple stationary Gaussian processes and (limiting) linear models with treed partitioning. Partitioning is a simple but effective method for dealing with nonstationarity. Mixing between full Gaussian proce ..."
Abstract

Cited by 44 (15 self)
 Add to MetaCart
This paper explores nonparametric and semiparametric nonstationary modeling methodologies that couple stationary Gaussian processes and (limiting) linear models with treed partitioning. Partitioning is a simple but effective method for dealing with nonstationarity. Mixing between full Gaussian processes and simple linear models can yield a more parsimonious spatial model while significantly reducing computational effort. The methodological developments and statistical computing details which make this approach efficient are described in detail. Illustrations of our model are given for both synthetic and real datasets.
Key words: recursive partitioning, nonstationary spatial model, nonparametric regression, Bayesian model averaging
Latin Hypercube Sampling and the Propagation of Uncertainty in Analyses of Complex Systems
2002
"... ..."
A framework for validation of computer models
2002
"... In this paper, we present a framework that enables computer model evaluation oriented towards answering the question: Does the computer model adequately represent reality? The proposed validation framework is a sixstep procedure based upon Bayesian statistical methodology. The Bayesian methodology ..."
Abstract

Cited by 35 (11 self)
 Add to MetaCart
In this paper, we present a framework that enables computer model evaluation oriented towards answering the question: Does the computer model adequately represent reality? The proposed validation framework is a six-step procedure based upon Bayesian statistical methodology. The Bayesian methodology is particularly suited to treating the major issues associated with the validation process: quantifying multiple sources of error and uncertainty in computer models; combining multiple sources of information; and updating validation assessments as new information is acquired. Moreover, it allows inferential statements to be made about predictive error associated with model predictions in untested situations. The framework is implemented in two test bed models (a vehicle crash model and a resistance …).
Uncertainty Analysis and other Inference Tools for Complex Computer Codes
1998
"... This paper builds on work by Haylock and O'Hagan which developed a Bayesian approach to uncertainty analysis. The generic problem is to make posterior inference about the output of a complex computer code, and the specific problem of uncertainty analysis is to make inference when the "true" values o ..."
Abstract

Cited by 20 (6 self)
 Add to MetaCart
This paper builds on work by Haylock and O'Hagan which developed a Bayesian approach to uncertainty analysis. The generic problem is to make posterior inference about the output of a complex computer code, and the specific problem of uncertainty analysis is to make inference when the "true" values of the input parameters are unknown. Given the distribution of the input parameters (which is often a subjective distribution derived from expert opinion), we wish to make inference about the implied distribution of the output. The computer code is sufficiently complex that the time to compute the output for any input configuration is substantial. The Bayesian approach was shown to improve dramatically on the classical approach, which is based on drawing a sample of values of the input parameters and thereby obtaining a sample from the output distribution. We review the basic Bayesian approach to the generic problem of inference for complex computer codes, and present some recent advances: inference about the distribution of quantile functions of the uncertainty distribution, calibration of models, and the use of runs of the computer code at different levels of complexity to make efficient use of the quicker, cruder versions of the code. The emphasis is on practical applications.
Keywords: COMPUTATIONAL EXPERIMENT; SIMULATION; GAUSSIAN PROCESS; SENSITIVITY ANALYSIS; UNCERTAINTY DISTRIBUTION; CALIBRATION; MULTILEVEL CODES; MODEL INADEQUACY.
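The core device of this line of work, replacing the slow code by a Gaussian-process emulator fitted to a few runs and propagating the input distribution through the emulator instead, can be sketched as follows (the `slow_code` stand-in, kernel, and hyperparameters are illustrative assumptions, not the paper's):

```python
import numpy as np

def gp_emulator(X_train, y_train, X_new, length=0.3, jitter=1e-8):
    """Posterior mean of a minimal Gaussian-process emulator with a
    squared-exponential kernel, fitted to a handful of code runs."""
    def k(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)
    K = k(X_train, X_train) + jitter * np.eye(len(X_train))
    alpha = np.linalg.solve(K, y_train)
    return k(X_new, X_train) @ alpha

def slow_code(x):
    # Stand-in for the expensive simulator (an assumption for the demo)
    return np.sin(3 * x)

X_train = np.linspace(0, 1, 8)   # only 8 runs of the code are affordable
y_train = slow_code(X_train)

# Uncertainty analysis: push draws of the uncertain input through the
# cheap emulator rather than through the code itself.
rng = np.random.default_rng(2)
x_draws = rng.uniform(0, 1, 10000)
output_mean = gp_emulator(X_train, y_train, x_draws).mean()
```

Eight code runs plus ten thousand emulator evaluations stand in for ten thousand code runs, which is the "dramatic improvement" over the classical sampling approach that the abstract describes.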
Sequential optimal design of neurophysiology experiments
2008
"... Adaptively optimizing experiments has the potential to significantly reduce the number of trials needed to build parametric statistical models of neural systems. However, application of adaptive methods to neurophysiology has been limited by severe computational challenges. Since most neurons are hi ..."
Abstract

Cited by 18 (6 self)
 Add to MetaCart
Adaptively optimizing experiments has the potential to significantly reduce the number of trials needed to build parametric statistical models of neural systems. However, application of adaptive methods to neurophysiology has been limited by severe computational challenges. Since most neurons are high-dimensional systems, optimizing neurophysiology experiments requires computing high-dimensional integrations and optimizations in real time. Here we present a fast algorithm for choosing the most informative stimulus by maximizing the mutual information between the data and the unknown parameters of a generalized linear model (GLM) which we want to fit to the neuron's activity. We rely on important log-concavity and asymptotic normality properties of the posterior to facilitate the required computations. Our algorithm requires only low-rank matrix manipulations and a 2-dimensional search to choose the optimal stimulus. The average running time of these operations scales quadratically with the dimensionality of the GLM, making real-time adaptive experimental design feasible even for high-dimensional stimulus and parameter spaces. …
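A drastically simplified, one-dimensional caricature of the idea, choosing the stimulus that maximizes an approximate information gain for a Poisson GLM under a Gaussian posterior, might look like this. The gain formula and all settings are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def info_gain(x, mu, sigma2):
    """Approximate information gain of stimulus x for a 1D Poisson GLM
    with rate lambda = exp(w * x) and Gaussian posterior w ~ N(mu, sigma2).
    Uses a Fisher-information (Laplace-style) approximation:
    gain ~ 0.5 * log(1 + sigma2 * x^2 * E[lambda])."""
    expected_rate = np.exp(mu * x + 0.5 * sigma2 * x ** 2)  # lognormal mean
    return 0.5 * np.log1p(sigma2 * x ** 2 * expected_rate)

# Choose the most informative stimulus from a candidate set by 1D search
candidates = np.linspace(-2, 2, 401)
mu, sigma2 = 0.5, 1.0
best = candidates[np.argmax(info_gain(candidates, mu, sigma2))]
```

With a positive posterior mean, stimuli driving the neuron hardest (here the largest positive `x`) are the most informative, since the Poisson Fisher information grows with the expected rate.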
Design optimization of hierarchically decomposed multilevel system under uncertainty
Proceedings of the ASME 2004 Design Engineering Technical Conferences, Salt Lake City, Utah, 28 September–2 October, DETC2004/DAC57357, 2004
"... This paper presents a methodology for design optimization of decomposed systems in the presence of uncertainties. We extend the analytical target cascading (ATC) formulation to probabilistic design by treating stochastic quantities as random variables and parameters and posing reliabilitybased desi ..."
Abstract

Cited by 16 (12 self)
 Add to MetaCart
This paper presents a methodology for design optimization of decomposed systems in the presence of uncertainties. We extend the analytical target cascading (ATC) formulation to probabilistic design by treating stochastic quantities as random variables and parameters and posing reliability-based design constraints. We model the propagation of uncertainty throughout the multilevel hierarchy of elements that comprise the decomposed system by using the advanced mean value (AMV) method to generate the required probability distributions of nonlinear responses. We utilize appropriate metamodeling techniques for simulation-based design problems. A simple yet illustrative hierarchical bilevel engine design problem is used to demonstrate the proposed methodology.
Connection Between Uniformity and Aberration in Regular Fractions of Two-level Factorials
Biometrika, 1999
"... We show a link between two apparently unrelated areas, namely uniformity and minimum aberration, both of which have been of substantial recent interest. Specifically, with reference to regular fractions of twolevel factorials, we derive an expression for the centered L 2 discrepancy measure for un ..."
Abstract

Cited by 14 (9 self)
 Add to MetaCart
We show a link between two apparently unrelated areas, namely uniformity and minimum aberration, both of which have been of substantial recent interest. Specifically, with reference to regular fractions of two-level factorials, we derive an expression for the centered L2 discrepancy measure for uniformity in terms of the wordlength pattern. This result indicates, in particular, an excellent behavior of minimum aberration designs with regard to uniformity and provides further justification for the popular criterion of minimum aberration.
Keywords: Centered L2 discrepancy, fractional factorial designs, minimum aberration, uniformity, wordlength pattern. 1991 AMS Subject Classification: 62K15, 62K99.
In the context of fractional factorial plans, the criterion of minimum aberration has been of considerable recent interest; see Chen, Sun and Wu (1993) for an excellent review and Suen, Chen and Wu (1997) for more recent results and further references. …
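The centered L2 discrepancy discussed here has a closed form due to Hickernell, which makes comparisons between designs straightforward to compute. A sketch (the two 4-run designs below are illustrative assumptions, with two-level factors coded as 1/4 and 3/4 in [0, 1]):

```python
import numpy as np

def centered_l2_discrepancy(X):
    """Centered L2 discrepancy (Hickernell's closed form) of a design
    whose rows lie in [0, 1]^d; smaller values mean greater uniformity."""
    n, d = X.shape
    z = np.abs(X - 0.5)
    term1 = (13.0 / 12.0) ** d
    term2 = (2.0 / n) * np.prod(1 + 0.5 * z - 0.5 * z ** 2, axis=1).sum()
    diff = np.abs(X[:, None, :] - X[None, :, :])
    pair = np.prod(1 + 0.5 * z[:, None, :] + 0.5 * z[None, :, :]
                   - 0.5 * diff, axis=2)
    term3 = pair.sum() / n ** 2
    return np.sqrt(term1 - term2 + term3)

half_fraction = 0.25 + 0.5 * np.array([   # regular 2^{3-1} fraction, I = ABC
    [0, 0, 0], [0, 1, 1], [1, 0, 1], [1, 1, 0]])
clumped = 0.25 + 0.5 * np.array([         # same 4 runs piled on one side
    [0, 0, 0], [0, 0, 1], [0, 1, 0], [0, 0, 0]])
```

The regular fraction, which also has minimum aberration among 4-run half fractions, attains the smaller discrepancy, in line with the connection the abstract establishes.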