Results 1–10 of 47
Computer Experiments
, 1996
Abstract

Cited by 67 (5 self)
Introduction Deterministic computer simulations of physical phenomena are becoming widely used in science and engineering. Computers are used to describe the flow of air over an airplane wing, combustion of gases in a flame, behavior of a metal structure under stress, safety of a nuclear reactor, and so on. Some of the most widely used computer models, and the ones that lead us to work in this area, arise in the design of the semiconductors used in the computers themselves. A process simulator starts with a data structure representing an unprocessed piece of silicon and simulates the steps such as oxidation, etching and ion injection that produce a semiconductor device such as a transistor. A device simulator takes a description of such a device and simulates the flow of current through it under varying conditions to determine properties of the device such as its switching speed and the critical voltage at which it switches. A circuit simulator takes a list of devices and the
Latin Hypercube Sampling and the Propagation of Uncertainty in Analyses of Complex Systems
, 2002
Flexibility and Efficiency Enhancements for Constrained Global Design Optimization with Kriging Approximations
, 2002
Uncertainty Analysis and other Inference Tools for Complex Computer Codes
, 1998
Abstract

Cited by 20 (6 self)
This paper builds on work by Haylock and O'Hagan which developed a Bayesian approach to uncertainty analysis. The generic problem is to make posterior inference about the output of a complex computer code, and the specific problem of uncertainty analysis is to make inference when the "true" values of the input parameters are unknown. Given the distribution of the input parameters (which is often a subjective distribution derived from expert opinion), we wish to make inference about the implied distribution of the output. The computer code is sufficiently complex that the time to compute the output for any input configuration is substantial. The Bayesian approach was shown to improve dramatically on the classical approach, which is based on drawing a sample of values of the input parameters and thereby obtaining a sample from the output distribution. We review the basic Bayesian approach to the generic problem of inference for complex computer codes, and present some recent advances: inference about the distribution of quantile functions of the uncertainty distribution, calibration of models, and the use of runs of the computer code at different levels of complexity to make efficient use of the quicker, cruder versions of the code. The emphasis is on practical applications. Keywords: COMPUTATIONAL EXPERIMENT; SIMULATION; GAUSSIAN PROCESS; SENSITIVITY ANALYSIS; UNCERTAINTY DISTRIBUTION; CALIBRATION; MULTILEVEL CODES; MODEL INADEQUACY.
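The core idea in this abstract, replacing expensive code runs with a cheap Gaussian-process emulator before pushing the input distribution through, can be sketched in a few lines of plain Python. This is a minimal illustration, not the paper's method: the squared-exponential kernel, the length-scale, the toy `code` function and the Gaussian input distribution are all assumptions chosen for the example.

```python
import math, random

def k(x, y, ls=0.3):
    # squared-exponential covariance kernel with length-scale ls
    return math.exp(-0.5 * ((x - y) / ls) ** 2)

def solve(A, b):
    # Gaussian elimination with partial pivoting (enough for a tiny GP)
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

# the "expensive code": in reality only a handful of runs are affordable
code = lambda x: math.sin(2 * math.pi * x)
X = [0.0, 0.25, 0.5, 0.75, 1.0]
Y = [code(x) for x in X]

# fit the GP emulator: alpha = K^{-1} Y (tiny jitter keeps K well-conditioned)
K = [[k(a, b) + (1e-9 if a == b else 0.0) for b in X] for a in X]
alpha = solve(K, Y)
emulator = lambda x: sum(a * k(x, xi) for a, xi in zip(alpha, X))

# uncertainty analysis: push the input distribution through the cheap emulator
random.seed(0)
samples = [emulator(random.gauss(0.5, 0.1)) for _ in range(10000)]
mean = sum(samples) / len(samples)
```

The five calls to `code` stand in for the expensive runs; the 10,000 draws from the output distribution then cost only emulator evaluations, which is exactly the economy the Bayesian approach exploits over plain input sampling.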
Design optimization of hierarchically decomposed multilevel system under uncertainty
 Proceedings of the ASME 2004 Design Engineering Technical Conferences, Salt Lake City, Utah, 28 September–2 October, DETC2004/DAC57357
, 2004
Abstract

Cited by 16 (12 self)
This paper presents a methodology for design optimization of decomposed systems in the presence of uncertainties. We extend the analytical target cascading (ATC) formulation to probabilistic design by treating stochastic quantities as random variables and parameters and posing reliability-based design constraints. We model the propagation of uncertainty throughout the multilevel hierarchy of elements that comprise the decomposed system by using the advanced mean value (AMV) method to generate the required probability distributions of nonlinear responses. We utilize appropriate metamodeling techniques for simulation-based design problems. A simple yet illustrative hierarchical bilevel engine design problem is used to demonstrate the proposed methodology.
Stryk. Hardware-in-the-loop optimization of the walking speed of a humanoid robot
 In CLAWAR 2006: 9th International Conference on Climbing and Walking Robots
Abstract

Cited by 13 (3 self)
The development of optimized motions of humanoid robots that guarantee fast and stable walking is an important task, especially in the context of autonomous soccer-playing robots in RoboCup. We present a walking motion optimization approach for the humanoid robot prototype HR18, which is equipped with a low-dimensional parameterized walking trajectory generator, joint motor controllers and internal stabilization. The robot is included as hardware-in-the-loop to define a low-dimensional black-box optimization problem. In contrast to previously performed walking optimization approaches, we apply a sequential surrogate optimization approach using stochastic approximation of the underlying objective function and sequential quadratic programming to search for a fast and stable walking motion. This is done under the condition that only a small number of physical walking experiments should have to be carried out during the online optimization process. For the identified walking motion of the considered 55 cm tall humanoid robot we measured a forward walking speed of more than 30 cm/sec. With a modified version of the robot, even more than 40 cm/sec could be achieved in permanent operation.
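The sequential surrogate idea above, fit a cheap model to a handful of expensive trials and let the model propose the next trial, can be caricatured in one dimension. Everything in this sketch is a stand-in: the paper optimizes physical walking experiments on HR18 with stochastic approximation plus SQP, whereas this toy uses an invented noisy `walking_trial` score and an exact quadratic surrogate through the three best points.

```python
import random

# hypothetical stand-in for one physical walking trial: a noisy
# "negative speed" score for a single gait parameter (lower is better)
def walking_trial(p, rng):
    return (p - 0.62) ** 2 + rng.gauss(0.0, 1e-3)

def quad_fit(pts):
    # exact quadratic a*p^2 + b*p + c through three (p, f) points
    (x0, y0), (x1, y1), (x2, y2) = pts
    denom = (x0 - x1) * (x0 - x2) * (x1 - x2)
    a = (x2 * (y1 - y0) + x1 * (y0 - y2) + x0 * (y2 - y1)) / denom
    b = (x2 * x2 * (y0 - y1) + x1 * x1 * (y2 - y0) + x0 * x0 * (y1 - y2)) / denom
    return a, b

rng = random.Random(1)
pts = [(p, walking_trial(p, rng)) for p in (0.2, 0.5, 0.8)]
for _ in range(6):                       # only a few trials are affordable
    a, b = quad_fit(sorted(pts, key=lambda t: t[1])[:3])
    if a <= 0:                           # surrogate not convex: stop proposing
        break
    p_new = -b / (2 * a)                 # minimiser of the quadratic surrogate
    pts.append((p_new, walking_trial(p_new, rng)))
best = min(pts, key=lambda t: t[1])      # best gait parameter found
```

Each loop iteration costs exactly one "physical" trial, which mirrors the paper's constraint that online experiments on the robot are the scarce resource.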
A Data-Analytic Approach to Bayesian Global Optimization
, 1997
Abstract

Cited by 10 (1 self)
This paper deals with the unconstrained global optimization problem: minimize f(x), where x = (x_1, ..., x_k). This includes the class of problems with simple bound constraints a_i ≤ x_i ≤ b_i, since these problems can be transformed to unconstrained global optimization problems. Throughout we assume without loss of generality that the extremum of interest is a minimum.
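The transformation the abstract alludes to (turning simple bound constraints into an unconstrained problem) can be done, for example, with a periodic change of variables. The sin² map and the toy objective below are illustrative choices, not taken from the paper.

```python
import math

def to_bounded(t, a, b):
    # sin^2 maps the whole real line onto [0, 1], so x always obeys a <= x <= b
    return a + (b - a) * math.sin(t) ** 2

# example: minimise f(x) = (x - 2)^2 subject to 0 <= x <= 5
f = lambda x: (x - 2.0) ** 2
g = lambda t: f(to_bounded(t, 0.0, 5.0))   # unconstrained in t

# crude unconstrained minimisation: grid search over one period of the map
ts = [i * math.pi / 2000 for i in range(2001)]
t_best = min(ts, key=g)
x_best = to_bounded(t_best, 0.0, 5.0)
```

Any unconstrained global optimizer can replace the grid search; the point is only that the bounds are enforced by construction rather than by explicit constraints.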
Algorithmic construction of optimal symmetric Latin hypercube designs
 JOURNAL OF STATISTICAL PLANNING AND INFERENCE
, 2000
Abstract

Cited by 9 (0 self)
We propose symmetric Latin hypercubes for designs of computer experiments. The goal is to offer a compromise between computing effort and design optimality. The proposed class of designs has some advantages over the regular Latin hypercube design with respect to criteria such as entropy and the minimum intersite distance. An exchange algorithm is proposed for constructing optimal symmetric Latin hypercube designs. This algorithm is compared with two existing algorithms by Park (1994) and Morris and Mitchell (1995). Some examples, including a real case study in the automotive industry, are used to illustrate the performance of the new designs and the algorithms.
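A symmetric Latin hypercube pairs each row with its reflection through the design centre while keeping every column a permutation of the levels. The random construction below illustrates the design class and the minimum intersite distance criterion the abstract mentions; it is not the paper's exchange algorithm.

```python
import random
from itertools import combinations

def symmetric_lhd(n, d, rng):
    """Random symmetric Latin hypercube on levels 0..n-1 (n even):
    row i and row n-1-i are reflections of each other through the centre."""
    assert n % 2 == 0
    half = n // 2
    design = [[0] * d for _ in range(n)]
    for c in range(d):
        pairs = [(v, n - 1 - v) for v in range(half)]  # mirror-level pairs
        rng.shuffle(pairs)
        for i, (lo, hi) in enumerate(pairs):
            top = lo if rng.random() < 0.5 else hi
            design[i][c] = top
            design[n - 1 - i][c] = n - 1 - top         # mirrored partner row
    return design

def min_intersite_dist2(design):
    # smallest squared Euclidean distance between any two design points
    return min(sum((a - b) ** 2 for a, b in zip(p, q))
               for p, q in combinations(design, 2))

rng = random.Random(7)
D = symmetric_lhd(8, 3, rng)
# Latin hypercube property: each column is a permutation of 0..7
cols_ok = all(sorted(row[c] for row in D) == list(range(8)) for c in range(3))
```

An exchange algorithm of the kind the paper proposes would then swap levels within columns (preserving both properties) to increase `min_intersite_dist2` or an entropy criterion.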
Adaptive Response Surface Method: A Global Optimization Scheme for Computation-Intensive Design Problems
 JOURNAL OF ENGINEERING OPTIMIZATION
, 2001
Abstract

Cited by 6 (2 self)
For design problems involving computation-intensive analysis or simulation processes, approximation models are usually introduced to reduce computation time. Most approximation-based optimization methods make step-by-step improvements to the approximation model by adjusting the limits of the design variables. In this work, a new approximation-based optimization method for computation-intensive design problems, the adaptive response surface method (ARSM), is presented. The ARSM creates quadratic approximation models for the computation-intensive design objective function in a gradually reduced design space. The ARSM was designed to avoid being trapped by a local optimum and to identify the global design optimum with a modest number of objective function evaluations. Extensive tests of the ARSM as a global optimization scheme using benchmark problems, as well as an industrial design application of the method, are presented. Advantages and limitations of the approach are also discussed.
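The "quadratic model in a gradually reduced design space" loop can be caricatured in one dimension: fit a quadratic response surface to a few expensive runs, jump to its vertex, then shrink the bounds around that point and repeat. The objective, sample layout and shrink factor below are invented for illustration; the actual ARSM is more elaborate.

```python
import math

def expensive(x):
    # stand-in for a costly simulation (multimodal over the initial bounds)
    return math.sin(x) + 0.1 * x * x

lo, hi = -4.0, 4.0
best = lo
for _ in range(8):
    h = (hi - lo) / 2
    x0, x1, x2 = lo, lo + h, hi                 # three runs per iteration
    y0, y1, y2 = expensive(x0), expensive(x1), expensive(x2)
    curv = y0 - 2 * y1 + y2                     # curvature of the fitted quadratic
    if curv > 0:                                # surface is convex: go to its vertex
        best = x1 - h * (y2 - y0) / (2 * curv)
    else:                                       # otherwise keep the best sampled point
        best = min((x0, x1, x2), key=expensive)
    span = (hi - lo) * 0.5                      # shrink the design space around 'best'
    lo, hi = best - span / 2, best + span / 2
```

Each iteration costs only three objective evaluations, which is the "modest number of function evaluations" the method is designed around; the shrinking bounds are what distinguish ARSM from a single global response surface.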
Choosing the Sample Size of a Computer Experiment: A Practical Guide
, 2008
Abstract

Cited by 6 (0 self)
We produce reasons and evidence supporting the informal rule that the number of runs for an effective initial computer experiment should be about 10 times the input dimension. Our arguments quantify two key characteristics of computer codes that affect the sample size required for a desired level of accuracy when approximating the code via a Gaussian process (GP). The first characteristic is the total sensitivity of a code output variable to all input variables. The second corresponds to the way this total sensitivity is distributed across the input variables, specifically the possible presence of a few prominent input factors and many impotent ones (effect sparsity). Both measures relate directly to the correlation structure in the GP approximation of the code. In this way, the article moves towards a more formal treatment of sample size for a computer experiment. The evidence supporting these arguments stems primarily from a simulation study and from specific codes modeling climate and ligand activation of G-protein.
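The 10d rule is easy to operationalise: with d inputs, lay down n = 10d runs, for instance as a Latin hypercube. The plain random-stratification construction below is a common baseline for such an initial design, not the article's own procedure.

```python
import random

def initial_design(d, rng, runs_per_dim=10):
    """Latin hypercube on [0,1]^d with n = 10 * d runs (the '10d' rule)."""
    n = runs_per_dim * d
    cols = []
    for _ in range(d):
        perm = list(range(n))
        rng.shuffle(perm)
        # one point per stratum: cell j jittered uniformly inside [j/n, (j+1)/n)
        cols.append([(p + rng.random()) / n for p in perm])
    return [list(point) for point in zip(*cols)]

rng = random.Random(0)
X = initial_design(3, rng)   # 3 inputs -> 30 runs
```

Each input dimension is stratified into n equal cells with exactly one run per cell, so even the marginal coverage scales with the 10-runs-per-dimension budget the article argues for.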