Computer Experiments
, 1996
Abstract

Cited by 84 (5 self)
Introduction Deterministic computer simulations of physical phenomena are becoming widely used in science and engineering. Computers are used to describe the flow of air over an airplane wing, combustion of gases in a flame, behavior of a metal structure under stress, safety of a nuclear reactor, and so on. Some of the most widely used computer models, and the ones that lead us to work in this area, arise in the design of the semiconductors used in the computers themselves. A process simulator starts with a data structure representing an unprocessed piece of silicon and simulates the steps such as oxidation, etching and ion injection that produce a semiconductor device such as a transistor. A device simulator takes a description of such a device and simulates the flow of current through it under varying conditions to determine properties of the device such as its switching speed and the critical voltage at which it switches. A circuit simulator takes a list of devices and the
Factorial Sampling Plans for Preliminary Computational Experiments
 Technometrics
, 1991
Abstract

Cited by 70 (0 self)
A computational model is a representation of some physical or other system of interest, first expressed mathematically and then implemented in the form of a computer program; it may be viewed as a function of inputs that, when evaluated, produces outputs. Motivation for this article comes from computational models that are deterministic, complicated enough to make classical mathematical analysis impractical, and that have a moderate-to-large number of inputs. The problem of designing computational experiments to determine which inputs have important effects on an output is considered. The proposed experimental plans are composed of individually randomized one-factor-at-a-time designs, and data analysis is based on the resulting random sample of observed elementary effects, those changes in an output due solely to changes in a particular input. Advantages of this approach include a lack of reliance on assumptions of relative sparsity of important inputs, monotonicity of outputs with respect to inputs, or adequacy of a low-order polynomial as an approximation to the computational model.
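The randomized one-factor-at-a-time design described in this abstract can be sketched in a few lines. The toy model, the choice of step size, and all parameter values below are illustrative, not taken from the paper:

```python
import numpy as np

def elementary_effects(f, k, r, delta=0.5, seed=0):
    """Estimate elementary effects for a k-input model f using r
    randomized one-factor-at-a-time trajectories on [0, 1]^k.
    Returns an (r, k) array: one observed effect per input per trajectory."""
    rng = np.random.default_rng(seed)
    effects = np.empty((r, k))
    for t in range(r):
        x = rng.uniform(0.0, 1.0 - delta, size=k)  # random base point
        order = rng.permutation(k)                 # random input order
        y = f(x)
        for j in order:
            x_new = x.copy()
            x_new[j] += delta                      # perturb one input only
            y_new = f(x_new)
            effects[t, j] = (y_new - y) / delta    # elementary effect of input j
            x, y = x_new, y_new                    # walk on from the new point
    return effects

# Toy deterministic "code": input 0 dominates, input 2 is inert.
model = lambda x: 10 * x[0] + x[1] ** 2
ee = elementary_effects(model, k=3, r=20)
mu_star = np.abs(ee).mean(axis=0)  # large mu_star flags important inputs
```

Averaging the absolute effects per input (often called mu-star in later sensitivity-analysis literature) then separates important inputs from inert ones without assuming monotonicity or a low-order polynomial form.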
Latin hypercube sampling and the propagation of uncertainty in analyses of complex systems
 Reliability Engineering and System Safety 81
, 2003
Flexibility and Efficiency Enhancements for Constrained Global Design Optimization with Kriging Approximations
, 2002
Uncertainty Analysis and other Inference Tools for Complex Computer Codes
, 1998
Abstract

Cited by 24 (7 self)
This paper builds on work by Haylock and O'Hagan which developed a Bayesian approach to uncertainty analysis. The generic problem is to make posterior inference about the output of a complex computer code, and the specific problem of uncertainty analysis is to make inference when the "true" values of the input parameters are unknown. Given the distribution of the input parameters (which is often a subjective distribution derived from expert opinion), we wish to make inference about the implied distribution of the output. The computer code is sufficiently complex that the time to compute the output for any input configuration is substantial. The Bayesian approach was shown to improve dramatically on the classical approach, which is based on drawing a sample of values of the input parameters and thereby obtaining a sample from the output distribution. We review the basic Bayesian approach to the generic problem of inference for complex computer codes, and present some recent advances: inference about the distribution and quantile functions of the uncertainty distribution, calibration of models, and the use of runs of the computer code at different levels of complexity to make efficient use of the quicker, cruder versions of the code. The emphasis is on practical applications. Keywords: COMPUTATIONAL EXPERIMENT; SIMULATION; GAUSSIAN PROCESS; SENSITIVITY ANALYSIS; UNCERTAINTY DISTRIBUTION; CALIBRATION; MULTILEVEL CODES; MODEL INADEQUACY.
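The classical sampling-based approach that this abstract contrasts with the Bayesian one can be sketched as follows. The toy `slow_code` function and the (assumed Gaussian) input distributions are invented stand-ins for an expensive simulator and expert-elicited distributions:

```python
import numpy as np

# Classical uncertainty analysis: draw inputs from their distribution,
# run the code at each draw, and summarize the implied output distribution.
rng = np.random.default_rng(1)

def slow_code(x1, x2):
    """Toy deterministic 'computer code' (instantaneous here; the real
    setting assumes each evaluation is expensive)."""
    return np.exp(-x1) * np.sin(x2)

n = 5000
x1 = rng.normal(1.0, 0.2, n)   # subjective distribution for input 1
x2 = rng.normal(0.5, 0.1, n)   # subjective distribution for input 2
y = slow_code(x1, x2)          # sample from the output distribution

mean = y.mean()
q05, q95 = np.quantile(y, [0.05, 0.95])
```

The paper's point is that when each run is costly, such a large sample is unaffordable, and a Bayesian emulator extracts far more inference from far fewer runs.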
Design optimization of hierarchically decomposed multilevel system under uncertainty
 Proceedings of the ASME 2004 Design Engineering Technical Conferences, Salt Lake City, Utah, 28 September–2 October, DETC2004/DAC57357
, 2004
Abstract

Cited by 21 (12 self)
This paper presents a methodology for design optimization of decomposed systems in the presence of uncertainties. We extend the analytical target cascading (ATC) formulation to probabilistic design by treating stochastic quantities as random variables and parameters and posing reliability-based design constraints. We model the propagation of uncertainty throughout the multilevel hierarchy of elements that comprise the decomposed system by using the advanced mean value (AMV) method to generate the required probability distributions of nonlinear responses. We utilize appropriate metamodeling techniques for simulation-based design problems. A simple yet illustrative hierarchical bilevel engine design problem is used to demonstrate the proposed methodology.
On the Use of Statistics in Design and the Implications for Deterministic Computer Experiments
 in Design Theory and Methodology, DTM '97
, 1997
Abstract

Cited by 17 (2 self)
Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design, using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: it is the concept of metamodeling, whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, our analysis codes always seem to keep pace, so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating codes across different domains. In this paper we first review metamodeling techniques that encompass the Design of Experiments, Response Surface Methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
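As a minimal illustration of the metamodeling idea in this abstract, the sketch below fits a quadratic response surface to a toy "analysis code" by least squares. The function, design points, and names are assumptions for illustration; a real study would use a designed experiment plus validation runs:

```python
import numpy as np

def analysis_code(x):
    """Stand-in for an expensive deterministic analysis code."""
    return (x - 0.3) ** 2 + 1.0

# Evaluate the code at a small set of design points.
X = np.linspace(0.0, 1.0, 9)
Y = analysis_code(X)

# Fit a quadratic response surface y ~ c0 + c1*x + c2*x^2 by least squares.
A = np.column_stack([np.ones_like(X), X, X ** 2])
coef, *_ = np.linalg.lstsq(A, Y, rcond=None)

def metamodel(x):
    """Cheap surrogate: orders of magnitude faster than the real code."""
    return coef[0] + coef[1] * x + coef[2] * x ** 2
```

Once fitted, the surrogate can be handed to an optimizer or coupled to codes in other domains in place of the expensive analysis, which is the paper's central point.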
Hardware-in-the-loop optimization of the walking speed of a humanoid robot
 in CLAWAR 2006: 9th International Conference on Climbing and Walking Robots
, 2006
Abstract

Cited by 13 (3 self)
The development of optimized motions of humanoid robots that guarantee fast and also stable walking is an important task, especially in the context of autonomous soccer-playing robots in RoboCup. We present a walking motion optimization approach for the humanoid robot prototype HR18, which is equipped with a low-dimensional parameterized walking trajectory generator, joint motor controllers and internal stabilization. The robot is included as hardware-in-the-loop to define a low-dimensional black-box optimization problem. In contrast to previously performed walking optimization approaches, we apply a sequential surrogate optimization approach using stochastic approximation of the underlying objective function and sequential quadratic programming to search for a fast and stable walking motion. This is done under the condition that only a small number of physical walking experiments should have to be carried out during the online optimization process. For the identified walking motion of the considered 55 cm tall humanoid robot we measured a forward walking speed of more than 30 cm/sec. With a modified version of the robot, even more than 40 cm/sec could be achieved in permanent operation.
Algorithmic construction of optimal symmetric Latin hypercube designs
 JOURNAL OF STATISTICAL PLANNING AND INFERENCES
, 2000
Abstract

Cited by 13 (1 self)
We propose symmetric Latin hypercubes for designs of computer experiments. The goal is to offer a compromise between computing effort and design optimality. The proposed class of designs has some advantages over the regular Latin hypercube design with respect to criteria such as entropy and the minimum intersite distance. An exchange algorithm is proposed for constructing optimal symmetric Latin hypercube designs. This algorithm is compared with two existing algorithms by Park (1994) and Morris and Mitchell (1995). Some examples, including a real case study in the automotive industry, are used to illustrate the performance of the new designs and the algorithms.
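A symmetric Latin hypercube can be constructed directly; the sketch below is a naive random construction showing the defining reflection property, not the exchange algorithm the paper proposes for optimizing such designs:

```python
import numpy as np

def symmetric_lhd(n, k, rng):
    """Random symmetric Latin hypercube design: n runs (n even), k factors,
    levels 1..n. For every run x, the reflected run n + 1 - x is also in
    the design, and each column is a permutation of 1..n."""
    assert n % 2 == 0
    m = n // 2
    D = np.empty((n, k), dtype=int)
    for j in range(k):
        # For each mirror pair (l, n+1-l), send one member to the top half...
        picks = np.where(rng.random(m) < 0.5,
                         np.arange(1, m + 1),           # level l
                         n - np.arange(1, m + 1) + 1)   # level n+1-l
        rng.shuffle(picks)              # ...then order the top half randomly.
        D[:m, j] = picks
        D[m:, j] = n + 1 - picks        # bottom half is the reflection
    return D

rng = np.random.default_rng(3)
D = symmetric_lhd(8, 3, rng)
# Each column of D is a permutation of 1..8, and row m+i mirrors row i.
```

The symmetry halves the effective search space, which is one reason the paper's exchange algorithm for optimizing entropy or minimum intersite distance becomes cheaper on this class of designs.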
A Data-Analytic Approach to Bayesian Global Optimization
, 1997
Abstract

Cited by 11 (1 self)
This paper deals with the unconstrained global optimization problem: minimize f(x), where x = (x_1, ..., x_k). This includes the class of problems with simple constraints like a_i <= x_i <= b_i, since these problems can be transformed to unconstrained global optimization problems. Throughout we assume without loss of generality that the extremum of interest is a minimum.
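One standard way to realize the transformation mentioned in this abstract is a sigmoid change of variables, which maps an unconstrained variable onto the box (a, b). Both the transform and the crude grid-search "optimizer" below are illustrative choices, not the paper's method:

```python
import numpy as np

def to_box(y, a, b):
    """Map unconstrained y in R to x in the open interval (a, b)."""
    return a + (b - a) / (1.0 + np.exp(-y))

def f(x):
    """Box-constrained objective on [0, 5]; minimizer at x = 2."""
    return (x - 2.0) ** 2

# Composite objective g(y) = f(to_box(y)) is unconstrained in y, so any
# unconstrained global optimizer applies. Grid search keeps the sketch short.
g = lambda y: f(to_box(y, 0.0, 5.0))
ys = np.linspace(-10.0, 10.0, 10001)
y_best = ys[np.argmin(g(ys))]
x_best = to_box(y_best, 0.0, 5.0)   # recovered minimizer, close to 2.0
```

The recovered x automatically satisfies the bounds, since the sigmoid never leaves (a, b); the price is that the bounds themselves are only approached asymptotically.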