Results 1–10 of 17
Computer Experiments
, 1996
Abstract

Cited by 67 (5 self)
Introduction Deterministic computer simulations of physical phenomena are becoming widely used in science and engineering. Computers are used to describe the flow of air over an airplane wing, combustion of gases in a flame, behavior of a metal structure under stress, safety of a nuclear reactor, and so on. Some of the most widely used computer models, and the ones that lead us to work in this area, arise in the design of the semiconductors used in the computers themselves. A process simulator starts with a data structure representing an unprocessed piece of silicon and simulates the steps such as oxidation, etching and ion injection that produce a semiconductor device such as a transistor. A device simulator takes a description of such a device and simulates the flow of current through it under varying conditions to determine properties of the device such as its switching speed and the critical voltage at which it switches. A circuit simulator takes a list of devices and the
Comparison of response surface and Kriging models for multidisciplinary design optimization
 In: Seventh AIAA/USAF/NASA/ISSMO symposium on multidisciplinary analysis and optimization, AIAA
, 1998
Abstract

Cited by 13 (0 self)
In this paper, we compare and contrast the use of second-order response surface models and kriging models for approximating nonrandom, deterministic computer analyses. After reviewing the response surface method for constructing polynomial approximations, kriging is presented as an alternative approximation method for the design and analysis of computer experiments. Both methods are applied to the multidisciplinary design of an aerospike nozzle, which involves a computational fluid dynamics model and a finite-element model. Error analysis of the response surface and kriging models is performed along with a graphical comparison of the approximations, and four optimization problems are formulated and solved using both sets of approximation
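As a rough illustration of the response-surface step reviewed above, a second-order polynomial approximation can be fit by ordinary least squares. This is a minimal sketch with a made-up quadratic test function, not the aerospike-nozzle CFD or finite-element models from the paper:

```python
import numpy as np

# Fit y ~ b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
# to sampled outputs of a deterministic "computer analysis".
# The analysis function here is hypothetical, for illustration only.

def quadratic_design_matrix(X):
    """Columns: 1, x1, x2, x1^2, x2^2, x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1*x2])

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(30, 2))            # 30 sampled design points
y = 1 + 2*X[:, 0] - X[:, 1] + 0.5*X[:, 0]**2    # stand-in simulator output

beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
print(np.round(beta, 3))  # recovers the quadratic's coefficients
```

Kriging would instead interpolate the sample points exactly through a spatial correlation model; general-purpose implementations exist, e.g. scikit-learn's `GaussianProcessRegressor`.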
Algorithmic construction of optimal symmetric Latin hypercube designs
 Journal of Statistical Planning and Inference
, 2000
Abstract

Cited by 9 (0 self)
We propose symmetric Latin hypercubes for designs of computer experiments. The goal is to offer a compromise between computing effort and design optimality. The proposed class of designs has some advantages over the regular Latin hypercube design with respect to criteria such as entropy and the minimum intersite distance. An exchange algorithm is proposed for constructing optimal symmetric Latin hypercube designs. This algorithm is compared with two existing algorithms by Park (1994) and Morris and Mitchell (1995). Some examples, including a real case study in the automotive industry, are used to illustrate the performance of the new designs and the algorithms.
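The symmetry constraint can be made concrete with a small sketch: a symmetric Latin hypercube in n runs contains, for every design point x, the mirrored point n + 1 - x. The construction below is a random generator of such designs (assumptions mine; it is not the paper's exchange algorithm, which would further optimize criteria like minimum intersite distance):

```python
import numpy as np

def symmetric_lhd(n, k, rng):
    """Random symmetric Latin hypercube: n runs, k factors, levels 1..n.
    Row i and row n-1-i always sum to n+1 in every column.
    This sketch handles even n only."""
    assert n % 2 == 0, "sketch handles even n only"
    half = n // 2
    D = np.empty((n, k), dtype=int)
    for j in range(k):
        # for each level pair {p, n+1-p}, put one member in the top
        # half of the column, then mirror to fill the bottom half
        pairs = rng.permutation(half) + 1        # pair indices 1..n/2
        pick = rng.integers(0, 2, size=half)     # low or high of each pair
        top = np.where(pick == 0, pairs, n + 1 - pairs)
        D[:half, j] = top
        D[half:, j] = (n + 1 - top)[::-1]        # mirrored rows
    return D

rng = np.random.default_rng(1)
D = symmetric_lhd(8, 3, rng)
# each column is a permutation of 1..8, and row i mirrors row n-1-i
```

An exchange algorithm of the kind the abstract describes would start from such a design and swap level pairs within columns, accepting swaps that improve the chosen optimality criterion while preserving the symmetry.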
Handling Uncertainty in Analysis of Robust Design Experiments
 Journal of Quality Technology
, 1996
Abstract

Cited by 4 (0 self)
In the analysis of robust design experiments, a model is typically fit to experimental data, and then used to select levels of control variables that desensitize the response to uncontrollable variation. Usually, model uncertainty (and sometimes parameter uncertainty) is not formally accounted for in the optimization process. This can lead to unrealistic improvements, and perhaps even suboptimal performance. This paper considers the use of Bayesian methods in the fitting of models and their subsequent optimization. Through the use of these methods, reliable assessments of uncertainty may be incorporated into the analysis of such data.

Key Words: Bayesian Methods, Parameter and Model Uncertainty, Performance Measure, Model Selection, Response Model

1 Introduction

The purpose of a robust design experiment is to adjust the parameters of a product or process so that it is insensitive to uncontrollable variation. The performance of products and processes is sensitive to variab...
Taguchi and Robust Optimization
, 1996
Abstract

Cited by 3 (0 self)
This report is intended to facilitate dialogue between engineers and optimizers about the efficiency of Taguchi methods for robust design, especially in the context of design by computer simulation. Three approaches to robust design are described:

1. Robust optimization, i.e. specifying an objective function f and then minimizing a smoothed (robust) version of f by the methods of numerical optimization.

2. Taguchi's method of specifying the objective function as a certain signal-to-noise ratio, to be optimized by designing, performing and analyzing a single massive experiment.

3. Specifying an expected loss function f and then minimizing a cheap-to-compute surrogate objective function f, to be obtained by designing and performing a single massive experiment.

Some relations between these approaches are noted and it is emphasized that only the first approach is capable of iteratively progressing toward a solution.

Adjunct Associate Professor, Department of Computational & Applied Mat...
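The second approach turns on a signal-to-noise ratio as the objective. For concreteness, here are the standard textbook Taguchi SN formulas (these are general-knowledge definitions, not formulas taken from this report); `y` holds replicate responses observed at one control-factor setting:

```python
import numpy as np

# Standard Taguchi signal-to-noise ratios (in decibels).
# Larger SN is better in all three cases.

def sn_smaller_the_better(y):
    return -10 * np.log10(np.mean(np.square(y)))

def sn_larger_the_better(y):
    return -10 * np.log10(np.mean(1.0 / np.square(y)))

def sn_nominal_the_best(y):
    ybar, s2 = np.mean(y), np.var(y, ddof=1)
    return 10 * np.log10(ybar**2 / s2)

y = np.array([9.8, 10.1, 10.0, 9.9])   # replicates at one setting
print(round(sn_nominal_the_best(y), 2))
```

Maximizing the nominal-the-best ratio rewards settings whose mean is large relative to the spread of the replicates, which is the sense in which Taguchi's single-experiment analysis targets robustness.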
An Efficient Robust Concept Exploration Method and Sequential Exploratory Experimental Design
, 2004
The application of multiobjective robust design methods in ship design
 Proceedings ICCAS’97, 10th International Conference on Computer Applications in Shipbuilding
, 1999
Abstract

Cited by 1 (1 self)
When designing large complex vessels, the evaluation of a particular design can be both complicated and time consuming. Designers often resort to the use of concept design models enabling both a reduction in complexity and time for evaluation. Various optimisation methods are then typically used to explore the design space facilitating the selection of optimum or near optimum designs. It is now possible to incorporate considerations of seakeeping, stability and costs at the earliest stage in the ship design process. However, to ensure that reliable results are obtained, the models used are generally complex and computationally expensive. Methods have been developed which avoid the necessity to carry out an exhaustive search of the complete design space. One such method is described which is concerned with the application of the theory of Design Of Experiments (DOE) enabling the design space to be efficiently explored. The objective of the DOE stage is to produce response surfaces which can then be used by an optimisation module to search the design space. It is assumed that the concept exploration tool, whilst being a simplification of the design problem, is still sufficiently complicated to enable reliable evaluations of a particular design concept. The response surface is used as a representation of the concept exploration tool, and by its nature can be used to rapidly evaluate a design concept hence reducing
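The two-stage workflow the abstract describes — a small designed experiment on an expensive concept tool, then an optimiser searching a cheap response surface — can be sketched end to end. The functions below are toy stand-ins under my own assumptions, not the ship-design models:

```python
import numpy as np

def concept_tool(x1, x2):
    # stand-in for an expensive seakeeping/stability/cost evaluation
    return (x1 - 0.3)**2 + 2*(x2 + 0.5)**2

# stage 1: small gridded DOE sample of the expensive tool
g = np.linspace(-1, 1, 5)
X = np.array([(a, b) for a in g for b in g])
y = concept_tool(X[:, 0], X[:, 1])

# stage 2: fit a quadratic response surface to the DOE results
M = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0]**2, X[:, 1]**2, X[:, 0]*X[:, 1]])
beta, *_ = np.linalg.lstsq(M, y, rcond=None)

def surface(x1, x2):
    return beta @ [1, x1, x2, x1**2, x2**2, x1*x2]

# stage 3: rapid search over the cheap surrogate, not the tool
grid = np.linspace(-1, 1, 201)
best = min(((surface(a, b), a, b) for a in grid for b in grid))
print(np.round(best[1:], 2))   # near the true optimum (0.3, -0.5)
```

Because the surrogate is cheap, the optimiser can afford dense searches that would be prohibitive against the concept exploration tool itself; the tool is only consulted at the DOE points.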
Robust Parameter Design with Uncontrolled Noise Variables
Abstract

Cited by 1 (0 self)
this paper, we consider situations where the noise variables can vary but are observable. (In the computer/communication network applications, for example, it is possible to measure the various load conditions and so adjust for the changes in the noise variables.) We propose a general data analysis strategy which is based on modeling the responses directly. Our approach involves treating the noise variables as covariates and modeling both the location parameters and the (regression) coefficients as functions of the design factors. This allows us to determine the interactions between the design factors and the observed noise variables and to exploit them to reduce the effect of this source of variability. Variability due to unobserved noise variables can be identified by analyzing the squared residuals from the fitted model. The approach presented here is also applicable to experimental situations with covariates, where one has to remove the effects of these nuisance variables before identifying the important location and dispersion effects.

The paper is organized as follows. In Section 2, we use a real application on thermal design of cabinets for telecommunications switching equipment to motivate the problem and the issues. Section 3 develops the underlying concepts and models for a single observed noise variable. The proposed data analysis strategy is outlined in Section 4 and is illustrated by applying it to the thermal design experiment. Section 5 deals with several generalizations, including the case of multiple noise variables. The direct modeling of responses as a function of design factors and noise variables has also been considered by Welch et al. (1990), Shoemaker et al. (1991), and Lucas (1990). Our approach simplifies to the formulations discussed by these ...
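The noise-as-covariate idea can be illustrated with synthetic data (a sketch under my own assumptions, not the telecommunications-cabinet experiment): let both the intercept and the coefficient of the observed noise variable z depend on the design factor x, so y = (a0 + a1·x) + (b0 + b1·x)·z + error. A nonzero b1 is exactly the design-by-noise interaction the strategy exploits, since the effect of z vanishes where b0 + b1·x = 0:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
x = rng.choice([-1.0, 1.0], size=n)     # design factor (two levels)
z = rng.normal(size=n)                   # observed noise variable
y = 5 + 0.5*x + (1.0 - 0.8*x)*z + 0.1*rng.normal(size=n)

# regress y on x, z, and the design-by-noise interaction x*z
M = np.column_stack([np.ones(n), x, z, x*z])
coef, *_ = np.linalg.lstsq(M, y, rcond=None)
a0, a1, b0, b1 = coef
print(np.round(coef, 2))                 # ≈ [5, 0.5, 1, -0.8]

# design setting that damps the effect of z
x_robust = -b0 / b1
```

Squared residuals from this fit could then be analyzed, as the abstract notes, to detect dispersion effects from unobserved noise variables.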
Optimal Compound Orthogonal Arrays and Single Arrays for Robust Parameter Design Experiments
Abstract
Compound orthogonal arrays (COAs) and single arrays are alternatives to the inner-outer arrays advocated by Taguchi for robust parameter design experiments. A criterion based on the word-type patterns and strengths of COAs is proposed to select optimal COAs. Single arrays are classified into prodigal single arrays (PSAs) and economical single arrays (ESAs) according to their relative estimation capacities, and various optimality criteria, again based on the word-type patterns, are proposed for selecting optimal single arrays. Useful optimal COAs, PSAs and ESAs are constructed and tabulated as convenient references for experimenters in practice.

KEY WORDS: Robust parameter design, compound orthogonal array, single array
Modelling Conditional Variance Heterogeneity in Parameter Design
, 1996
Abstract
This report is an addendum to the Engel and Huele (1996b) Technometrics paper. It contains a theorem that relates ordinary and weighted least squares estimation in the context of experimental design. It also gives some more background information on the simulation results from that paper, and the S-Plus code of the simulation is explained.

Keywords: dispersion effect, iteratively reweighted least squares, simulation, Taguchi methods, variance function estimation.

A. Freek Huele is a consultant at IBIS UvA. He is a member of ASQC. Jan Engel is senior consultant at CQM bv.

1 Introduction

The aim of this report is to give some more details about the Engel and Huele (1996b) approach to modelling the residual variance after fitting a response model on experimental data. In this section, first a short review is given of dispersion modelling in general, and second the contents of this report are described. In statistical quality control, there is currently much interest in ...
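The keywords mention iteratively reweighted least squares for variance function estimation. One iteration of that generic idea (my own sketch and notation, not the report's theorem or its S-Plus code) looks like this: fit the mean model by OLS, model the log squared residuals to estimate a variance function, then refit the mean by WLS with the estimated inverse variances as weights:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 400
x = rng.uniform(-1, 1, n)
sigma = np.exp(0.2 + 0.9*x)             # spread increases with x
y = 1 + 2*x + sigma*rng.normal(size=n)  # mean model: 1 + 2x

X = np.column_stack([np.ones(n), x])

# step 1: OLS fit of the mean model
b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
r2 = (y - X @ b_ols)**2

# step 2: log-linear model for the variance (dispersion effects);
# the additive constant bias of log r^2 does not affect the weights
# up to scale, so the WLS fit below is unchanged by it
g, *_ = np.linalg.lstsq(X, np.log(r2 + 1e-12), rcond=None)
w = 1.0 / np.exp(X @ g)                 # estimated inverse variances

# step 3: WLS refit via the square-root-of-weights transform
sw = np.sqrt(w)
b_wls, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
```

Iterating steps 2 and 3 to convergence gives the iteratively reweighted scheme; under homoscedasticity or a balanced orthogonal design the OLS and WLS mean estimates coincide, which is the kind of relation the report's theorem concerns.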