Results 1–10 of 13
Numerical optimization using computer experiments
, 1997
Abstract
Cited by 31 (9 self)
Engineering design optimization often gives rise to problems in which expensive objective functions are minimized by derivative-free methods. We propose a method for solving such problems that synthesizes ideas from the numerical optimization and computer experiment literatures. Our approach relies on kriging known function values to construct a sequence of surrogate models of the objective function that are used to guide a grid search for a minimizer. Results from numerical experiments on a standard test problem are presented.
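The kriging-based surrogate idea can be illustrated with a minimal sketch (not the authors' implementation): interpolate the known function values with a Gaussian correlation model, and use the resulting predictor to guide the search. The zero-mean simplification, the correlation parameter `theta`, and the tiny nugget term are assumptions made here for illustration.

```python
import numpy as np

def kriging_fit(X, y, theta=10.0):
    """Fit a simple zero-mean kriging interpolator with Gaussian correlation.

    X : (m, d) array of sampled design sites
    y : (m,) array of known (expensive) objective values
    Returns a predictor usable as a cheap surrogate for the objective.
    """
    # Gaussian correlation matrix between all pairs of sample sites
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    R = np.exp(-theta * d2) + 1e-10 * np.eye(len(X))  # nugget for conditioning
    w = np.linalg.solve(R, y)                          # interpolation weights

    def predict(x):
        # correlation of the query point with each sample site
        r = np.exp(-theta * ((X - x) ** 2).sum(-1))
        return r @ w

    return predict
```

The predictor reproduces the sampled values exactly (up to the nugget), so a grid search over it never contradicts the data already paid for.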
Adaptive Experimental Design For Construction Of Response Surface Approximations
, 2001
Abstract
Cited by 22 (10 self)
Sequential Approximate Optimization (SAO) is a class of methods available for the multidisciplinary design optimization (MDO) of complex systems that are composed of several coupled disciplines. One of the approaches used for SAO is based on a quadratic response surface approximation, where zero- and first-order information is required. In these methods, designers must generate and query a database of order O(n²) in order to compute the second-order terms of the quadratic response surface approximation. As the number of design variables grows, the computational cost of generating the required database becomes a concern. In this paper, we present a new approach in which we require just O(n) parameters for constructing a second-order approximation. This is accomplished by transforming the matrix of second-order terms into canonical form. The method periodically requires an order-O(n²) update of the second-order approximation to maintain accuracy. Results show ...
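The quadratic growth in data requirements is easy to quantify: a full quadratic model in n variables has 1 + n + n(n+1)/2 coefficients. The canonical-form count sketched below (constant, linear terms, and n principal curvatures) is an illustrative assumption standing in for the paper's O(n) parameterization, not its exact formula.

```python
def full_quadratic_terms(n):
    # 1 constant + n linear + n(n+1)/2 second-order coefficients
    return 1 + n + n * (n + 1) // 2

def canonical_terms(n):
    # Illustrative O(n) count for a diagonalized (canonical) Hessian:
    # 1 constant + n linear + n principal-curvature terms
    return 1 + 2 * n

for n in (10, 50, 100):
    print(n, full_quadratic_terms(n), canonical_terms(n))
# n = 100: 5151 coefficients for the full model vs 201 in canonical form
```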
Reduced Sampling For Construction Of Quadratic Response Surface Approximations Using Adaptive Experimental Design
, 2002
Abstract
Cited by 12 (5 self)
Applying nonlinear optimization strategies directly to complex multidisciplinary systems can be prohibitive when the complexity of the simulation codes is large. Increasingly, response surface approximations (RSAs), and specifically quadratic approximations, are being integrated with nonlinear optimizers in order to reduce the CPU time required for the optimization of complex multidisciplinary systems. RSAs provide a computationally inexpensive lower-fidelity representation of the system performance. The curse of dimensionality is a major drawback in the implementation of these approximations, as the amount of required data grows quadratically with the number of design variables.
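A least-squares quadratic RSA makes the data requirement concrete: the design matrix below has 1 + n + n(n+1)/2 columns, so at least that many simulation results are needed before the fit is even determined. This is a generic sketch of quadratic response-surface fitting, not the paper's particular procedure.

```python
import numpy as np

def fit_quadratic_rsa(X, y):
    """Fit y ~ c + sum_i b_i x_i + sum_{i<=j} a_ij x_i x_j by least squares.

    X : (m, n) array of design points, y : (m,) array of responses.
    """
    m, n = X.shape
    cols = [np.ones(m)]                         # constant term
    cols += [X[:, i] for i in range(n)]         # linear terms
    cols += [X[:, i] * X[:, j]                  # second-order terms
             for i in range(n) for j in range(i, n)]
    Phi = np.column_stack(cols)                 # m x (1 + n + n(n+1)/2)
    coeffs, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return coeffs
```

For n = 2 the model already needs 6 coefficients; for n = 100 it needs 5151, which is the quadratic data growth the abstract warns about.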
Achieving robust design from computer simulations. Quality Technology and Quantitative Management
, 2006
Abstract
Cited by 9 (0 self)
Computer simulations are widely used during product development. In particular, computer experiments are often conducted in order to optimize both product and process performance while respecting constraints that may be imposed. Several methods for achieving robust design in this context are described and compared with the aid of a simple example problem. The methods presented compare classical as well as modern approaches and introduce the idea of a ‘stochastic response’ to aid the search for robust solutions. Emphasis is placed on the efficiency of each method with respect to computational cost and the ability to formulate objectives that encapsulate the notion of robustness.
Taguchi and Robust Optimization
, 1996
Abstract
Cited by 5 (0 self)
This report is intended to facilitate dialogue between engineers and optimizers about the efficiency of Taguchi methods for robust design, especially in the context of design by computer simulation. Three approaches to robust design are described: 1. Robust optimization, i.e. specifying an objective function f and then minimizing a smoothed (robust) version of f by the methods of numerical optimization. 2. Taguchi's method of specifying the objective function as a certain signal-to-noise ratio, to be optimized by designing, performing, and analyzing a single massive experiment. 3. Specifying an expected loss function f and then minimizing a cheap-to-compute surrogate objective function, to be obtained by designing and performing a single massive experiment. Some relations between these approaches are noted, and it is emphasized that only the first approach is capable of iteratively progressing toward a solution.
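Approach 1 (minimizing a smoothed version of f) can be sketched with Monte Carlo averaging over a perturbation distribution; the Gaussian noise model and the sample count below are assumptions made for illustration, not the report's formulation.

```python
import numpy as np

def smoothed(f, x, sigma=0.1, n_samples=2000, rng=None):
    """Monte Carlo estimate of the robust objective E[f(x + delta)],
    with delta ~ N(0, sigma^2 I) modeling implementation error."""
    rng = rng or np.random.default_rng(0)
    deltas = sigma * rng.standard_normal((n_samples, len(x)))
    return np.mean([f(x + d) for d in deltas])
```

Minimizing `smoothed(f, ·)` instead of f favors broad minima over sharp ones, which is the sense in which the resulting design is robust.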
The Cost of Numerical Integration in Statistical Decisiontheoretic Methods for Robust Design Optimization
Abstract
Cited by 1 (1 self)
The Bayes principle from statistical decision theory provides a conceptual framework for quantifying uncertainties that arise in robust design optimization. The difficulty with exploiting this framework is computational, as it leads to objective and constraint functions that must be evaluated by numerical integration. Using a prototypical robust design optimization problem, this study explores the computational cost of multidimensional integration (computing expectation) and its interplay with optimization algorithms. It concludes that straightforward application of standard off-the-shelf optimization software to robust design is prohibitively expensive, necessitating adaptive strategies and the use of surrogates.
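The cost blow-up described here comes from nesting: every objective (and constraint) evaluation requested by the optimizer triggers a full numerical integration over the uncertain variables. A back-of-envelope count with a tensor-product quadrature rule (the numbers below are illustrative, not the study's):

```python
def quadrature_calls(points_per_dim, n_uncertain):
    # tensor-product rule: cost grows exponentially with the
    # number of uncertain dimensions
    return points_per_dim ** n_uncertain

def total_simulation_calls(optimizer_evals, points_per_dim, n_uncertain):
    # each optimizer-requested evaluation hides a full expectation
    return optimizer_evals * quadrature_calls(points_per_dim, n_uncertain)

# e.g. 200 optimizer evaluations, a 5-point rule, 3 uncertain variables:
print(total_simulation_calls(200, 5, 3))  # 25000 simulator runs
```

With each simulator run itself expensive, this multiplication is what makes off-the-shelf optimization prohibitive and motivates surrogates for the integrand.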
OF ATTACK FOR HIGH LIFT DEVICES OPTIMIZATION
Abstract
In this report, we address aerodynamic shape optimization problems that include uncertain operating conditions. After a review of robust control theory and the possible approaches to taking uncertainty into account, we propose to use Taguchi robust design methods in order to overcome single-point design problems in aerodynamics: such techniques produce solutions that perform well at the selected design point but have poor off-design performance. Under the Taguchi concept, a design problem with uncertainties is converted into an optimization problem with two objectives, mean performance and its variance, so that the solutions are as insensitive as possible to uncertainty in the input parameters. Furthermore, Multi-Criterion Evolutionary Algorithms (MCEAs) are used to capture a set of compromise solutions (the Pareto front) between these two objectives. The flow field is analyzed by Navier-Stokes computation. In order to reduce the number of expensive evaluations of the fitness function, Response Surface Modelling (RSM) is employed to estimate fitness values using a polynomial approximate model. The proposed approach is applied to the robust optimization of the 2D high-lift devices of a business aircraft, by maximizing the mean and minimizing the variance of the lift coefficient with uncertain free-stream angle of attack at landing and take-off flight conditions respectively.
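The Taguchi-style conversion described here, turning one uncertain design problem into a two-objective (mean, variance) problem, can be sketched as a Monte Carlo estimator; the noise model and sample size are illustrative assumptions, not the report's setup.

```python
import numpy as np

def mean_variance(f, x, sigma=0.05, n_samples=2000, rng=None):
    """Estimate mean performance and its variance under input noise,
    the two objectives handed to the multi-criterion optimizer."""
    rng = rng or np.random.default_rng(1)
    vals = np.array([f(x + sigma * rng.standard_normal(x.shape))
                     for _ in range(n_samples)])
    return vals.mean(), vals.var()
```

A multi-criterion evolutionary algorithm would then search for the Pareto front trading the mean objective against the variance objective.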
New Methods for Robust Design Using
Abstract
Engineers increasingly rely on computer simulation to develop new products and to understand emerging technologies. In practice, this process is permeated with uncertainty: manufactured products deviate from designed products; actual products must perform under a variety of operating conditions. In recent years, engineers have become increasingly concerned with managing these uncertainties. Statistical decision theory, specifically the Bayes principle, provides a conceptual framework for quantifying uncertainty. The difficulty with exploiting this framework is computational, involving the numerical integration of expensive simulation outputs with respect to uncertain quantities. The application of statistical decision theory to robust design has been infrequently attempted and lies at the frontier of current engineering practice. This paper surveys several such attempts. Inspired by the success of surrogate-based methods for design optimization, we speculate about the prospects for developing surrogate-based methods for integration.
NOTE ON THE EFFECTIVENESS OF STOCHASTIC OPTIMIZATION ALGORITHMS FOR ROBUST DESIGN
Abstract
Robust design optimization (RDO) uses statistical decision theory and optimization techniques to optimize a design over a range of uncertainty (introduced by the manufacturing process and unintended uses). Since engineering objective functions tend to be costly to evaluate and prohibitively expensive to integrate (as required within RDO), surrogates are introduced to allow the use of traditional optimization methods to find solutions. This paper explores the suitability of radically different (deterministic and stochastic) optimization methods for solving prototypical robust design problems. The algorithms include a genetic algorithm using a penalty function formulation, the simultaneous perturbation stochastic approximation (SPSA) method, and two gradient-based constrained nonlinear optimizers (the method of feasible directions and sequential quadratic programming). The results show that the fully deterministic standard optimization algorithms are consistently more accurate, more likely to terminate at feasible points, and considerably less expensive than the fully nondeterministic algorithms.
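Of the algorithms compared, SPSA is the least standard; a minimal sketch (gain sequences and constants below are common illustrative choices, not the paper's settings) shows its defining trait: a full gradient estimate from just two function evaluations per iteration, regardless of dimension.

```python
import numpy as np

def spsa(f, x0, a=0.1, c=0.1, iters=500, rng=None):
    """Simultaneous perturbation stochastic approximation (minimization)."""
    rng = rng or np.random.default_rng(2)
    x = np.array(x0, dtype=float)
    for k in range(1, iters + 1):
        ak = a / k ** 0.602                   # step-size gain decay
        ck = c / k ** 0.101                   # perturbation-size decay
        delta = rng.choice([-1.0, 1.0], size=x.shape)  # random +/- directions
        # two evaluations estimate every component of the gradient at once
        ghat = (f(x + ck * delta) - f(x - ck * delta)) / (2.0 * ck * delta)
        x -= ak * ghat
    return x
```

The two-evaluation gradient estimate is cheap but noisy, which is consistent with the paper's finding that deterministic gradient-based optimizers are more accurate on these problems.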
The method of Sacks et al.
, 1994
Abstract
In structural optimization, an approximation concept is usually introduced as an interface between the (FEM) structural analysis code and the optimization algorithm. In some cases of optimum design, a global approximation concept can be effectively applied. Then, approximation model functions are built for all objective and constraint functions, whose values, for a certain design point, follow from the structural analysis calculations. In this way, the original optimization problem is completely replaced by an explicitly known approximate optimization problem. Response-surface techniques are commonly applied to build global approximation models, especially when dealing with responses of physical experiments. For one response function, this means that a user-defined model function is fitted to the response data calculated at the design sites of some experimental design. Errors between the model function and the experimental response values are assumed to be randomly distributed. However, in the structural optimization case, an experiment is a computer analysis with a deterministic response as a result. This rules out the statistical assumptions on which response-surface model building is based.
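The mismatch the abstract points out, that response-surface statistics assume random error while a computer analysis returns a deterministic response, shows up directly in a least-squares fit: if the model class contains the true response, the residuals are exactly zero rather than random scatter. A toy sketch (the polynomial response below is a hypothetical stand-in for an FEM analysis):

```python
import numpy as np

# Design sites of a simple experimental design, with a deterministic
# "computer experiment" response (stand-in for a structural analysis run).
design = np.linspace(0.0, 1.0, 6)
response = 3.0 + 2.0 * design - 4.0 * design ** 2

# Global quadratic response-surface model, fitted by least squares.
Phi = np.column_stack([np.ones_like(design), design, design ** 2])
coeffs, *_ = np.linalg.lstsq(Phi, response, rcond=None)

# Deterministic response + adequate model => zero residual, so the usual
# random-error assumptions behind response-surface statistics do not apply.
residual = response - Phi @ coeffs
```

With physical experiments the residuals would carry measurement noise and justify the statistical machinery; here they vanish identically, which is exactly the abstract's objection.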