Results 1–10 of 29
A rigorous framework for optimization of expensive functions by surrogates
Structural Optimization, 1999
Cited by 164 (16 self)
Abstract. The goal of the research reported here is to develop rigorous optimization algorithms to apply to some engineering design problems for which direct application of traditional optimization approaches is not practical. This paper presents and analyzes a framework for generating a sequence of approximations to the objective function and managing the use of these approximations as surrogates for optimization. The result is to obtain convergence to a minimizer of an expensive objective function subject to simple constraints. The approach is widely applicable because it does not require, or even explicitly approximate, derivatives of the objective. Numerical results are presented for a 31-variable helicopter rotor blade design example and for a standard optimization test example. Key words. approximation concepts, surrogate optimization, response surfaces, pattern search methods, derivative-free optimization, design and analysis of computer experiments (DACE), computational engineering. Subject classification. Applied & Numerical Mathematics
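The surrogate-management idea in this abstract can be made concrete with a toy sketch: fit a cheap quadratic surrogate to all points evaluated so far, jump to its minimizer, and fall back to polling around the incumbent when the surrogate stalls. This is illustrative only, not the paper's actual algorithm; the initial design, poll rule, and shrink factor below are assumptions.

```python
import numpy as np

def surrogate_search(f, lo, hi, iters=25):
    """Toy 1-D surrogate-managed minimization of an 'expensive' f on [lo, hi]."""
    X = [lo, 0.5 * (lo + hi), hi]        # initial design sites
    y = [f(x) for x in X]
    step = 0.25 * (hi - lo)              # poll step for the safeguard
    for _ in range(iters):
        c = np.polyfit(X, y, 2)          # cheap quadratic surrogate over all data
        if c[0] > 0:                     # convex fit: jump to its vertex
            cand = float(np.clip(-c[1] / (2.0 * c[0]), lo, hi))
        else:                            # concave fit: stay at the incumbent
            cand = X[int(np.argmin(y))]
        if min(abs(cand - x) for x in X) < 1e-8:
            # surrogate stalled on a known point: poll around the incumbent,
            # the pattern-search ingredient that makes such schemes rigorous
            cand = float(np.clip(X[int(np.argmin(y))] + step, lo, hi))
            step *= -0.5                 # alternate sides and shrink
        X.append(cand)
        y.append(f(cand))                # one 'expensive' evaluation per iteration
    i = int(np.argmin(y))
    return X[i], y[i]
```

On a smooth convex test function such as exp(x) - 2x on [0, 2], the first surrogate minimizer already lands close to the true minimizer ln 2.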
Optimization by Direct Search: New Perspectives on Some Classical and Modern Methods
SIAM Review, Vol. 45, No. 3, pp. 385–482, 2003
Cited by 143 (14 self)
Direct search methods are best known as unconstrained optimization techniques that do not explicitly use derivatives. Direct search methods were formally proposed and widely applied in the 1960s but fell out of favor with the mathematical optimization community by the early 1970s because they lacked coherent mathematical analysis. Nonetheless, users remained loyal to these methods, most of which were easy to program, some of which were reliable. In the past fifteen years, these methods have seen a revival due, in part, to the appearance of mathematical analysis, as well as to interest in parallel and distributed computing. This review begins by briefly summarizing the history of direct search methods and considering the special properties of problems for which they are well suited. Our focus then turns to a broad class of methods for which we provide a unifying framework that lends itself to a variety of convergence results. The underlying principles allow generalization to handle bound constraints and linear constraints. We also discuss extensions to problems with nonlinear constraints.
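The unifying structure the review describes (poll a fixed set of directions, accept simple decrease, contract the step on failure) can be sketched as a minimal compass search. The direction set, the halving factor, and the stopping tolerance below are illustrative assumptions, not details from the survey.

```python
import numpy as np

def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=1000):
    """Minimal compass (coordinate) pattern search: poll the 2n coordinate
    directions; accept the first simple decrease; halve the step on failure."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    n = x.size
    directions = np.vstack([np.eye(n), -np.eye(n)])   # +/- unit vectors
    for _ in range(max_iter):
        improved = False
        for d in directions:
            trial = x + step * d
            ft = f(trial)
            if ft < fx:              # simple decrease suffices for the theory
                x, fx = trial, ft
                improved = True
                break
        if not improved:
            step *= 0.5              # contract the mesh
            if step < tol:
                break
    return x, fx
```

No derivatives are used anywhere; convergence theory for this class rests only on the geometry of the direction set and the step-control rule.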
Using Approximations to Accelerate Engineering Design Optimization
Proceedings of the 7th AIAA/USAF/NASA/ISSMO Multidisciplinary Analysis & Optimization Symposium (St. Louis, Missouri), AIAA Paper 98-4800, 1998
Cited by 40 (0 self)
Optimization problems that arise in engineering design are often characterized by several features that hinder the use of standard nonlinear optimization techniques. Foremost among these features is that the functions used to define the engineering optimization problem usually require the solution of differential equations, a process which is itself computationally intensive. Within a standard nonlinear optimization algorithm, the solution of these differential equations is required for each iteration of the algorithm. To mitigate such expense, an attractive alternative is to replace the computationally intensive objective with a less expensive surrogate. In conformance with engineering practice, we draw a crucial distinction between surrogate models and surrogate approximations. Surrogate models are auxiliary simulations that are less physically faithful, but also less computationally expensive, than the expensive simulation that is regarded as "truth." An instructive example is the use of an equivalent-plate analysis method in lieu of a finite element analysis, e.g. to analyze a wingbox of a high-speed civil transport. Surrogate models exist independently of the expensive simulation and can provide new information about the physical phenomenon of interest without requiring additional runs.
Filter Pattern Search Algorithms for Mixed Variable Constrained Optimization Problems
SIAM Journal on Optimization, 2004
Cited by 35 (7 self)
A new class of algorithms for solving nonlinearly constrained mixed variable optimization problems is presented. This class combines and extends the Audet-Dennis Generalized Pattern Search (GPS) algorithms for bound constrained mixed variable optimization and their GPS-filter algorithms for general nonlinear constraints. In generalizing existing algorithms, new theoretical convergence results are presented that reduce seamlessly to existing results for more specific classes of problems. While no local continuity or smoothness assumptions are required to apply the algorithm, a hierarchy of theoretical convergence results based on the Clarke calculus is given, in which local smoothness dictates what can be proved about certain limit points generated by the algorithm. To demonstrate its usefulness, the algorithm is applied to the design of a load-bearing thermal insulation system. We believe this is the first algorithm with provable convergence results to directly target this class of problems.
Snobfit - Stable Noisy Optimization by Branch and Fit
Cited by 28 (5 self)
The method of this paper produces a user-specified number of suggested evaluation points in each step; proceeds by successive partitioning of the box (branch) and building local quadratic models (fit); combines local and global search and allows the user to determine which of the two should be emphasized; handles local search from the best point with the aid of trust regions; and allows for hidden constraints, assigning to such points a function value based on the function values of nearby feasible points.
Metamodel-Assisted Evolution Strategies
In Parallel Problem Solving from Nature VII, 2002
Cited by 28 (2 self)
This paper presents various Metamodel-Assisted Evolution Strategies which reduce the computational cost of optimisation problems involving time-consuming function evaluations. The metamodel is built using previously evaluated solutions in the search space and utilized to predict the fitness of new candidate solutions. In addition to previous works by the authors, the new metamodel also takes into account the error associated with each prediction, by correlating neighboring points in the search space. A mathematical problem and the problem of designing an optimal airfoil shape under viscous flow considerations have been worked out. Both demonstrate the noticeable gain in computational time one might expect from the use of metamodels in Evolution Strategies.
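The pre-screening idea can be sketched as follows: generate many offspring, rank them all with a cheap metamodel built from the archive of exact evaluations, and spend expensive evaluations only on the most promising few. This is not the authors' algorithm; the inverse-distance metamodel, elitist replacement, and step-size decay are stand-in assumptions for illustration.

```python
import numpy as np

def idw_predict(X, y, q, p=2.0, tiny=1e-12):
    """Cheap metamodel: inverse-distance-weighted fitness prediction."""
    d = np.linalg.norm(X - q, axis=1) + tiny
    w = 1.0 / d**p
    return float(np.sum(w * y) / np.sum(w))

def metamodel_es(f, x0, sigma=0.5, lam=20, screen=5, gens=60, seed=0):
    """Toy metamodel-assisted ES: evaluate only `screen` of `lam` offspring."""
    rng = np.random.default_rng(seed)
    best = np.asarray(x0, dtype=float)
    f_best = f(best)
    X, y = [best.copy()], [f_best]       # archive of exact evaluations
    for _ in range(gens):
        pop = best + sigma * rng.standard_normal((lam, best.size))
        Xa, ya = np.array(X), np.array(y)
        # pre-screen all lambda offspring with the metamodel ...
        pred = [idw_predict(Xa, ya, o) for o in pop]
        # ... but spend exact evaluations only on the `screen` best-ranked
        for o in pop[np.argsort(pred)[:screen]]:
            fo = f(o)
            X.append(o)
            y.append(fo)
            if fo < f_best:              # elitist replacement
                best, f_best = o, fo
        sigma *= 0.98                    # slow step-size decay
    return best, f_best
```

Per generation, only `screen` exact evaluations are paid for instead of `lam`, which is where the computational saving comes from.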
Comparison of response surface and Kriging models for multidisciplinary design optimization
In: Seventh AIAA/USAF/NASA/ISSMO Symposium on Multidisciplinary Analysis and Optimization, AIAA, 1998
Cited by 24 (5 self)
In this paper, we compare and contrast the use of second-order response surface models and kriging models for approximating nonrandom, deterministic computer analyses. After reviewing the response surface method for constructing polynomial approximations, kriging is presented as an alternative approximation method for the design and analysis of computer experiments. Both methods are applied to the multidisciplinary design of an aerospike nozzle, which consists of a computational fluid dynamics model and a finite-element model. Error analysis of the response surface and kriging models is performed along with a graphical comparison of the approximations, and four optimization problems are formulated and solved using both sets of approximations.
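The core contrast can be shown in one dimension: a second-order response surface is a single global least-squares quadratic, while kriging interpolates the data through spatial correlation. Toy data below; the Gaussian correlation parameter theta and the nugget are assumed values, with no maximum-likelihood fitting of the kind the paper's kriging models would use.

```python
import numpy as np

X = np.linspace(0.0, 1.0, 8)             # sampled "computer experiments"
y = np.sin(2.0 * np.pi * X)              # deterministic responses, no noise

# second-order response surface: one global least-squares quadratic
rsm = np.poly1d(np.polyfit(X, y, 2))

# simple kriging: zero-mean prior, Gaussian correlation, interpolating weights
theta = 20.0                             # correlation decay rate (assumed)
R = np.exp(-theta * (X[:, None] - X[None, :])**2)
w = np.linalg.solve(R + 1e-10 * np.eye(len(X)), y)   # tiny nugget for stability
krig = lambda q: np.exp(-theta * (q - X)**2) @ w
```

The kriging predictor reproduces the training responses almost exactly (it interpolates), whereas the quadratic response surface cannot follow a full sine period and leaves large residuals at the design sites.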
From Evolutionary Operation to Parallel Direct Search: Pattern Search Algorithms for Numerical Optimization
Computing Science and Statistics, 1998
Cited by 23 (10 self)
G.E.P. Box's seminal suggestions for Evolutionary Operation led other statisticians to propose algorithms for numerical optimization that rely exclusively on the direct comparison of function values. These contributions culminated in the development of the widely used simplex algorithm of Nelder and Mead. Recent examination of these popular methods by the numerical optimization community has produced new insights. Numerical experiments and carefully constructed examples have revealed that the Nelder-Mead algorithm may be unreliable even in fairly simple situations. In contrast, many of the original methods, which we collectively describe as pattern searches, are guaranteed to converge to a stationary point of the objective function under conventional nonlinear programming assumptions. In addition, the structure of these algorithms is such that they are easily parallelized. We briefly survey the history of pattern search methods and explicate their common structure.
A Framework for Managing Models in Nonlinear Optimization of Computationally Expensive Functions
1998
Cited by 15 (3 self)
One of the most significant problems in the application of standard optimization methods to real-world engineering design problems is that the computation of the objective function often takes so much computer time (sometimes hours) that traditional optimization techniques are not practical. A long-standing remedy in this situation has been to approximate the objective function with something much cheaper to compute, called a "model" (or surrogate), and to optimize the model instead of the actual objective function. This simple approach succeeds some of the time, but it can fail when there is not sufficient a priori knowledge to build an adequate model. One way to address this problem is to build the model with whatever a priori knowledge is available and, during the optimization process, sample the true objective at selected points, using the results to monitor the progress of the optimization and to adapt the model in the region of interest. We call this approach "model management". This thesis builds on the fundamental ideas and theory of pattern search optimization methods to develop a rigorous methodology for model management. A general framework for model management algorithms is presented along with a convergence analysis. A software implementation of the framework, which allows for the reuse of existing modeling and optimization software, has been developed, and results for several test problems are presented. The model management methodology and potential applications in aerospace engineering are the subject of an ongoing collaboration between researchers at Boeing, IBM, Rice, and the College of William & Mary.
Review of the Space Mapping Approach to Engineering Optimization and Modeling
Optimization and Engineering, 2000
Cited by 14 (4 self)
We review the Space Mapping (SM) concept and its applications in engineering optimization and modeling. The aim of SM is to avoid computationally expensive calculations encountered in simulating an engineering system. The existence of less accurate but fast physically based models is exploited. SM drives the optimization iterates of the time-intensive model using the fast model. Several algorithms have been developed for SM optimization, including the original SM algorithm, Aggressive Space Mapping (ASM), Trust Region Aggressive Space Mapping (TRASM), and Hybrid Aggressive Space Mapping (HASM). An essential subproblem of any SM-based optimization algorithm is parameter extraction. The uniqueness of this optimization subproblem has been crucial to the success of SM optimization. Different approaches to enhance the uniqueness are reviewed. We also discuss new developments in Space Mapping-based Modeling (SMM). These include Space Derivative Mapping (SDM), Generalized Space Mapping (GSM), and Space Mapping-based Neuromodeling (SMN). Finally, we address open points for research and future development.
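A scalar toy version of Aggressive Space Mapping shows the two ingredients the review names: parameter extraction (align the cheap model's response with a fine-model response) and a quasi-Newton update toward the coarse-space optimum. The resonance-curve models, the grid-based extraction, and the unit Broyden "matrix" are assumptions for illustration, not taken from the review.

```python
import numpy as np

w = np.linspace(0.0, 4.0, 200)           # "frequency" sweep

def fine(x):                              # expensive model: resonance at x
    return 1.0 / ((w - x)**2 + 0.1)

def coarse(z):                            # cheap model: resonance shifted to z + 0.3
    return 1.0 / ((w - z - 0.3)**2 + 0.1)

def extract(target):
    """Parameter extraction: coarse-model parameter matching a target response."""
    zs = np.linspace(0.0, 4.0, 4001)
    errs = [np.sum((coarse(z) - target)**2) for z in zs]
    return zs[int(np.argmin(errs))]

# Step 1: optimize the cheap model once against the spec (resonance at w = 2.0)
spec = 1.0 / ((w - 2.0)**2 + 0.1)
z_star = extract(spec)                    # coarse-space optimum, about 1.7

# Step 2: Aggressive Space Mapping with a unit Broyden "matrix"
x = z_star                                # start the fine design at the coarse optimum
for _ in range(10):
    z = extract(fine(x))                  # where the coarse model "sees" the fine design
    e = z - z_star
    if abs(e) < 1e-3:
        break
    x = x - e                             # quasi-Newton step, B = 1
```

Because the model misalignment here is a pure shift, the mapping is linear and ASM lands on the fine-model optimum in a single corrective step; only a handful of fine-model sweeps are ever evaluated.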