Results 1–10 of 16
Computationally inexpensive metamodel assessment strategies
 AIAA Journal
, 2002
Abstract

Cited by 13 (0 self)
In many scientific and engineering domains, it is common to analyze and simulate complex physical systems using mathematical models. Although computing resources continue to increase in power and speed, computer simulation and analysis codes continue to grow in complexity and remain computationally expensive, limiting their use in design and optimization. Consequently, many researchers have developed different metamodeling strategies to create inexpensive approximations of computationally expensive computer simulations. These approximations introduce a new element of uncertainty during design optimization, and there is a need to develop efficient methods to assess metamodel validity. We investigate computationally inexpensive assessment methods for metamodel validation based on leave-k-out cross validation and develop guidelines for selecting k for different types of metamodels. Based on the results from two sets of test problems, k = 1 is recommended for leave-k-out cross validation of low-order polynomial and radial basis function metamodels, whereas k = 0.1N or √N is recommended for kriging metamodels, where N is the number of sample points used to construct the metamodel. Nomenclature: N = number of sample points; x = design (input) variable; y = actual output (response) value; ŷᵢ = predicted output (response) value from metamodel.
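The leave-k-out procedure the abstract describes can be sketched in a few lines: hold out k points at a time, refit the metamodel on the rest, and accumulate prediction errors. The sketch below is illustrative only, using a simple 1-D linear least-squares "metamodel" and toy data rather than anything from the cited paper; the function names and data are assumptions.

```python
# Illustrative sketch of leave-k-out cross validation (NOT the cited
# paper's code). The linear metamodel and toy data are hypothetical.
import random

def fit_linear(xs, ys):
    """Ordinary least squares for y ~ a + b*x (closed form)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = my - b * mx
    return lambda x: a + b * x

def leave_k_out_rmse(xs, ys, k):
    """Root-mean-square prediction error accumulated over leave-k-out folds."""
    n = len(xs)
    idx = list(range(n))
    random.Random(0).shuffle(idx)            # fixed seed for repeatability
    folds = [idx[i:i + k] for i in range(0, n, k)]
    sq_errs = []
    for fold in folds:
        train = [i for i in idx if i not in fold]
        model = fit_linear([xs[i] for i in train], [ys[i] for i in train])
        sq_errs.extend((model(xs[i]) - ys[i]) ** 2 for i in fold)
    return (sum(sq_errs) / len(sq_errs)) ** 0.5

# Toy data: y = 2x + 1 with mild alternating noise
xs = [i / 10 for i in range(20)]
ys = [2 * x + 1 + 0.01 * ((-1) ** i) for i, x in enumerate(xs)]
print(leave_k_out_rmse(xs, ys, k=1))   # k = 1 is leave-one-out
```

Setting k = 1 gives the familiar leave-one-out case; larger k trades fewer model refits against a coarser error estimate, which is the selection question the paper studies.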
State-of-the-Art Review: A User’s Guide to the Brave New World of Designing Simulation Experiments
 INFORMS Journal on Computing
, 2005
Abstract

Cited by 9 (2 self)
doi:10.1287/ijoc.1050.0136. Many simulation practitioners can get more from their analyses by using the statistical theory on design of experiments (DOE) developed specifically for exploring computer models. We discuss a toolkit of designs for simulators with limited DOE expertise who want to select a design and an appropriate analysis for their experiments. Furthermore, we provide a research agenda listing problems in the design of simulation experiments (as opposed to real-world experiments) that require more investigation. We consider three types of practical problems: (1) developing a basic understanding of a particular simulation model or system, (2) finding robust decisions or policies as opposed to so-called optimal solutions, and (3) comparing the merits of various decisions or policies. Our discussion emphasizes aspects that are typical for simulation, such as having many more factors than in real-world experiments, and the sequential nature of the data collection. Because the same problem type may be addressed through different design types, we discuss quality attributes of designs, such as the ease of design construction, the flexibility for analysis, and efficiency considerations. Moreover, the selection of the design type depends on the metamodel (response surface) that the analysts tentatively assume.
Neural network and regression spline value function approximations for stochastic dynamic programming
 Computers and Operations Research
, 2006
Abstract

Cited by 1 (1 self)
Dynamic programming is a multistage optimization method that is applicable to many problems in engineering. A statistical perspective of value function approximation in high-dimensional, continuous-state stochastic dynamic programming (SDP) was first presented using orthogonal array (OA) experimental designs and multivariate adaptive regression splines (MARS). Given the popularity of artificial neural networks (ANNs) for high-dimensional modeling in engineering, this paper presents an implementation of ANNs as an alternative to MARS. Comparisons consider the differences in methodological objectives, computational complexity, model accuracy, and numerical SDP solutions. Two applications are presented: a nine-dimensional inventory forecasting problem and an eight-dimensional water reservoir problem. Both OAs and OA-based Latin hypercube experimental designs are explored, and OA space-filling quality is considered.
Review of Metamodeling Techniques in Support of Engineering Design Optimization
Abstract

Cited by 1 (0 self)
Computation-intensive design problems are becoming increasingly common in manufacturing industries. The computational burden is often caused by expensive analysis and simulation processes needed to reach a level of accuracy comparable to physical testing data. To address this challenge, approximation or metamodeling techniques are often used. Metamodeling techniques have been developed in many different disciplines, including statistics, mathematics, computer science, and various engineering fields. Metamodels were initially developed as “surrogates” for the expensive simulation process in order to improve overall computational efficiency. They have since proven to be a valuable tool for supporting a wide scope of activities in modern engineering design, especially design optimization. This work reviews state-of-the-art metamodel-based techniques from a practitioner’s perspective according to the role of metamodeling in supporting design optimization, including model approximation, design space exploration, problem formulation, and the solution of various types of optimization problems. Challenges and future developments of metamodeling in support of engineering design are also analyzed and discussed.
On Sequential Sampling for Global . . .
, 2002
Abstract
Approximation models (also known as metamodels) have been widely used in engineering design to facilitate the analysis and optimization of complex systems that involve computationally expensive simulation programs. The accuracy of metamodels is directly related to the sampling strategies used. Our goal in this paper is to investigate the general applicability of sequential sampling for creating global metamodels. Various sequential sampling approaches are reviewed and new approaches are proposed. The performance of these approaches is compared against that of the one-stage approach using a set of test problems with a variety of features. The potential uses of sequential sampling strategies are also discussed.
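One simple sequential sampling rule of the kind surveyed in work like this is the greedy "maximin" fill: repeatedly add the candidate point that maximizes its minimum distance to the points already sampled. The sketch below is a generic illustration of that idea, not any specific algorithm from the paper; the function name and candidate grid are assumptions.

```python
# Illustrative greedy maximin sequential sampler (a generic sketch, not
# an algorithm from the cited paper).
import itertools

def maximin_sequential(candidates, initial, n_total):
    """Grow `initial` to n_total points, greedily maximizing the minimum
    squared distance from each new point to the points already sampled."""
    def dist2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    sampled = list(initial)
    remaining = [c for c in candidates if c not in sampled]
    while len(sampled) < n_total and remaining:
        best = max(remaining, key=lambda c: min(dist2(c, s) for s in sampled))
        sampled.append(best)
        remaining.remove(best)
    return sampled

# Candidate grid on [0,1]^2, seeded with a single corner sample
grid = [(i / 4, j / 4) for i, j in itertools.product(range(5), repeat=2)]
design = maximin_sequential(grid, initial=[(0.0, 0.0)], n_total=5)
print(design)
```

Unlike a one-stage design, points are committed one at a time, so the loop could be stopped as soon as a metamodel fit to the current samples is judged accurate enough, which is the practical appeal of sequential strategies.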
Reduced-Order Nonlinear Unsteady Aerodynamic Modeling Using a Surrogate-Based Recurrence Framework
Abstract
A reduced-order nonlinear unsteady aerodynamic modeling approach suitable for analyzing pitching/plunging airfoils subject to fixed or time-varying freestream Mach numbers is described. The reduced-order model uses kriging surrogates to account for flow nonlinearities and recurrence solutions to account for time-history effects associated with unsteadiness. The resulting surrogate-based recurrence framework generates time-domain predictions of unsteady lift, moment, and drag that accurately approximate computational fluid dynamics solutions, but at a fraction of the computational cost. Results corresponding to transonic conditions demonstrate that the surrogate-based recurrence framework can mimic computational fluid dynamics predictions of unsteady aerodynamic responses when flow nonlinearities are present. For an unsteady aerodynamic modeling problem considered in this study, an accurate reduced-order model was generated by the surrogate-based recurrence framework with significantly fewer computational fluid dynamics evaluations than results reported in the literature for a similar problem in which a proper-orthogonal-decomposition-based approach was applied. Furthermore, the results show that the surrogate-based approach can accurately model time-varying freestream Mach number effects and is therefore applicable to rotary-wing applications in addition to fixed-wing applications.
Using Cross Validation to Design Conservative Surrogates
Abstract
The use of surrogates (also known as metamodels) for facilitating optimization and statistical analysis of computationally expensive simulations has become commonplace. Surrogate models are usually fit to be unbiased (i.e., the error expectation is zero). However, in certain applications, it might be important to safely estimate the response (e.g., in structural analysis, the maximum stress must not be underestimated in order to avoid failure). In this work we use safety margins to conservatively compensate for fitting errors associated with surrogates. We propose the use of cross validation for estimating the required safety margin for a desired level of conservativeness (percentage of safe predictions). The approach was tested on three algebraic examples for two basic surrogates, namely kriging and polynomial response surface. For these examples we found that cross validation is effective for selecting the safety margin. We also applied the approach to the probabilistic design optimization of a composite laminate. This design-under-uncertainty example showed that the approach can be successfully used in engineering applications. Nomenclature: CIS = standard estimated confidence interval; CIW = Wilson’s estimated confidence interval; e(x) = prediction error (e(x) = ŷ(x) − y(x)); eRMS = root mean square error; eXV = vector of cross-validation errors; PRESS = prediction sum of squares; REG = relative error growth (loss in accuracy); REGXV = relative error growth (estimated loss in accuracy) based on cross-validation data; s = safety margin; ŝ = estimated safety margin; y(x) = actual function; ŷ(x) = surrogate model of the actual function; ŷC(x) = conservative surrogate model of the actual function; %c = percentage of conservative (i.e., positive) errors.
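The core idea, picking a safety margin s from the empirical distribution of cross-validation errors so that ŷC(x) = ŷ(x) + s is conservative for a target fraction of predictions, can be sketched directly. This is a minimal illustration of the concept, not the paper's implementation; the error values and function name are hypothetical.

```python
# Hedged sketch of choosing a safety margin from cross-validation errors
# (not the cited paper's code; the error sample below is made up).
import math

def safety_margin(cv_errors, target_conservative=0.95):
    """Pick s so that at least `target_conservative` of the errors
    satisfy e + s >= 0, i.e. the shifted surrogate does not
    underestimate the response at those points.

    cv_errors: list of e_i = yhat(x_i) - y(x_i) from cross validation.
    """
    e = sorted(cv_errors)
    # Index of the error that must become non-negative after shifting;
    # flooring errs toward a larger (more conservative) margin.
    m = math.floor((1.0 - target_conservative) * len(e))
    return -e[m]

errs = [-0.30, -0.10, -0.05, 0.00, 0.02, 0.05, 0.08, 0.10, 0.15, 0.20]
s = safety_margin(errs, 0.9)
frac = sum(1 for ei in errs if ei + s >= 0) / len(errs)
print(s, frac)
```

The achieved fraction `frac` is at least the requested conservativeness on the cross-validation sample; how well that generalizes to new points is exactly what the paper's examples evaluate.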
Building Efficient Response Surfaces of Aerodynamic Functions with Kriging and Cokriging
 AIAA Journal
, 2008
Abstract
In this paper, we compare the global accuracy of different strategies for building response surfaces by varying sampling methods and modeling techniques. Because the final application of the response surfaces is aerodynamic shape optimization, the test cases are taken from CFD simulations of aerodynamic coefficients. For the comparisons, a robust model-fitting strategy using a new efficient initialization technique followed by a gradient optimization was applied. First, a study of different sampling methods shows that including a posteriori information about the function in the sample distribution can improve accuracy over classical space-filling methods such as Latin hypercube sampling. Second, comparing kriging and gradient-enhanced kriging on two- to six-dimensional test cases shows that interpolating gradient vectors drastically improves response surface accuracy. Although direct and indirect cokriging have equivalent formulations, indirect cokriging outperforms the direct approach. The slow linear phase of error convergence as the sample size increases is not avoided by cokriging. Thus, the number of samples needed for a globally accurate surface generally stays out of reach for problems with more than four design variables.
A Memetic Algorithm Assisted by an Adaptive Topology RBF Network and Variable Local Models for Expensive Optimization Problems
Abstract
A common practice in modern engineering is simulation-driven optimization. This implies replacing costly and lengthy laboratory experiments with computer experiments, i.e., computationally intensive simulations which model real-world physics with high fidelity. Due to the complexity of such simulations, a single simulation run can require up to