Results 11–20 of 119
Evolutionary Search of Approximated N-Dimensional Landscapes
International Journal of Knowledge-based Intelligent Engineering Systems, 2000
Cited by 25 (2 self)
Abstract:
Finding the global optimum on a large, multimodal, complex, and discontinuous (or non-differentiable) landscape is usually very hard, even with the evolutionary approach. However, some of these complex landscapes can be approximated and smoothed without changing the nature of the problem, i.e., without modifying the global optimum and its location. The approximated and smoothed landscape is often much easier to search than the original one. In this paper, we propose a new algorithm using landscape approximation and hybrid evolutionary and local search. We also list several algorithm design principles. Following the basic algorithm, an example algorithm is given from our previous work on the combination of landscape approximation and local search (LALS). Furthermore, we develop a novel evolutionary algorithm with n-dimensional approximation (EANA), which shares the same rules as the basic algorithm but remedies some of the drawbacks found in LALS. Comparisons with evo...
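The approximate-then-refine idea behind LALS/EANA can be sketched in one dimension. Everything here is an illustration under assumed settings: a hypothetical rippled test function, a moving-average smoother as the landscape approximation, and a grid argmin standing in for the evolutionary search step.

```python
import numpy as np

def f(x):
    # Hypothetical rugged 1-D landscape: a quadratic bowl with
    # high-frequency ripples that trap naive local search.
    return x**2 + 0.3 * np.sin(25 * x)

# Step 1: sample the landscape on a grid and build a smoothed approximation.
xs = np.linspace(-2.0, 2.0, 401)
kernel = np.ones(25) / 25                   # moving-average smoother
smooth = np.convolve(f(xs), kernel, mode="same")

# Step 2: global search on the smoothed surrogate (a grid argmin here,
# where LALS/EANA would run an evolutionary or local search).
inner = slice(12, -12)                      # drop convolution edge artifacts
x0 = xs[inner][np.argmin(smooth[inner])]

# Step 3: local search on the *original* landscape, started from the
# optimum of the smoothed surrogate.
fine = np.linspace(x0 - 0.2, x0 + 0.2, 2001)
x_star = fine[np.argmin(f(fine))]
```

The smoothing removes the ripples without moving the basin of the global optimum, so the cheap local refinement lands in the right valley.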
Agus Sudjianto. Analysis of computer experiments using penalized likelihood in Gaussian kriging models
 Journal of the American Statistical Association
Cited by 22 (1 self)
Abstract:
Kriging is a popular analysis approach for computer experiments for the purpose of creating a cheap-to-compute “metamodel” as a surrogate to a computationally expensive engineering simulation model. The maximum likelihood approach is used to estimate the parameters in the kriging model. However, the likelihood function near the optimum may be flat in some situations, which leads to maximum likelihood estimates for the parameters in the covariance matrix that have very large variance. To overcome this difficulty, a penalized likelihood approach is proposed for the kriging model. Both theoretical analysis and empirical experience using real-world data suggest that the proposed method is particularly important in the context of a computationally intensive simulation model, where the number of simulation runs must be kept small because collection of a large sample set is prohibitive. The proposed approach is applied to the reduction of piston slap, an unwanted engine noise due to piston secondary motion. Issues related to practical implementation of the proposed approach are discussed.
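The penalized-likelihood idea can be illustrated with a toy 1-D kriging model. This is only a sketch: the paper penalizes the correlation parameters themselves, while the example below uses a simple quadratic penalty on the log-lengthscale, which serves the same purpose of regularizing a nearly flat likelihood; all data and settings are made up.

```python
import numpy as np

def neg_penalized_loglik(theta, X, y, lam=0.5, nugget=1e-6):
    # Negative log-likelihood of a zero-mean kriging (GP) model with an
    # RBF correlation and log-lengthscale theta, plus a quadratic penalty
    # on theta (a stand-in for the paper's penalty on the correlation
    # parameters).
    ell = np.exp(theta)
    R = np.exp(-(X[:, None] - X[None, :]) ** 2 / (2 * ell**2))
    R += nugget * np.eye(len(X))            # jitter for numerical stability
    L = np.linalg.cholesky(R)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    nll = 0.5 * y @ alpha + np.log(np.diag(L)).sum()
    return nll + lam * theta**2

# Small design, as in an expensive computer experiment with few runs.
X = np.linspace(0.0, 1.0, 8)
y = np.sin(2 * np.pi * X)

# Profile the penalized likelihood over the log-lengthscale.
thetas = np.linspace(-3.0, 1.0, 81)
nlls = [neg_penalized_loglik(t, X, y) for t in thetas]
theta_hat = thetas[int(np.argmin(nlls))]
```

The penalty keeps the estimate away from the flat extremes of the likelihood surface, where the unpenalized estimate would have very large variance.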
Gaussian Processes for Active Data Mining of Spatial Aggregates
In Proceedings of the SIAM International Conference on Data Mining, 2005
Cited by 21 (1 self)
Abstract:
We present an active data mining mechanism for qualitative analysis of spatial datasets, integrating identification and analysis of structures in spatial data with targeted collection of additional samples. The mechanism is designed around the spatial aggregation language (SAL) for qualitative spatial reasoning, and seeks to uncover high-level spatial structures from only a sparse set of samples. This approach is important for applications in domains such as aircraft design, wireless system simulation, fluid dynamics, and sensor networks. The mechanism employs Gaussian processes, a formal mathematical model for reasoning about spatial data, in order to build surrogate models from sparse data, reason about the uncertainty of estimation at unsampled points, and formulate objective criteria for closing the loop between data collection and data analysis. It optimizes sample selection using entropy-based functionals defined over spatial aggregates instead of the traditional approach of sampling to minimize estimated variance. We apply this mechanism on a global optimization benchmark comprising a test bank of 2D functions, as well as on data from wireless system simulations. The results reveal that the proposed sampling strategy makes more judicious use of data points by selecting locations that clarify high-level structures in data, rather than choosing points that merely improve quality of function approximation.
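The underlying GP machinery, posterior uncertainty at unsampled points driving the next sample, can be sketched in one dimension. For a Gaussian, entropy is monotone in variance, so the max-entropy single location is the max-variance one; the paper's contribution is to define the criterion over spatial aggregates instead, which this sketch (with made-up sample locations and kernel settings) does not attempt.

```python
import numpy as np

def posterior_variance(X, xstar, ell=0.2, nugget=1e-8):
    # Predictive variance of a zero-mean, unit-variance GP with an RBF
    # kernel, conditioned on noise-free samples at X.
    k = lambda a, b: np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * ell**2))
    K = k(X, X) + nugget * np.eye(len(X))
    Ks = k(X, xstar)                        # shape (len(X), len(xstar))
    v = np.linalg.solve(K, Ks)
    return 1.0 - np.sum(Ks * v, axis=0)

# Samples collected so far, and candidate locations for the next one.
X = np.array([0.1, 0.5, 0.9])
grid = np.linspace(0.0, 1.0, 201)
var = posterior_variance(X, grid)

# Close the loop: propose the most uncertain location as the next sample.
x_next = grid[int(np.argmax(var))]
```

At the sampled locations the posterior variance collapses to (near) zero, so the proposal is always pushed into unexplored territory.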
Hardware-in-the-loop optimization of the walking speed of a humanoid robot
In CLAWAR 2006: 9th International Conference on Climbing and Walking Robots, 2006
Cited by 16 (3 self)
Abstract:
The development of optimized motions of humanoid robots that guarantee fast and stable walking is an important task, especially in the context of autonomous soccer-playing robots in RoboCup. We present a walking motion optimization approach for the humanoid robot prototype HR18, which is equipped with a low-dimensional parameterized walking trajectory generator, joint motor controllers and internal stabilization. The robot is included as hardware-in-the-loop to define a low-dimensional black-box optimization problem. In contrast to previously performed walking optimization approaches, we apply a sequential surrogate optimization approach using stochastic approximation of the underlying objective function and sequential quadratic programming to search for a fast and stable walking motion. This is done under the condition that only a small number of physical walking experiments have to be carried out during the online optimization process. For the identified walking motion of the considered 55 cm tall humanoid robot we measured a forward walking speed of more than 30 cm/sec. With a modified version of the robot, even more than 40 cm/sec could be achieved in permanent operation.
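The surrogate step can be sketched for a single gait parameter. The `walking_trial` function and all numbers below are hypothetical stand-ins for a physical experiment; the paper works with several trajectory parameters and couples stochastic approximation with SQP, whereas this sketch simply fits one quadratic response surface to a few noisy trials and moves to its vertex.

```python
import numpy as np

rng = np.random.default_rng(1)

def walking_trial(p):
    # Hypothetical stand-in for one hardware-in-the-loop experiment:
    # a single gait parameter in, measured forward speed out, with
    # measurement noise from the physical trial.
    return 0.35 - (p - 0.6) ** 2 + 0.02 * rng.normal()

# A handful of hardware trials: each evaluation is one real walk,
# so the budget must stay small.
params = np.linspace(0.2, 1.0, 9)
speeds = np.array([walking_trial(p) for p in params])

# Fit a quadratic response surface by least squares and propose the
# parameter at its maximizer for the next trial.
a, b, c = np.polyfit(params, speeds, deg=2)
p_next = -b / (2.0 * a)
```

Because the surrogate smooths out the trial-to-trial noise, the proposed parameter lands near the true optimum after only a few physical experiments.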
Computational Methods in Optimization Considering Uncertainties – An Overview
Cited by 13 (1 self)
Abstract:
This article presents a brief survey of some of the most relevant developments in the field of optimization under uncertainty. In particular, the scope and the relevance of the papers included in this Special Issue are analyzed. The importance of uncertainty quantification and optimization techniques for producing improved models and designs is thoroughly discussed. The focus of the discussion is on three specific research areas, namely reliability-based optimization, robust design optimization and model updating. The arguments presented indicate that optimization under uncertainty should become customary in engineering design in the foreseeable future. Computational aspects play a key role in analyzing and modeling realistic systems and structures.
A multi-points criterion for deterministic parallel global optimization based on Gaussian processes
Journal of Global Optimization (in revision), 2009
Cited by 13 (1 self)
Abstract:
The optimization of expensive-to-evaluate functions generally relies on metamodel-based exploration strategies. Many deterministic global optimization algorithms used in the field of computer experiments are based on Kriging (Gaussian process regression). Starting with a spatial predictor including a measure of uncertainty, they proceed by iteratively choosing the point maximizing a criterion which is a compromise between predicted performance and uncertainty. Distributing the evaluation of such numerically expensive objective functions on many processors is an appealing idea. Here we investigate a multi-points optimization criterion, the multi-points expected improvement (q-EI), aimed at choosing several points at the same time. An analytical expression of the q-EI is given when q = 2, and a consistent statistical estimate is given for the general case. We then propose two classes of heuristic strategies meant to approximately optimize the q-EI, and apply them to Gaussian Processes and to the classical Branin-Hoo test-case function. It is finally demonstrated within the covered example that the latter strategies perform as well as the best Latin Hypercubes and Uniform Designs ever found by simulation (2000 designs drawn at random for every q ∈ [1, 10]).
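The multi-points criterion is straightforward to estimate by Monte Carlo for any q, which is the flavor of the consistent statistical estimate the abstract mentions. Below is a minimal sketch for q = 2 with illustrative posterior values (the means and covariance are made up); the closed-form single-point EI is included for comparison.

```python
import numpy as np
from math import erf, exp, pi, sqrt

rng = np.random.default_rng(0)

def ei1(m, s, f_best):
    # Closed-form single-point expected improvement for minimization:
    # E[max(f_best - Y, 0)] with Y ~ N(m, s^2).
    u = (f_best - m) / s
    phi = exp(-0.5 * u * u) / sqrt(2.0 * pi)
    Phi = 0.5 * (1.0 + erf(u / sqrt(2.0)))
    return s * (u * Phi + phi)

def qei_mc(mu, cov, f_best, n=200_000):
    # Monte Carlo estimate of the multi-points expected improvement:
    # E[max(f_best - min(Y_1, ..., Y_q), 0)] for the joint Gaussian
    # posterior (Y_1, ..., Y_q) at the q candidate points.
    Y = rng.multivariate_normal(mu, cov, size=n)
    return np.maximum(f_best - Y.min(axis=1), 0.0).mean()

# Joint posterior at two correlated candidate points (illustrative numbers).
mu = np.array([0.0, 0.1])
cov = np.array([[1.0, 0.5],
                [0.5, 1.0]])
ei2 = qei_mc(mu, cov, f_best=0.0)
```

The two-point criterion always dominates the best single-point EI (evaluating both points can only help) yet stays below the sum of the individual EIs, which is what makes it a sensible batch-selection objective.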
Expert knowledge and multivariate emulation: the thermosphere-ionosphere electrodynamics general circulation model (TIEGCM)
Technometrics, 2009
Cited by 12 (7 self)
Abstract:
The TIEGCM simulator of the upper atmosphere has a number of features that are a challenge to standard approaches to emulation, such as a long runtime, multivariate output, periodicity, and strong constraints on the interrelationship between inputs and outputs. These kinds of features are not unusual in models of complex systems. We show how they can be handled in an emulator, and demonstrate the use of the Outer Product Emulator for efficient calculation, with an emphasis on predictive diagnostics for model choice and model validation. We use our emulator to ‘verify’ the underlying computer code, and to quantify our qualitative physical understanding.
Uncertainty Quantification In Large Computational Engineering Models
In Proceedings of the 42nd AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference, number AIAA-2001-1455, 2001
Cited by 12 (1 self)
Abstract:
While a wealth of experience in the development of uncertainty quantification methods and software tools exists at present, a cohesive software package utilizing massively parallel computing resources does not. The thrust of the work to be discussed herein is the development of such a toolkit, which has leveraged existing software frameworks (e.g., DAKOTA (Design Analysis Kit for OpTimizAtion)) where possible, and has undertaken additional development efforts when necessary. The contributions of this paper are twofold. First, the design and structure of the toolkit from a software perspective will be discussed, detailing some of its distinguishing features. Second, the toolkit's capabilities will be demonstrated by applying a subset of its available uncertainty quantification techniques to an example problem involving multiple engineering disciplines: nonlinear solid mechanics and soil mechanics. This example problem will demonstrate the toolkit's suitability for quantifying uncertainty in engineering applications of interest modeled using very large computational system models.
ACCURATE EMULATORS FOR LARGE-SCALE COMPUTER EXPERIMENTS
, 1203
Cited by 11 (2 self)
Abstract:
Large-scale computer experiments are becoming increasingly important in science. A multi-step procedure is introduced to statisticians for modeling such experiments, which builds an accurate interpolator in multiple steps. In practice, the procedure shows substantial improvements in overall accuracy, but its theoretical properties are not well established. We introduce the terms nominal and numeric error and decompose the overall error of an interpolator into nominal and numeric portions. Bounds on the numeric and nominal error are developed to show theoretically that substantial gains in overall accuracy can be attained with the multi-step approach.
Smoothing Spline Analysis Of Variance For Polychotomous Response Data, 1998
Cited by 10 (1 self)
Abstract:
We consider the penalized likelihood method with smoothing spline ANOVA for fitting nonparametric functions to data involving a polychotomous response. The fitting procedure involves minimizing the penalized likelihood in a Reproducing Kernel Hilbert Space. A one-step Block SOR-Newton-Raphson algorithm is used to solve the minimization problem. Generalized Cross-Validation or unbiased risk estimation is used to empirically assess the amount of smoothing (which controls the bias and variance trade-off) at each one-step Block SOR-Newton-Raphson iteration. Under some regular smoothness conditions, the one-step Block SOR-Newton-Raphson algorithm will produce a sequence which converges to the minimizer of the penalized likelihood for the fixed smoothing parameters. Monte Carlo simulations are conducted to examine the performance of the algorithm. The method is applied to polychotomous data from the Wisconsin Epidemiological Study of Diabetic Retinopathy to estimate the risks of cause-specific mortality given several potential risk factors at the start of the study. Strategies to obtain smoothing spline estimates for large data sets with polychotomous response are also proposed in this thesis. Simulation studies are conducted to check the performance of the proposed method.
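The penalized-fit-plus-GCV loop can be sketched for the much simpler Gaussian-response case. This is a discrete (Whittaker-type) stand-in for the smoothing spline, with squared error in place of the polychotomous likelihood; the data are simulated, and `lam` plays the role of the smoothing parameter chosen by generalized cross-validation.

```python
import numpy as np

def smooth(y, lam):
    # Discrete penalized least squares: minimize ||y - f||^2 + lam*||D f||^2,
    # with D the second-difference operator (a Whittaker smoother, the
    # discrete analogue of the cubic smoothing spline penalty).
    n = len(y)
    D = np.diff(np.eye(n), n=2, axis=0)
    H = np.linalg.inv(np.eye(n) + lam * D.T @ D)   # smoother ("hat") matrix
    return H @ y, np.trace(H)

def gcv(y, lam):
    # Generalized cross-validation score: n * RSS / (n - trace(H))^2.
    f, edf = smooth(y, lam)
    n = len(y)
    return n * np.sum((y - f) ** 2) / (n - edf) ** 2

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 60)
truth = np.sin(2 * np.pi * x)
y = truth + 0.2 * rng.normal(size=x.size)

# Pick the smoothing parameter by GCV over a coarse grid, then refit.
lams = 10.0 ** np.arange(-2, 5)
lam_best = min(lams, key=lambda l: gcv(y, l))
f_hat, _ = smooth(y, lam_best)
```

GCV trades off the residual sum of squares against the effective degrees of freedom trace(H), so the selected fit removes noise without flattening the underlying signal.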