Results 1–10 of 52
Single- and Multiobjective Evolutionary Optimization Assisted by Gaussian Random Field Metamodels
"... This paper presents and analyzes in detail an efficient search method based on Evolutionary Algorithms (EA) assisted by local Gaussian Random Field Metamodels (GRFM). It is created for the use in optimization problems with computationally expensive evaluation function(s). The role of GRFM is to pred ..."
Abstract

Cited by 23 (2 self)
 Add to MetaCart
(Show Context)
This paper presents and analyzes in detail an efficient search method based on Evolutionary Algorithms (EA) assisted by local Gaussian Random Field Metamodels (GRFM). It is designed for use in optimization problems with computationally expensive evaluation function(s). The role of GRFM is to predict objective function values for new candidate solutions by exploiting information recorded during previous evaluations. Moreover, GRFM are able to provide estimates of the confidence of their predictions.
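The core GRFM idea above, predicting both a value and a confidence for an unseen candidate from previously recorded evaluations, can be sketched with a minimal Gaussian process regressor. The squared-exponential kernel, unit length scale, and toy objective f(x) = x*x below are illustrative assumptions, not details taken from the paper.

```python
import math

def rbf(a, b, length=1.0):
    # Squared-exponential correlation, a common GRFM/kriging kernel choice
    return math.exp(-((a - b) ** 2) / (2.0 * length ** 2))

def solve(A, b):
    # Gaussian elimination with partial pivoting (keeps the sketch dependency-free)
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_predict(xs, ys, x_new, noise=1e-9):
    # Posterior mean and variance of a zero-mean Gaussian process at x_new
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) + (noise if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    k_star = [rbf(x, x_new) for x in xs]
    alpha = solve(K, ys)            # K^{-1} y
    v = solve(K, k_star)            # K^{-1} k_*
    mean = sum(k_star[i] * alpha[i] for i in range(n))
    var = rbf(x_new, x_new) - sum(k_star[i] * v[i] for i in range(n))
    return mean, max(var, 0.0)      # clamp tiny negative round-off

# Toy stand-in for an expensive objective: f(x) = x^2, three recorded evaluations
xs = [-1.0, 0.0, 1.0]
ys = [x * x for x in xs]
mean, var = gp_predict(xs, ys, 0.5)   # prediction plus its confidence
```

Near a recorded evaluation the variance shrinks toward zero, and far from all data it grows back toward the prior variance; that is the confidence estimate the abstract refers to.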
Local metamodels for optimization using evolution strategies
Parallel Problem Solving from Nature - PPSN IX, 2006
"... Abstract. We employ local metamodels to enhance the efficiency of evolution strategies in the optimization of computationally expensive problems. The method involves the combination of second order local regression metamodels with the Covariance Matrix Adaptation Evolution Strategy. Experiments on ..."
Abstract

Cited by 18 (4 self)
 Add to MetaCart
(Show Context)
We employ local metamodels to enhance the efficiency of evolution strategies in the optimization of computationally expensive problems. The method combines second-order local regression metamodels with the Covariance Matrix Adaptation Evolution Strategy. Experiments on benchmark problems demonstrate that the proposed metamodels can reliably account for the ranking of the offspring population, resulting in significant computational savings. The results show that the use of local metamodels significantly increases the efficiency of already competitive evolution strategies.
Using Gaussian Processes to Optimize Expensive Functions.
"... Abstract. The task of finding the optimum of some function f(x) is commonly accomplished by generating and testing sample solutions iteratively, choosing each new sample x heuristically on the basis of results to date. We use Gaussian processes to represent predictions and uncertainty about the true ..."
Abstract

Cited by 14 (0 self)
 Add to MetaCart
(Show Context)
The task of finding the optimum of some function f(x) is commonly accomplished by generating and testing sample solutions iteratively, choosing each new sample x heuristically on the basis of results to date. We use Gaussian processes to represent predictions and uncertainty about the true function, and describe how to use these predictions to choose where to take each new sample in an optimal way. By doing this we were able to solve a difficult optimization problem, finding weights in a neural network controller to simultaneously balance two vertical poles, using an order of magnitude fewer samples than reported elsewhere.
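A standard way to make the "choose where to take each new sample" step concrete is the expected-improvement criterion computed from the Gaussian process's predicted mean and standard deviation. The sketch below is an illustrative assumption about that step, not code from the paper, and the candidate predictions are made up.

```python
import math

def expected_improvement(mean, std, best):
    # EI for minimization: E[max(best - Y, 0)] with Y ~ Normal(mean, std^2)
    if std <= 0.0:
        return max(best - mean, 0.0)
    z = (best - mean) / std
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # standard normal pdf
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # standard normal cdf
    return (best - mean) * cdf + std * pdf

# Hypothetical GP predictions (mean, std) at three candidate points
candidates = [
    ("exploit", 0.9, 0.05),  # slightly better than the incumbent, low uncertainty
    ("explore", 1.5, 1.00),  # worse prediction but very uncertain
    ("ignore",  2.0, 0.01),  # worse prediction and confidently so
]
best_so_far = 1.0
scores = {name: expected_improvement(m, s, best_so_far) for name, m, s in candidates}
next_sample = max(scores, key=scores.get)   # the point to evaluate expensively next
```

Note how the criterion rewards both a promising mean and a large uncertainty: here the highly uncertain candidate wins over the one predicted marginally better than the incumbent.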
Memetic algorithm using multi-surrogates for computationally expensive optimization problems, Soft Computing 11, 2007
"... be inserted by the editor) ..."
(Show Context)
A memory-based RASH optimizer
In AAAI-06 Workshop on Heuristic Search, Memory Based Heuristics and Their Applications, 2006
"... This paper presents a memorybased Reactive Affine Shaker (MRASH) algorithm for global optimization. The Reactive Affine Shaker is an adaptive search algorithm based only on the function values. MRASH is an extension of RASH in which good starting points to RASH are suggested online by using Bayes ..."
Abstract

Cited by 10 (2 self)
 Add to MetaCart
This paper presents a memory-based Reactive Affine Shaker (MRASH) algorithm for global optimization. The Reactive Affine Shaker is an adaptive search algorithm based only on function values. MRASH is an extension of RASH in which good starting points for RASH are suggested online by Bayesian Locally Weighted Regression (BLWR). Both techniques use memory of the previous history of the search to guide future exploration, but in very different ways: RASH compiles the previous experience into a local search area from which sample points are drawn, while locally weighted regression saves the entire previous history, to be mined extensively when an additional sample point is generated. Because of the high computational cost of the BLWR model, it is applied only to evaluate the potential of an initial point for a local search run. The experimental results, focused on the case where the dominant computational cost is the evaluation of the target function f, show that MRASH indeed reaches good results with fewer function evaluations.
A Study on Metamodeling Techniques, Ensembles, and Multi-Surrogates in Evolutionary Computation
"... SurrogateAssisted Memetic Algorithm(SAMA) is a hybrid evolutionary algorithm, particularly a memetic algorithm that employs surrogate models in the optimization search. Since most of the objective function evaluations in SAMA are approximated, the search performance of SAMA is likely to be affected ..."
Abstract

Cited by 9 (1 self)
 Add to MetaCart
(Show Context)
The Surrogate-Assisted Memetic Algorithm (SAMA) is a hybrid evolutionary algorithm, specifically a memetic algorithm that employs surrogate models in the optimization search. Since most of the objective function evaluations in SAMA are approximated, the search performance of SAMA is likely to be affected by the characteristics of the models used. In this paper, we study the search performance of different metamodeling techniques, ensembles, and multi-surrogates in SAMA. In particular, we consider SAMA-TRF, a SAMA model management framework that incorporates a trust-region scheme for interleaving the use of the exact objective function with computationally cheap local metamodels during local searches. Four different metamodels are considered, namely Gaussian Process (GP), Radial Basis Function (RBF), Polynomial Regression (PR), and Extreme Learning Machine (ELM).
Global Optimization Using Hybrid Approach
In SMO'07, 7th WSEAS International Conference on Simulation, Modelling and Optimization
"... Abstract:The paper deals with a global optimization algorithm using hybrid approach. To take the advantage of global search capability the evolution strategy (ES) with some modifications in recombination formulas and elites keeping is used first to find the nearoptimal solutions. The sequential qu ..."
Abstract

Cited by 5 (0 self)
 Add to MetaCart
(Show Context)
The paper deals with a global optimization algorithm using a hybrid approach. To take advantage of its global search capability, an evolution strategy (ES) with some modifications to the recombination formulas and elite keeping is used first to find near-optimal solutions. Sequential quadratic programming (SQP) is then used to find the exact solution starting from the solutions found by the ES. One merit of the algorithm is that the solutions for multimodal problems can be found in a single run. Eight popular test problems are used to test the proposed algorithm; the results are satisfactory in quality and efficiency.
Keywords: global optimization algorithm, hybrid approach, evolution strategy
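The global-then-local recipe in this abstract, a coarse evolutionary phase followed by an exact local solver, can be sketched as follows. The local polish here is a simple derivative-free pattern search standing in for SQP, and the test function, population sizes, and step sizes are illustrative assumptions, not the paper's settings.

```python
import math
import random

def f(x):
    # 1-D multimodal test function (Rastrigin-style); global minimum f(0) = 0
    return x * x + 10.0 * (1.0 - math.cos(2.0 * math.pi * x))

def es_phase(pop_size=30, gens=40, sigma=0.5, seed=1):
    # Toy evolution strategy with elite keeping: coarse global exploration
    rng = random.Random(seed)
    pop = [rng.uniform(-5.0, 5.0) for _ in range(pop_size)]
    for _ in range(gens):
        elites = sorted(pop, key=f)[:5]                     # keep the best five
        pop = elites + [e + rng.gauss(0.0, sigma)
                        for e in elites for _ in range(5)]  # 5 offspring per elite
    return min(pop, key=f)

def local_polish(x, step=0.1, tol=1e-8):
    # Derivative-free pattern search standing in for SQP: refine the ES solution
    while step > tol:
        if f(x + step) < f(x):
            x += step
        elif f(x - step) < f(x):
            x -= step
        else:
            step *= 0.5
    return x

x_es = es_phase()
x_star = local_polish(x_es)   # never worse than the ES solution it starts from
```

The split reflects the abstract's point: the evolutionary phase escapes local minima of the multimodal landscape, and the local solver then supplies the precision the population search lacks.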
Curiosity-Driven Optimization
In IEEE Congress on Evolutionary Computation (CEC), 2011
"... The principle of artificial curiosity directs active exploration towards the most informative or most interesting data. We show its usefulness for global black box optimization when data point evaluations are expensive. Gaussian process regression is used to model the fitness function based on all ..."
Abstract

Cited by 5 (2 self)
 Add to MetaCart
(Show Context)
The principle of artificial curiosity directs active exploration towards the most informative or most interesting data. We show its usefulness for global black-box optimization when data point evaluations are expensive. Gaussian process regression is used to model the fitness function based on all observations available so far. For each candidate point this model estimates the expected fitness reduction and yields a novel closed-form expression for the expected information gain. A new type of Pareto-front algorithm continually pushes the boundary of candidates not dominated, according to both criteria, by any other known data point, using multi-objective evolutionary search. This makes the exploration-exploitation trade-off explicit and permits maximally informed data selection. We illustrate the robustness of our approach in a number of experimental scenarios.
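The Pareto-front step described above, keeping every candidate not dominated on both criteria (expected fitness reduction and expected information gain), can be sketched with a straightforward non-domination filter; the candidate scores below are made-up illustrations, not values from the paper.

```python
def dominates(a, b):
    # a dominates b if a is at least as good on every criterion and better on one
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(points):
    # Keep the candidates not dominated by any other known point
    return [p for i, p in enumerate(points)
            if not any(dominates(q, p) for j, q in enumerate(points) if j != i)]

# Made-up (expected fitness reduction, expected information gain) scores
cands = [(0.9, 0.1), (0.5, 0.5), (0.1, 0.9), (0.4, 0.4), (0.2, 0.2)]
front = pareto_front(cands)
```

Points strong on either criterion survive, so the front explicitly spans the exploitation end (high fitness reduction) and the exploration end (high information gain), which is the trade-off the abstract says it makes explicit.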
Clustered Multiple Generalized Expected Improvement: A Novel Infill Sampling Criterion for Surrogate Models
In Congress on Evolutionary Computation, 2008
"... Abstract — Surrogate modelbased optimization is a wellknown technique for optimizing expensive blackbox functions. By applying this function approximation, the number of real problem evaluations can be reduced because the optimization is performed on the model. In this case two contradictory targe ..."
Abstract

Cited by 4 (0 self)
 Add to MetaCart
(Show Context)
Surrogate model-based optimization is a well-known technique for optimizing expensive black-box functions. By applying this function approximation, the number of real problem evaluations can be reduced, because the optimization is performed on the model. In this case two contradictory targets have to be achieved: increasing global model accuracy and exploiting potentially optimal areas. The key to these targets is the criterion for selecting the next point to be evaluated on the expensive black-box function, the 'infill sampling criterion'. Therefore a novel approach, the 'Clustered Multiple Generalized Expected Improvement' (CMGEI), is introduced and motivated by an empirical study. Furthermore, experiments benchmarking its performance against the state of the art are presented.
Study of Some Strategies for Global Optimization Using Gaussian Process Models with Application to Aerodynamic Design
Cited by 4 (0 self)