Results 1–10 of 36
Evolutionary optimization in uncertain environments – a survey
 IEEE Transactions on Evolutionary Computation
, 2005
Abstract

Cited by 226 (17 self)
Evolutionary algorithms often have to solve optimization problems in the presence of a wide range of uncertainties. Generally, uncertainties in evolutionary computation can be divided into four categories. First, the fitness function is noisy. Second, the design variables and/or the environmental parameters may change after optimization, and the quality of the obtained optimal solution should be robust against environmental changes or deviations from the optimal point. Third, the fitness function is approximated, which means that the fitness function suffers from approximation errors. Fourth, the optimum of the problem to be solved changes over time and, thus, the optimizer should be able to track the optimum continuously. In all these cases, additional measures must be taken so that evolutionary algorithms can still work satisfactorily. This paper attempts to provide a comprehensive overview, within a unified framework, of the related work, which has been scattered across a variety of research areas. Existing approaches to addressing the different uncertainties are presented and discussed, the relationships between the different categories of uncertainty are investigated, and topics for future research are suggested.
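The first category, noisy fitness, is commonly handled by resampling. A minimal Python sketch of the idea (the sphere test function, noise level, and sample counts are illustrative assumptions, not taken from the survey):

```python
import random

def noisy_sphere(x, sigma=0.1, rng=random):
    """Sphere function with additive Gaussian fitness noise (category 1)."""
    return sum(xi * xi for xi in x) + rng.gauss(0.0, sigma)

def averaged_fitness(x, samples=20, sigma=0.1, rng=random):
    """Explicit averaging: repeating the evaluation `samples` times
    shrinks the noise standard deviation by a factor sqrt(samples)."""
    return sum(noisy_sphere(x, sigma, rng) for _ in range(samples)) / samples
```

The trade-off the survey discusses is that each averaged evaluation costs `samples` true function evaluations.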
A Method for Handling Uncertainty in Evolutionary Optimization with an Application to Feedback Control of Combustion
Abstract

Cited by 50 (14 self)
Abstract — We present a novel method for handling uncertainty in evolutionary optimization. The method entails quantification and treatment of uncertainty and relies on the rank-based selection operator of evolutionary algorithms. The proposed uncertainty handling is implemented in the context of the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) and verified on test functions. The method is independent of the uncertainty distribution, prevents premature convergence of the evolution strategy, and is well suited for online optimization, as it requires only a small number of additional function evaluations. The algorithm is applied in an experimental setup to the online optimization of feedback controllers of thermoacoustic instabilities of gas turbine combustors. In order to mitigate these instabilities, gain-delay or model-based H∞ controllers sense the pressure and command secondary fuel injectors. The parameters of these controllers are usually specified via a trial-and-error procedure. We demonstrate that their online optimization with the proposed methodology enhances, in an automated fashion, the online performance of the controllers, even under highly unsteady operating conditions, and also compensates for uncertainties in the model-building and design process.
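The rank-based idea behind this kind of uncertainty quantification can be illustrated with a toy sketch: re-evaluate the population and measure how often ranks change. The actual procedure in the paper is more involved; all names and parameters below are illustrative:

```python
def rank_change_fraction(population, fitness):
    """Evaluate every candidate twice and report the fraction whose
    rank differs between the two rankings. Frequent rank changes
    signal that noise is strong relative to fitness differences,
    which is when selection decisions become unreliable."""
    first = [fitness(x) for x in population]
    second = [fitness(x) for x in population]
    order1 = sorted(range(len(population)), key=lambda i: first[i])
    order2 = sorted(range(len(population)), key=lambda i: second[i])
    rank1 = {i: r for r, i in enumerate(order1)}
    rank2 = {i: r for r, i in enumerate(order2)}
    changed = sum(1 for i in rank1 if rank1[i] != rank2[i])
    return changed / len(population)
```

A deterministic fitness yields 0.0; a strongly noisy one yields a value near 1.0, which an uncertainty-handling strategy could use to trigger extra evaluations or step-size adjustments.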
How To Analyse Evolutionary Algorithms
, 2002
Abstract

Cited by 31 (1 self)
Many variants of evolutionary algorithms have been designed and applied. The
A Simple Multimembered Evolution Strategy to Solve Constrained Optimization Problems
 IEEE Transactions on Evolutionary Computation
, 2003
Abstract

Cited by 30 (5 self)
This paper presents a simple multimembered evolution strategy (SMES) to solve global nonlinear optimization problems. The approach does not require the use of a penalty function, and it does not require any extra parameters (besides those used with an evolution strategy). Instead, it uses a simple diversity mechanism based on allowing infeasible solutions to remain in the population. This technique helps the algorithm to find the global optimum even though it reaches the feasible region of the search space reasonably fast. Some simple selection criteria are used to guide the process toward the feasible region of the search space. Also, the initial step size of the evolution strategy is reduced in order to perform a finer search, and a combined (discrete/intermediate) recombination technique improves its exploitation capabilities. The approach was tested on a well-known benchmark. The results obtained are very competitive when the proposed approach is compared against other state-of-the-art techniques, and its computational cost (measured by the number of fitness function evaluations) is lower than that of the other techniques compared.
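Penalty-free selection criteria of the kind the abstract mentions are often expressed as feasibility rules. A hedged sketch in that spirit (the exact SMES criteria differ in detail; this is not the paper's algorithm):

```python
def select_better(a, b):
    """Feasibility-based comparison of (fitness, violation) pairs for a
    minimization problem: a feasible solution (violation == 0) beats an
    infeasible one; two feasible solutions compare by fitness; two
    infeasible solutions compare by total constraint violation."""
    (fa, va), (fb, vb) = a, b
    if va == 0.0 and vb == 0.0:
        return a if fa <= fb else b
    if va == 0.0:
        return a
    if vb == 0.0:
        return b
    return a if va <= vb else b
```

SMES deliberately relaxes such rules so that some infeasible solutions survive, which preserves diversity near the feasibility boundary.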
Efficient Search for Robust Solutions by Means of Evolutionary Algorithm and Fitness Approximation
 IEEE Transactions on Evolutionary Computation
, 2006
Abstract

Cited by 17 (2 self)
Abstract—For many real-world optimization problems, the robustness of a solution is of great importance in addition to the solution’s quality. By robustness, we mean that small deviations from the original design, e.g., due to manufacturing tolerances, should be tolerated without a severe loss of quality. One way to achieve that goal is to evaluate each solution under a number of different scenarios and use the average solution quality as fitness. However, this approach is often impractical, because the cost of evaluating each individual several times is unacceptable. In this paper, we present a new and efficient approach to estimating a solution’s expected quality and variance. We propose to construct local approximate models of the fitness function and then use these approximate models to estimate the expected fitness and variance. Based on a variety of test functions, we demonstrate empirically that our approach significantly outperforms the implicit averaging approach, as well as the explicit averaging approaches using existing estimation techniques reported in the literature. Index Terms—Evolutionary optimization, fitness approximation, robustness, uncertainty.
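The explicit-averaging baseline that the paper improves on can be sketched directly: average the fitness over randomly perturbed copies of the design. All parameter values below are illustrative assumptions; the paper's contribution is to replace these extra evaluations with local approximate models:

```python
import random

def robust_fitness(f, x, tol=0.1, samples=50, rng=random):
    """Explicit averaging: estimate the expected fitness of design x
    under uniform perturbations within +/- tol (e.g., manufacturing
    tolerances) by direct Monte Carlo sampling."""
    total = 0.0
    for _ in range(samples):
        perturbed = [xi + rng.uniform(-tol, tol) for xi in x]
        total += f(perturbed)
    return total / samples
```

The cost is `samples` true evaluations per individual, which is exactly the expense the approximate-model approach avoids.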
Evolutionary gradient search revisited
 IEEE Transactions on Evolutionary Computation 11(4):480–495
, 2007
Abstract

Cited by 16 (3 self)
Abstract—Evolutionary gradient search (EGS) is an approach to optimization that combines features of gradient strategies with ideas from evolutionary computation. Recently, several modifications to the algorithm have been proposed with the goal of improving its robustness in the presence of noise and its suitability for implementation on parallel computers. In this paper, the value of the proposed modifications is studied analytically. A scaling law is derived that describes the performance of the algorithm on the noisy sphere model and allows comparing it with competing strategies. The comparisons yield insights into the interplay of mutation, multi-recombination, and selection. Then, the covariance matrix adaptation mechanism originally formulated for evolution strategies is adapted for use with EGS in order to make the algorithm competitive on objective functions whose Hessians have large condition numbers. The resulting strategy is evaluated experimentally on a number of convex quadratic test functions. Index Terms—Covariance matrix adaptation (CMA), evolution strategies, evolutionary gradient search (EGS), noise, quality gain analysis.
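The core EGS idea, estimating a descent direction from random mutation vectors, can be sketched as follows. The fixed step size and the normalization here are illustrative simplifications; the actual algorithm adapts them:

```python
import random

def egs_step(f, x, step=0.1, trials=10, rng=random):
    """One evolutionary-gradient-search step: probe the objective with
    random mutation vectors, accumulate a gradient estimate from
    central differences, then move a fixed step along the (negated,
    normalized) estimate. Minimizes f."""
    n = len(x)
    g = [0.0] * n
    for _ in range(trials):
        z = [rng.gauss(0.0, 1.0) for _ in range(n)]
        plus = f([xi + step * zi for xi, zi in zip(x, z)])
        minus = f([xi - step * zi for xi, zi in zip(x, z)])
        for j in range(n):
            g[j] += (plus - minus) * z[j]
    norm = sum(gj * gj for gj in g) ** 0.5 or 1.0
    return [xi - step * gj / norm for xi, gj in zip(x, g)]
```

On a noisy objective the difference `plus - minus` is itself noisy, which is exactly the regime the paper's scaling-law analysis addresses.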
Selection in the presence of noise
 In: Proceedings of the Genetic and Evolutionary Computation Conference (GECCO), pp. 766–777
, 2003
Abstract

Cited by 10 (1 self)
For noisy optimization problems, there is generally a trade-off between the effort spent to reduce the noise (in order to allow the optimization algorithm to run properly) and the number of solutions evaluated during optimization. However, for stochastic search algorithms like evolutionary optimization, noise is not always a bad thing. On the contrary, in many cases noise has an effect very similar to that of the randomness which is deliberately introduced, e.g., during selection. Using the example of stochastic tournament selection, we show that the noise inherent in the optimization problem should be taken into account by the selection operator, and that one should not reduce noise further than necessary.
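A minimal sketch of stochastic tournament selection, the operator the abstract uses as its example (the probability value and fitness model are illustrative assumptions):

```python
import random

def stochastic_tournament(pop, noisy_fitness, p_best=0.8, rng=random):
    """Stochastic binary tournament for minimization: the apparently
    better of two random candidates wins only with probability p_best.
    Evaluation noise and this deliberate selection randomness play the
    same role, so p_best can be raised when evaluations are already
    noisy instead of spending effort on re-sampling to remove noise."""
    a, b = rng.sample(pop, 2)
    winner, loser = (a, b) if noisy_fitness(a) <= noisy_fitness(b) else (b, a)
    return winner if rng.random() < p_best else loser
```

With `p_best = 1.0` this degenerates to deterministic binary tournament selection; lowering `p_best` weakens selection pressure in the same way additional fitness noise would.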
On multiplicative noise models for stochastic search
 Proceedings of Parallel Problem Solving from Nature (PPSN X), volume 5199 of Lecture Notes in Computer Science
, 2008
Abstract

Cited by 8 (0 self)
Abstract. In this paper we investigate multiplicative noise models in the context of continuous optimization. We illustrate how some intrinsic properties of the noise model imply the failure of reasonable search algorithms at locating the optimum of the noiseless part of the objective function. Those findings are rigorously investigated for the (1+1)-ES minimizing the noisy sphere function. Assuming a lower bound on the support of the noise distribution, we prove that the (1+1)-ES diverges when the lower bound allows negative fitness values to be sampled with positive probability, and converges in the opposite case. We discuss the practical applications and non-applications of these results and explain the differences from previous results obtained in the limit of infinite search-space dimensionality.
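The strategy analyzed here is the classic (1+1)-ES. A minimal noiseless sketch (the step-size constants are illustrative; the paper's analysis concerns this strategy applied to a multiplicatively noisy objective such as f(x) = ||x||²(1 + ξ)):

```python
import random

def one_plus_one_es(f, x, sigma=0.3, iters=300, rng=random):
    """Minimal (1+1)-ES for minimization with a 1/5th-success-rule
    flavour of step-size control: expand sigma on success, shrink it
    on failure (the chosen factors balance at a ~1/5 success rate)."""
    fx = f(x)
    for _ in range(iters):
        y = [xi + sigma * rng.gauss(0.0, 1.0) for xi in x]
        fy = f(y)
        if fy <= fx:
            x, fx = y, fy
            sigma *= 1.22
        else:
            sigma *= 0.95
    return x
```

The paper's point is that this elitist accept-if-better rule, sound for the noiseless sphere, can be driven to divergence when multiplicative noise can produce spuriously low (negative) fitness values.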
Uncertainty handling in model selection for support vector machines
 In G. Rudolph (Ed.), LNCS
, 2008
Abstract

Cited by 5 (4 self)
Abstract. We consider evolutionary model selection for support vector machines. Holdout-set-based objective functions are natural model selection criteria, and we introduce a symmetrization of the standard cross-validation approach. We propose the covariance matrix adaptation evolution strategy (CMA-ES) with uncertainty handling for optimizing the new randomized objective function. Our results show that this search strategy avoids premature convergence and results in improved classification accuracy compared to strategies without uncertainty handling.
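Why a holdout-set-based criterion is a *randomized* objective can be shown with a small sketch: each call draws a fresh split, so the same hyperparameters receive a noisy score. The function names and split fraction below are illustrative, not from the paper:

```python
import random

def holdout_objective(error_on_split, data, frac=0.2, rng=random):
    """Randomized holdout objective: draw a fresh train/holdout split
    on every call and score it. Re-evaluating identical hyperparameters
    yields different values, so the optimizer faces evaluation noise."""
    data = list(data)
    rng.shuffle(data)
    cut = int(len(data) * frac)
    holdout, train = data[:cut], data[cut:]
    return error_on_split(train, holdout)
```

This evaluation noise is exactly what the uncertainty-handling CMA-ES variant is meant to cope with without resorting to many repeated evaluations.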
The Steady State Behavior of (µ/µI, λ)-ES on Ellipsoidal Fitness Models Disturbed by Noise
 GECCO 2003: Proceedings of the Genetic and Evolutionary Computation Conference
, 2003
Abstract

Cited by 5 (3 self)
Abstract. The method of differential geometry is applied to derive steady-state conditions for the (µ/µI, λ)-ES on the general quadratic test function disturbed by fitness noise of constant strength. A new approach for estimating the expected final fitness deviation observed under such conditions is presented. The theoretical results obtained are compared with real ES runs, showing surprisingly good agreement.