Results 1–9 of 9
A Method for Handling Uncertainty in Evolutionary Optimization with an Application to Feedback Control of Combustion
Cited by 50 (14 self)
Abstract — We present a novel method for handling uncertainty in evolutionary optimization. The method entails quantification and treatment of uncertainty and relies on the rank-based selection operator of evolutionary algorithms. The proposed uncertainty handling is implemented in the context of the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) and verified on test functions. The present method is independent of the uncertainty distribution, prevents premature convergence of the evolution strategy, and is well suited for online optimization, as it requires only a small number of additional function evaluations. The algorithm is applied in an experimental setup to the online optimization of feedback controllers of thermoacoustic instabilities of gas turbine combustors. In order to mitigate these instabilities, gain-delay or model-based H∞ controllers sense the pressure and command secondary fuel injectors. The parameters of these controllers are usually specified via a trial-and-error procedure. We demonstrate that their online optimization with the proposed methodology enhances, in an automated fashion, the online performance of the controllers, even under highly unsteady operating conditions, and that it also compensates for uncertainties in the model-building and design process.
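The rank-based uncertainty quantification the abstract alludes to can be sketched as follows. This is a simplified illustration, not the authors' exact algorithm: the function name, the combined ranking, and the averaging are assumptions. The idea is to evaluate each candidate twice and measure how much the ranking shifts between the two evaluation sets; a stable ranking means the noise is harmless to rank-based selection.

```python
def uncertainty_level(f_first, f_second):
    """Sketch of rank-change-based uncertainty measurement (assumed simplification).

    f_first, f_second: fitness values from two independent evaluations of the
    same candidate solutions. All 2n values are ranked together; the average
    absolute rank change of each candidate between its two evaluations (minus
    the unavoidable offset of 1) serves as the uncertainty level. A value of 0
    means the ranking is unaffected by the noise.
    """
    combined = list(f_first) + list(f_second)
    order = sorted(range(len(combined)), key=lambda i: combined[i])
    pos = {idx: rank for rank, idx in enumerate(order)}
    n = len(f_first)
    # rank distance between the two evaluations of candidate i
    changes = [abs(pos[i] - pos[i + n]) - 1 for i in range(n)]
    return sum(changes) / n

# Quiet noise: ranking unchanged, uncertainty level 0.
print(uncertainty_level([1.0, 2.0, 3.0], [1.1, 2.1, 3.1]))  # -> 0.0
# Strong noise: re-evaluation reshuffles the ranking, level > 0.
print(uncertainty_level([1.0, 2.0, 3.0], [3.5, 0.5, 2.5]))
```

In the published method this level is compared against a threshold, and the treatment (e.g. increasing the mutation strength) is triggered only when the ranking is demonstrably unreliable.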
Noisy optimization with evolution strategies
 SIAM Journal on Optimization
Cited by 39 (6 self)
Evolution strategies are general, nature-inspired heuristics for search and optimization. Supported both by empirical evidence and by recent theoretical findings, there is a common belief that evolution strategies are robust and reliable, and frequently they are the method of choice if neither derivatives of the objective function are at hand nor differentiability and numerical accuracy can be assumed. However, despite their widespread use, there is little exchange between members of the “classical” optimization community and people working in the field of evolutionary computation. It is our belief that both sides would benefit from such an exchange. In this paper, we present a brief outline of evolution strategies and discuss some of their properties in the presence of noise. We then empirically demonstrate that for a simple but nonetheless nontrivial noisy objective function, an evolution strategy outperforms other optimization algorithms designed to be able to cope with noise. The environment in which the algorithms are tested is deliberately chosen to afford a transparency of the results that reveals the strengths and shortcomings of the strategies, making it possible to draw conclusions with regard to the design of better optimization algorithms for noisy environments.
On the Utility of Populations
, 2000
Cited by 8 (1 self)
Evolutionary algorithms (EAs) are population-based search heuristics often used for function optimization. Typically they use selection, mutation, and crossover as search operators. On many test functions, EAs are outperformed by simple hill-climbers. Therefore, it is investigated whether the use of a population and crossover is at all advantageous. In this paper it is rigorously proven that the use of a population instead of just a single individual can be an advantage of its own, even without making use of crossover. This establishes the advantage of EAs compared to (random) hill-climbers on appropriate objective functions.
On the Utility of Populations in Evolutionary Algorithms
 Proceedings of the Genetic and Evolutionary Computation Conference (GECCO) 2001
, 2001
Cited by 8 (3 self)
Evolutionary algorithms (EAs) are population-based search heuristics often used for function optimization. Typically they employ selection, crossover, and mutation as search operators. It is known that EAs are outperformed by simple hill-climbers in some cases. Thus, it may be asked whether the use of a population and crossover is at all advantageous. In this paper it is rigorously proven that the use of a population instead of just a single individual can be an advantage of its own, even without making use of crossover. This establishes by example the advantage of EAs compared to (random) hill-climbers on appropriate objective functions. Moreover, we describe one particular situation where intuitively a population should outperform a single individual and present a formal proof justifying this intuition.
Evolutionary optimization of feedback controllers for thermoacoustic instabilities
 IUTAM Symposium on Flow Control and MEMS, Proceedings of the IUTAM Symposium held at the Royal Geographical Society, 19–22 September 2006, hosted by Imperial
, 2008
Cited by 5 (1 self)
Abstract. We present the system identification and the online optimization of feedback controllers applied to combustion systems using evolutionary algorithms. The algorithm is applied to gas turbine combustors that are susceptible to thermoacoustic instabilities resulting in imperfect combustion and decreased lifetime. In order to mitigate these pressure oscillations, feedback controllers sense the pressure and command secondary fuel injectors. The controllers are optimized online with an extension of the CMA evolution strategy capable of handling noise associated with the uncertainties in the pressure measurements. The presented method is independent of the specific noise distribution and prevents premature convergence of the evolution strategy. The proposed algorithm needs only two additional function evaluations per generation and is therefore particularly suitable for online optimization. The algorithm is experimentally verified on a gas turbine combustor test rig. The results show that the algorithm can improve the performance of controllers online and is able to cope with a variety of time-dependent operating conditions.
Evolutionary Optimization with Cumulative Step Length Adaptation: A Performance Analysis
Cited by 1 (1 self)
Iterative algorithms for numerical optimization in continuous spaces typically need to adapt their step lengths in the course of the search. While some strategies employ fixed schedules for reducing the step lengths over time, others attempt to adapt interactively in response to either the outcome of trial steps or to the history of the search process. Evolutionary algorithms are of the latter kind. One of the control strategies that is commonly used in evolution strategies is the cumulative step length adaptation approach. This paper presents a first theoretical analysis of that adaptation strategy by considering the algorithm as a dynamical system. The analysis includes the practically relevant case of noise interfering in the optimization process. Recommendations are made with respect to the problem of choosing appropriate population sizes.
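The cumulative step length adaptation (CSA) mechanism analysed in that paper can be sketched in a minimal (1, λ)-ES. This is an illustrative simplification, not the exact strategy from the analysis: the cumulation constant `c`, the damping `d`, and the other parameter choices are assumptions picked in a conventional range.

```python
import math
import random

def csa_es(f, x0, sigma0=1.0, lam=10, iters=200, seed=1):
    """Minimal (1, lambda)-ES with cumulative step length adaptation (sketch)."""
    rng = random.Random(seed)
    n = len(x0)
    x, sigma = list(x0), sigma0
    c = 1.0 / math.sqrt(n + 1)   # cumulation constant (assumed choice)
    d = math.sqrt(n)             # step-size damping (assumed choice)
    # expected length of an n-dimensional standard normal vector
    chi_n = math.sqrt(n) * (1 - 1.0 / (4 * n) + 1.0 / (21 * n * n))
    s = [0.0] * n                # accumulated search path
    for _ in range(iters):
        # sample lambda offspring, keep the best (comma selection)
        best_z, best_f = None, float("inf")
        for _ in range(lam):
            z = [rng.gauss(0.0, 1.0) for _ in range(n)]
            fy = f([xi + sigma * zi for xi, zi in zip(x, z)])
            if fy < best_f:
                best_f, best_z = fy, z
        x = [xi + sigma * zi for xi, zi in zip(x, best_z)]
        # accumulate the selected mutation vector in the search path
        s = [(1 - c) * si + math.sqrt(c * (2 - c)) * zi
             for si, zi in zip(s, best_z)]
        norm_s = math.sqrt(sum(si * si for si in s))
        # long path -> steps correlated -> grow sigma; short path -> shrink it
        sigma *= math.exp((c / d) * (norm_s / chi_n - 1.0))
    return x, sigma

sphere = lambda v: sum(t * t for t in v)
x, sigma = csa_es(sphere, [3.0] * 5)
print(sphere(x))  # far below the initial value of 45.0
```

The key design point is that sigma is controlled by the length of the accumulated path rather than by single-step success: consistently parallel steps stretch the path beyond its random-walk expectation and trigger an increase, while backtracking steps shorten it and trigger a decrease.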
On the Benefits of Distributed Populations for Noisy Optimization
While in the absence of noise no improvement in local performance can be gained from retaining but the best candidate solution found so far, it has been shown experimentally that in the presence of noise, operating with a non-singular population of candidate solutions can have a marked and positive effect on the local performance of evolution strategies. So as to determine the reasons for the improved performance, we study the evolutionary dynamics of the (µ/µ, λ)-ES in the presence of noise. Considering a simple, idealized environment, a moment-based approach that utilizes recent results involving concomitants of selected order statistics is developed. This approach yields an intuitive explanation for the performance advantage of multi-parent strategies in the presence of noise. It is then shown that the idealized dynamic process considered does bear relevance to optimization problems in high-dimensional search spaces.
Investigation of the (µ, λ)-ES in the Presence of Noise
In this paper, we attempt to shed some light on the reasons for the potential performance improvement. In particular, we derive a progress law for the (µ, λ)-ES on a noisy linear fitness function and both numerically and empirically study its implications. We then discuss the significance of the progress coefficients that have been obtained on the linear function for the quadratic sphere. Comparisons of the local performance of the (µ, λ)-ES, the (µ/µ, λ)-ES, and the (1, λ)-ES are presented.
Errata/Addenda for A Method for Handling Uncertainty in Evolutionary Optimization With an Application to Feedback Control of Combustion
, 2010
The formula under point 3) must have read γi ← γi × 1.1^min(1, µeff/(10n)), where the max is replaced by a min. The same error was found in the most recent implementation.
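Read with the exponent restored (the flattened rendering above drops it), the corrected update is a one-liner. The variable names follow the erratum; the function name and the base α = 1.1 as a parameter are assumptions for illustration.

```python
def gamma_update(gamma, mu_eff, n, alpha=1.1):
    """Corrected uncertainty-treatment update from the erratum (min, not max).

    gamma_i <- gamma_i * alpha ** min(1, mu_eff / (10 * n))
    With the erroneous max, the exponent is never below 1, so gamma could
    only grow at the full rate; the min caps the growth when mu_eff is
    small relative to the dimension n.
    """
    return gamma * alpha ** min(1.0, mu_eff / (10.0 * n))

# small mu_eff relative to n: exponent is mu_eff/(10n) = 0.05
print(gamma_update(1.0, 5.0, 10))   # 1.1 ** 0.05, barely above 1
# large mu_eff: exponent saturates at 1
print(gamma_update(1.0, 200.0, 10))  # 1.1
```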