## Parameter control in evolutionary algorithms


Venue: IEEE Transactions on Evolutionary Computation

Citations: 256 (31 self)

### BibTeX

```
@ARTICLE{Eiben_parametercontrol,
  author  = {A. E. Eiben and Z. Michalewicz and M. Schoenauer and J. E. Smith},
  title   = {Parameter control in evolutionary algorithms},
  journal = {IEEE Transactions on Evolutionary Computation},
  year    = {}
}
```

### Abstract

The issue of setting the values of the various parameters of an evolutionary algorithm is crucial for good performance. In this paper we discuss how to do this, beginning with the question of whether these values are best set in advance or best changed during evolution. We provide a classification of different approaches based on a number of complementary features, and pay special attention to setting parameters on the fly, which has the potential of adjusting the algorithm to the problem while solving the problem. This paper is intended as a survey rather than a set of prescriptive details for implementing an EA for a particular type of problem. For this reason we have chosen to interleave a number of examples throughout the text, both to clarify the points we raise as we present them and to give the reader a feel for the many possibilities available for controlling different parameters.
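The on-the-fly control the abstract refers to can be illustrated with a classic example, Rechenberg's 1/5 success rule for adapting an evolution strategy's mutation step size (a minimal sketch; the function and parameter names below are ours, not from the paper):

```python
def one_fifth_rule(sigma, successes, trials, c=0.9):
    """Rechenberg's 1/5 success rule: if more than 1/5 of recent
    mutations improved fitness, enlarge the step size sigma;
    if fewer did, shrink it; otherwise leave it unchanged."""
    success_rate = successes / trials
    if success_rate > 0.2:
        return sigma / c      # exploration pays off: take bigger steps
    if success_rate < 0.2:
        return sigma * c      # too many failed steps: take smaller steps
    return sigma

# 2 successes out of 20 trials (rate 0.1): the step size shrinks
sigma = one_fifth_rule(1.0, 2, 20)   # -> 0.9
```

The control here is adaptive in the paper's taxonomy: the parameter is updated from feedback observed during the run rather than by a fixed schedule.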

### Citations

8169 | Genetic Algorithms - Goldberg - 1989
Citation Context: ...reward is a shift up in probability at the cost of other operators [17]. This, actually, is very close in spirit to the “implicit bucket brigade” credit assignment principle used in classifier systems [33]. The GA using this method applies several crossover operators simultaneously within the same generation, each having its own crossover rate pc(opi). Additionally, each operator has its “local delta” ...

1292 | Handbook of genetic algorithms - Davis - 1991
Citation Context: ...er offspring. This reward is diminishingly propagated back to operators of a few generations back, who helped setting it all up; the reward is a shift up in probability at the cost of other operators [17]. This, actually, is very close in spirit to the “implicit bucket brigade” credit assignment principle used in classifier systems [33]. The GA using this method applies several crossover operators sim...
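The mechanism this snippet describes, shifting probability toward operators whose offspring earn rewards, can be sketched as follows (a simplified toy version, not Davis's exact bookkeeping; all names are ours):

```python
def reward_operator(op_probs, winner, delta=0.02, floor=0.01):
    """Shift probability mass toward the operator that produced an
    improving offspring, then renormalise; a floor keeps every
    operator alive so it can be rewarded again later."""
    probs = {op: max(p, floor) for op, p in op_probs.items()}
    probs[winner] += delta
    total = sum(probs.values())
    return {op: p / total for op, p in probs.items()}

op_probs = {"one_point": 0.5, "uniform": 0.5}
op_probs = reward_operator(op_probs, "uniform")
# "uniform" now has a higher rate than "one_point"; rates still sum to 1
```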

920 | Evolutionary Algorithms in Theory and Practice - Back - 1996
Citation Context: ...This is what has often been stated as “mutation parameters are optimized for free” by the evolution itself. And indeed, SA-ES have long been the state-of-the-art in parametric optimization [9]. But what are “good” mutation parameters? The issue has already been discussed for the step-size in the previous section, and similar arguments can be given for the covariance matrix itself. Replace the ...
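The self-adaptation this snippet refers to, where mutation parameters evolve alongside the solution, is commonly realised with a lognormal update of a per-individual step size. A standard textbook sketch (not necessarily the exact scheme of [9]):

```python
import math
import random

def self_adaptive_mutate(x, sigma):
    """SA-ES style mutation: first perturb the strategy parameter
    sigma lognormally, then mutate the object variables with the
    NEW sigma, so good step sizes hitch-hike with good solutions."""
    tau = 1.0 / math.sqrt(len(x))          # common learning-rate choice
    new_sigma = sigma * math.exp(tau * random.gauss(0.0, 1.0))
    new_x = [xi + new_sigma * random.gauss(0.0, 1.0) for xi in x]
    return new_x, new_sigma

random.seed(42)
child, child_sigma = self_adaptive_mutate([0.0, 0.0, 0.0], sigma=0.5)
```

Selection then does the rest: individuals surviving with useful step sizes propagate those step sizes, which is why the snippet says the mutation parameters are "optimized for free".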

916 | An analysis of the behavior of a class of genetic adaptive systems - Jong - 1975

431 | Handbook of Evolutionary Computation - Bäck, Fogel, et al. - 1997
Citation Context: ...formance over a wide range of problems [33, page 6]. The contemporary view on EAs, however, acknowledges that specific problems (problem types) require specific EA setups for satisfactory performance [12]. Thus, the scope of “optimal” parameter settings is necessarily narrow. There are also theoretical arguments that any quest for generally good EA, thus generally good parameter settings, is lost a pr...

423 | Optimisation Of Control Parameters For Genetic Algorithm - Grefenstette - 1986
Citation Context: ...of [18], determining recommended values for the probabilities of single-point crossover and bit mutation on what is now called the DeJong test suite of five functions. About this and similar attempts [34, 62], it should be noted that genetic algorithms used to be seen as robust problem solvers that exhibit approximately the same performance over a wide range of problems [33, page 6]. The contemporary view...

414 | Simulated Annealing and Boltzmann Machines - Aarts, Korst - 1989
Citation Context: ...variable selection pressure in the survivor selection (replacement) step by simulated annealing (SA). SA is a generate-and-test search technique based on a physical, rather than a biological analogy [1]. Formally, however, SA can be envisioned as an evolutionary process with population size of 1, undefined (problem-dependent) representation and mutation, and a specific survivor selection mechanism. ...

411 | Introduction to Evolutionary Computing - Eiben, Smith - 2003
Citation Context: ...me of the many possibilities available for controlling different parameters. 1 Introduction Finding the appropriate setup for an evolutionary algorithm is a long-standing grand challenge of the field [21, 25]. The main problem is that the description of a specific EA contains its components, such as the choice of representation, selection, recombination, and mutation operators, thereby setting a framework ...

358 | Completely derandomized self-adaptation in evolution strategies - Hansen, Ostermeier
Citation Context: ...[38], and later addressed the adaptation of the full covariance matrix [36]. The complete Covariance Matrix Adaptation (CMA-ES) algorithm was finally detailed (and its parameters carefully tuned) in [37] and an improvement for the update of the covariance matrix was proposed in [35]. The basic idea in CMA-ES is to use the path followed by the algorithm to deterministically update the different mutati...

286 | Genetic Programming: An Introduction - Banzhaf, Nordin, et al. - 1998
Citation Context: ...read out over the search space has a greater chance of finding an optimal solution than one that is concentrated in a small area of the search space. The significance of this concern is recognised in [14, 19, 87]. A method that dynamically increases the diversity of a DyFor GP population is proposed to accomplish these objectives. The dynamic increase of diversity (DIOD) method increases diversity by building...

197 | An analysis of the behaviour of a class of genetic adaptive systems (doctoral thesis) - Jong - 1975
Citation Context: ...istory of EAs considerable effort has been spent on finding parameter values (for a given type of EA, such as GAs) that were good for a number of test problems. A well-known early example is that of [18], determining recommended values for the probabilities of single-point crossover and bit mutation on what is now called the DeJong test suite of five functions. About this and similar attempts [34, 62...

188 | Adapting operator probabilities in genetic algorithms - Davis - 1989
Citation Context: ...Additionally, it is intuitively obvious, and has been empirically and theoretically demonstrated, that different values of parameters might be optimal at different stages of the evolutionary process [6, 7, 8, 16, 39, 45, 63, 66, 68, 72, 73, 78, 79]. To give an example, large mutation steps can be good in the early generations, helping the exploration of the search space, and small mutation steps might be needed in the late generations to help f...
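The "large steps early, small steps late" observation in this snippet is exactly what deterministic parameter control exploits. The simplest hypothetical instance is a time-decaying step size (names and constants are ours):

```python
def mutation_step(t, t_max, sigma_start=1.0, sigma_end=0.01):
    """Deterministic parameter control: linearly decay the mutation
    step size from sigma_start to sigma_end over the run, favouring
    broad exploration early and fine-tuning late."""
    frac = t / t_max
    return (1.0 - frac) * sigma_start + frac * sigma_end

# generation 0 explores broadly; the final generation fine-tunes
early = mutation_step(0, 100)     # -> 1.0
late = mutation_step(100, 100)    # -> 0.01
```

Such a schedule needs no feedback from the search, which is both its strength (simplicity) and its weakness: it cannot react to how the run is actually progressing.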

171 | Adapting arbitrary normal mutation distributions in evolution strategies: the covariance matrix adaptation - Hansen, Ostermeier - 1996
Citation Context: ...parameters in ES, hence heading back to an adaptive method for parameter tuning. Their method was first limited to the step-size [38], and later addressed the adaptation of the full covariance matrix [36]. The complete Covariance Matrix Adaptation (CMA-ES) algorithm was finally detailed (and its parameters carefully tuned) in [37] and an improvement for the update of the covariance matrix was proposed...

121 | Optimal Mutation Rates in Genetic Search - Back - 1993
Citation Context: ...Additionally, it is intuitively obvious, and has been empirically and theoretically demonstrated, that different values of parameters might be optimal at different stages of the evolutionary process [6, 7, 8, 16, 39, 45, 63, 66, 68, 72, 73, 78, 79]. To give an example, large mutation steps can be good in the early generations, helping the exploration of the search space, and small mutation steps might be needed in the late generations to help f...

118 | Self-adaptation in genetic algorithms - Bäck - 1992
Citation Context: ...Additionally, it is intuitively obvious, and has been empirically and theoretically demonstrated, that different values of parameters might be optimal at different stages of the evolutionary process [6, 7, 8, 16, 39, 45, 63, 66, 68, 72, 73, 78, 79]. To give an example, large mutation steps can be good in the early generations, helping the exploration of the search space, and small mutation steps might be needed in the late generations to help f...

106 | Reducing the Time Complexity of the Derandomized Evolution Strategy with Covariance Matrix Adaptation (CMA-ES) - Hansen, Müller, et al.
Citation Context: ...e complete Covariance Matrix Adaptation (CMA-ES) algorithm was finally detailed (and its parameters carefully tuned) in [37] and an improvement for the update of the covariance matrix was proposed in [35]. The basic idea in CMA-ES is to use the path followed by the algorithm to deterministically update the different mutation parameters, and a simplified view is given by the following: suppose that the...

94 | The Interaction of Mutation Rate, Selection & Self-Adaptation within a Genetic Algorithm - Back - 1992

86 | Adaptive and Self-Adaptive Evolutionary Computation (in Marimuthu Palaniswami et al.) - Angeline - 1995
Citation Context: ...[Fig. 1: global taxonomy of parameter setting in EAs, splitting it into parameter tuning (before the run) and parameter control (during the run), the latter into deterministic, adaptive, and self-adaptive] Some authors have introduced a different terminology. Angeline [2] distinguished “absolute” and “empirical” rules, which correspond to the “uncoupled” and “tightly-coupled” mechanisms of Spears [76]. Let us note that the uncoupled/absolute category encompasses deter...

70 | Real-coded genetic algorithms with simulated binary crossover: Studies on multi-modal and multi-objective problems - Deb, Kumar - 1995
Citation Context: ...ation should progress slower along the directions of steepest descent of H: the covariance matrix should be proportional to H⁻¹. And whereas the step-size actually self-adapts to quasi-optimal values [9, 19], the covariance matrix that is learned by the correlated SA-ES is not the actual inverse of the Hessian [4]. 2.4 CMA-ES: a clever adaptation Another defect of SA-ES is the relative slowness of adapta...

70 | Reducing bloat and promoting diversity using multi-objective methods - Jong, Watson, et al.
Citation Context: ...read out over the search space has a greater chance of finding an optimal solution than one that is concentrated in a small area of the search space. The significance of this concern is recognised in [14, 19, 87]. A method that dynamically increases the diversity of a DyFor GP population is proposed to accomplish these objectives. The dynamic increase of diversity (DIOD) method increases diversity by building...

65 | Two Self-Adaptive Crossover Operators for Genetic Programming - Angeline - 1996

62 | The behavior of adaptive systems which employ genetic and correlation algorithms. Doctoral dissertation - Bagley - 1967

57 | Intelligent mutation rate control in canonical GAs - Bäck, Schütz - 1996

57 | An Overview of genetic algorithms: Part 2, Research Topics. University Computing - Beasley, DR, et al. - 1993

55 | Comparing Genetic Operators with Gaussian Mutations - Fogel, Atmar - 1990
Citation Context: ...ates that the same parameter (encoded in the chromosomes) can be interpreted in different ways, leading to different algorithm variants with different scopes of this parameter. Spears [76], following [30], experimented with individuals containing an extra bit to determine whether one-point crossover or uniform crossover is to be used (bit 1/0 standing for one-point/uniform crossover, respectively). Tw...

51 | Tracking Extrema in Dynamic Environments - Angeline - 1997

51 | Towards an Optimal Mutation Probability for Genetic Algorithms. Proceedings of the 1st workshop: Parallel Problem Solving from Nature - Hesser, Männer - 1991

47 | On the behavior of evolutionary algorithms in dynamic environments - Bäck - 1998

39 | A dual genetic algorithm for bounded integer programs - Bean, Hadj-Alouane
Citation Context: ...th the evolution time provided 1 ≤ C, α. Second, let us consider another option, which utilises feedback from the search process. One example of such an approach was developed by Bean and Hadj-Alouane [14], where each individual is evaluated by the same formula as before, but W(t) is updated in every generation t in the following way: W(t + 1) = (1/β1) · W(t) if b(i) ∈ F for all t − k + 1 ≤ i ≤ t ...
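The truncated update rule can be read as: relax the penalty weight W when the best individual has been feasible for each of the last k generations, tighten it when it has been infeasible throughout, and keep it otherwise. A sketch under that reading (the feasibility bookkeeping and constants are ours):

```python
def update_penalty_weight(W, feasible_history, k, beta1=2.0, beta2=3.0):
    """Bean & Hadj-Alouane style adaptive penalty: `feasible_history`
    holds one boolean per generation, True if the best-of-generation
    individual was feasible. Only the last k generations matter."""
    recent = feasible_history[-k:]
    if len(recent) < k:
        return W                 # not enough evidence yet: keep W
    if all(recent):
        return W / beta1         # always feasible: relax the penalty
    if not any(recent):
        return W * beta2         # always infeasible: tighten it
    return W                     # mixed outcomes: leave W unchanged

W = update_penalty_weight(10.0, [True, True, True], k=3)   # -> 5.0
```

This is adaptive control of a fitness-function parameter: the search's own feasibility record steers the penalty, rather than a fixed schedule.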

27 | An empirical study on GAs “without parameters” - Bäck, Eiben, van der Vaart - 2000
Citation Context: ...ment mechanism for all parameters here is adaptive, based on relative evidence. Mutation, crossover, and population size are all controlled on-the-fly in the GA “without parameters” of Bäck et al. in [11]. Here, the self-adaptive mutation from [6] (Sect. 5.3) is adopted without changes, a new self-adaptive technique is invented for regulating the crossover rates of the individuals, and the GAVaPS life...

26 | An Analysis of Selection Procedures with Particular Attention Paid to Proportional and Boltzmann Selection - Maza, M, et al. - 1993

25 | GAVaPS – a genetic algorithm with varying population size - Arabas, Michalewicz, et al. - 1994
Citation Context: ...ill be accepted, thus reintroducing diversity and offering a potential means of escaping from local optima. 5.6 Population An innovative way to control the population size is offered by Arabas et al. [3, 59] in their GA with variable population size (GAVaPS). In fact, the population size parameter is removed entirely from GAVaPS, rather than adjusted on-the-fly. Certainly, in an evolutionary algorithm th...
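The GAVaPS idea sketched in this snippet, removing the population-size parameter by giving each newborn a fitness-dependent lifetime, can be illustrated as follows (constants and helper names are illustrative, and maximisation is assumed; this is not the exact GAVaPS allocation scheme):

```python
def allocate_lifetime(fitness, f_min, f_max, min_lt=1, max_lt=11):
    """GAVaPS-style lifetime allocation: the fitter an individual,
    the more generations it survives, so population size becomes an
    emergent quantity rather than a fixed parameter."""
    if f_max == f_min:
        return (min_lt + max_lt) // 2
    frac = (fitness - f_min) / (f_max - f_min)
    return min_lt + round(frac * (max_lt - min_lt))

def age_and_cull(population):
    """One generation step: each individual (a dict with 'age' and
    'lifetime' keys) grows older; those past their lifetime die."""
    for ind in population:
        ind["age"] += 1
    return [ind for ind in population if ind["age"] <= ind["lifetime"]]
```

Population size then rises when offspring are fit (long lifetimes) and falls when they are poor, with no explicit size parameter to tune.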

25 | Global properties of evolution processes - Bremermann, Rogson, et al. - 1966

24 | Analysis of selection algorithms: A Markov chain approach - Chakraborty, Deb, et al. - 1996

23 | Genetic algorithms for function optimization. Unpublished doctoral dissertation - Brindle - 1981

22 | A superior evolutionary algorithm for 3-SAT - Bäck, Eiben, et al. - 1998

19 | Solving 3-SAT with Adaptive Genetic Algorithms - Eiben, van der Hauw - 1997
Citation Context: ...iate weights requires much insight into the given problem instance, and therefore it might not be practicable. The stepwise adaptation of weights (SAW) mechanism, introduced by Eiben and van der Hauw [26] as an improved version of the weight adaptation mechanism of Eiben, Raué, and Ruttkay [23, 24], provides a simple and effective way to set these weights. The basic idea behind the SAW mechanism is ...
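The SAW mechanism described here amounts to periodically raising the weight of every constraint the current best candidate still violates, so persistently hard constraints come to dominate the fitness function. A minimal sketch (function names and the weight step are ours):

```python
def saw_update(weights, violated, delta_w=1):
    """Stepwise adaptation of weights (SAW): bump the weight of each
    constraint index that the current best solution violates."""
    return [w + delta_w if i in violated else w
            for i, w in enumerate(weights)]

def weighted_penalty(weights, violated):
    """Penalty-based fitness: sum of the weights of violated constraints."""
    return sum(weights[i] for i in violated)

weights = saw_update([1, 1, 1], violated={0, 2})    # -> [2, 1, 2]
penalty = weighted_penalty(weights, violated={2})   # -> 2
```

Because the weights grow only where the algorithm keeps failing, they end up encoding which constraints are hard for this algorithm on this instance, which is the property the snippet below from [27] highlights.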

18 | Solving small and large constraint satisfaction problems using a heuristic-based microgenetic algorithm - Dozier, Bowen, et al.

15 | Solving randomly generated constraint satisfaction problems using a micro-evolutionary hybrid that evolves a population of hill-climbers - Dozier, Bowen, et al. - 1995

15 | SAW-ing EAs: adapting the fitness function for solving constrained problems - Eiben, Hemert
Citation Context: ...reby eliminating a possible source of error. Furthermore, the used weights reflect the difficulty of constraints for the given algorithm on the given problem instance in the given stage of the search [27]. This property is also valuable since, in principle, different weights could be appropriate for different algorithms. 5.3 Mutation A large majority of work on adapting or self-adapting EA parameters ...

13 | Self-adaptivity for constraint satisfaction: Learning penalty functions - Eiben, Ruttkay - 1996
Citation Context: ...not be practicable. The stepwise adaptation of weights (SAW) mechanism, introduced by Eiben and van der Hauw [26] as an improved version of the weight adaptation mechanism of Eiben, Raué, and Ruttkay [23, 24], provides a simple and effective way to set these weights. The basic idea behind the SAW mechanism is that constraints that are not satisf...

13 | Optimization of Genetic Algorithms by Genetic Algorithms (Artificial Neural Nets and Genetic Algorithms) - Freisleben, Hartfelder - 1993
Citation Context: ...is thus a natural idea to use an EA for tuning an EA to a particular problem. This could be done using two EAs: one for problem solving and another one – the so-called meta-EA – to tune the first one [32, 34, 48]. It could also be done by using only one EA that tunes itself to a given problem, while solving that problem. Self-adaptation, as introduced in Evolution Strategies for varying the mutation parameters...

11 | A comparative study of a penalty function, a repair heuristic, and stochastic operators with the set-covering problem - Bäck, Schütz, et al. - 1996
Citation Context: ...o each individual. Mutating an individual then amounts to first mutating the mutation parameters themselves, and then mutating the variables using the new mutation parameters. Details can be found in [66, 13]. The rationale for SA-ES is that the algorithm relies on the selection step to keep in the population not only the fittest individuals, but also the individuals with the best mutation parameters – acco...

10 | Evolutionary algorithms and constraint satisfaction: Definitions, survey, methodology, and research directions - Eiben
Citation Context: ...timisation) problem at hand with a simple transformation of the objective function. In the class of constraint satisfaction problems, however, there is no objective function in the problem definition [20]. Rather, these are normally posed as decision problems with a Boolean outcome φ denoting whether a given assignment of variables represents a valid ...

8 | Multi-parent recombination - Eiben - 1997

7 | Adapting the function in GP for data mining - Eggermont, Eiben, et al. - 1999

7 | GA-easy and GA-hard constraint satisfaction problems - Eiben, Raue, et al. - 1995
Citation Context: ...not be practicable. The stepwise adaptation of weights (SAW) mechanism, introduced by Eiben and van der Hauw [26] as an improved version of the weight adaptation mechanism of Eiben, Raué, and Ruttkay [23, 24], provides a simple and effective way to set these weights. The basic idea behind the SAW mechanism is that constraints that are not satisf...

5 | Mutation parameters - Back - 2000

5 | Looking Around: Using clues from the data space to guide genetic algorithm searches - Cartwright, Mott - 1991

4 | Dimension-independent convergence rate for non-isotropic (1, λ)-ES - Auger, Bris, et al. - 2003
Citation Context: ...onal to the distance to the optimum. Details can be found in the studies of the so-called progress rate: early work was done by Schwefel [66], completed and extended by Beyer, and recent work by Auger [5] gave a formal global convergence proof of this ... impractical algorithm: indeed, the distance to the optimum is not known in real situations! But another piece of information is always available to the...