Results 1–10 of 74
Parameter control in evolutionary algorithms
 IEEE Transactions on Evolutionary Computation
Abstract

Cited by 237 (30 self)
Summary. The issue of setting the values of various parameters of an evolutionary algorithm is crucial for good performance. In this paper we discuss how to do this, beginning with the issue of whether these values are best set in advance or are best changed during evolution. We provide a classification of different approaches based on a number of complementary features, and pay special attention to setting parameters on-the-fly. This has the potential of adjusting the algorithm to the problem while solving the problem. This paper is intended to present a survey rather than a set of prescriptive details for implementing an EA for a particular type of problem. For this reason we have chosen to interleave a number of examples throughout the text. Thus we hope to both clarify the points we wish to raise as we present them, and also to give the reader a feel for some of the many possibilities available for controlling different parameters.
Niching Methods for Genetic Algorithms
, 1995
Abstract

Cited by 191 (1 self)
Niching methods extend genetic algorithms to domains that require the location and maintenance of multiple solutions. Such domains include classification and machine learning, multimodal function optimization, multiobjective function optimization, and simulation of complex and adaptive systems. This study presents a comprehensive treatment of niching methods and the related topic of population diversity. Its purpose is to analyze existing niching methods and to design improved niching methods. To achieve this purpose, it first develops a general framework for the modelling of niching methods, and then applies this framework to construct models of individual niching methods, specifically crowding and sharing methods. Using a constructed model of crowding, this study determines why crowding methods over the last two decades have not made effective niching methods. A series of tests and design modifications results in the development of a highly effective form of crowding, called determin...
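The replacement rule at the heart of crowding can be sketched in a few lines. The following is an illustrative Python sketch of a deterministic-crowding-style generation, not the dissertation's exact algorithm: the function names and the Hamming similarity measure are choices made here. The key idea is that each offspring competes for survival only against the parent it most resembles, which is what preserves multiple niches.

```python
import random

def deterministic_crowding_step(pop, fitness, rng):
    """One generation of a crowding-style replacement rule: parents are
    paired at random, recombined, and each child competes only against
    the parent it is closest to in Hamming distance."""
    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))
    rng.shuffle(pop)
    next_pop = []
    for p1, p2 in zip(pop[::2], pop[1::2]):
        cut = rng.randrange(1, len(p1))      # one-point crossover
        c1 = p1[:cut] + p2[cut:]
        c2 = p2[:cut] + p1[cut:]
        # pair each child with its more similar parent
        if hamming(p1, c1) + hamming(p2, c2) <= hamming(p1, c2) + hamming(p2, c1):
            pairs = [(p1, c1), (p2, c2)]
        else:
            pairs = [(p1, c2), (p2, c1)]
        for parent, child in pairs:
            next_pop.append(child if fitness(child) >= fitness(parent) else parent)
    return next_pop

rng = random.Random(0)
pop = [[rng.randint(0, 1) for _ in range(10)] for _ in range(8)]
pop = deterministic_crowding_step(pop, sum, rng)
print(len(pop))  # population size is preserved
```

Because replacement is restricted to similar individuals, distinct high-fitness regions of the search space can coexist in the population instead of being crowded out by a single winner.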
Convergence Analysis of Canonical Genetic Algorithms
 IEEE Transactions on Neural Networks
, 1994
Abstract

Cited by 173 (0 self)
This paper analyzes the convergence properties of the canonical genetic algorithm (CGA) with mutation, crossover and proportional reproduction applied to static optimization problems. It is proved by means of homogeneous finite Markov chain analysis that a CGA will never converge to the global optimum regardless of the initialization, crossover operator and objective function. But variants of CGAs that always maintain the best solution in the population, either before or after selection, are shown to converge to the global optimum due to the irreducibility property of the underlying original nonconvergent CGA. These results are discussed with respect to the schema theorem.

Keywords: canonical genetic algorithm, global convergence, Markov chains, schema theorem

1 Introduction

Canonical genetic algorithms (CGA) as introduced in [1] are often used to tackle static optimization problems of the type max{f(b) | b ∈ 𝔹^l}  (1), assuming that 0 < f(b) < ∞ for all b ∈ 𝔹^l = {0, 1}^l and ...
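The convergence result hinges on retaining the best solution across generations. The sketch below is a minimal illustration of that elitist modification, not the paper's exact algorithm; the operator details (roulette-wheel selection, one-point crossover, bitwise mutation) are standard choices assumed here.

```python
import random

def elitist_ga(fitness, l=20, pop_size=30, generations=200, seed=1):
    """Generational GA with proportional selection, one-point crossover
    and bitwise mutation, modified to re-insert the best-so-far
    individual each generation (the elitist variant)."""
    rng = random.Random(seed)
    p_m = 1.0 / l
    pop = [[rng.randint(0, 1) for _ in range(l)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        # proportional (roulette-wheel) selection; tiny floor guards zero fitness
        weights = [max(fitness(ind), 1e-9) for ind in pop]
        parents = rng.choices(pop, weights=weights, k=pop_size)
        nxt = []
        for a, b in zip(parents[::2], parents[1::2]):
            cut = rng.randrange(1, l)  # one-point crossover
            for child in (a[:cut] + b[cut:], b[:cut] + a[cut:]):
                nxt.append([bit ^ (rng.random() < p_m) for bit in child])
        gen_best = max(nxt, key=fitness)
        if fitness(gen_best) > fitness(best):
            best = gen_best
        nxt[0] = best[:]  # elitism: the best solution found so far is never lost
        pop = nxt
    return best

result = elitist_ga(sum)  # OneMax as the objective
print(sum(result))
```

Without the `nxt[0] = best[:]` line, the best solution can be destroyed by mutation in a later generation, which is the mechanism behind the non-convergence result for the plain CGA.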
Self-Adaptation in Genetic Algorithms
 Proceedings of the First European Conference on Artificial Life
, 1992
Abstract

Cited by 115 (2 self)
Within Genetic Algorithms (GAs) the mutation rate is mostly handled as a global, external parameter, which is constant over time or exogenously changed over time. In this paper a new approach is presented, which transfers a basic idea from Evolution Strategies (ESs) to GAs. Mutation rates are changed into endogenous items which adapt during the search process. First experimental results are presented, which indicate that environment-dependent self-adaptation of appropriate settings for the mutation rate is possible even for GAs. Furthermore, the reduction of the number of external parameters of a GA is seen as a first step towards achieving a problem-dependent self-adaptation of the algorithm.

Introduction

Natural evolution has proven to be a powerful mechanism for the emergence and improvement of the living beings on our planet by performing a randomized search in the space of possible DNA sequences. Due to this knowledge about the qualities of natural evolution, some resea...
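As an illustration of the idea, the following sketch carries a mutation rate inside the individual and perturbs it before applying it to the object bits. The logistic/log-normal update used here is one commonly cited variant of this scheme and is an assumption for illustration, not necessarily the paper's exact formula.

```python
import math
import random

def self_adaptive_mutate(bits, rate, rng):
    """Mutation with an endogenous rate: the rate carried by the
    individual is perturbed first (logistic/log-normal step, an
    illustrative choice), then the perturbed rate is applied to
    the object bits."""
    # step 1: mutate the strategy parameter itself, keeping it in (0, 1)
    new_rate = 1.0 / (1.0 + (1.0 - rate) / rate
                      * math.exp(-0.22 * rng.gauss(0.0, 1.0)))
    # step 2: flip each object bit with the mutated rate
    new_bits = [b ^ (rng.random() < new_rate) for b in bits]
    return new_bits, new_rate

rng = random.Random(3)
bits, rate = [0] * 30, 0.05
for _ in range(5):
    bits, rate = self_adaptive_mutate(bits, rate, rng)
print(round(rate, 4), sum(bits))
```

Because individuals whose encoded rate produces good offspring are selected more often, the rate itself is subject to evolution, which is the sense in which the control is endogenous.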
Optimal Mutation Rates in Genetic Search
Abstract

Cited by 114 (0 self)
The optimization of a single bit string by means of iterated mutation and selection of the best (a (1+1)-Genetic Algorithm) is discussed with respect to three simple fitness functions: the counting ones problem, a standard binary encoded integer, and a Gray coded integer optimization problem. A mutation rate schedule that is optimal with respect to the success probability of mutation is presented for each of the objective functions, and it turns out that the standard binary code can hamper the search process even in the case of unimodal objective functions. While normally a mutation rate of 1/l (where l denotes the bit string length) is recommendable, our results indicate that a variation of the mutation rate is useful in cases where the fitness function is a multimodal pseudo-boolean function, where multimodality may be caused by the objective function as well as the encoding mechanism.
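A (1+1) algorithm of the kind analyzed here is compact enough to write down. The sketch below applies bitwise mutation at rate 1/l with elitist acceptance on the counting-ones problem; the parameter values are illustrative, not taken from the paper.

```python
import random

def one_plus_one_ga(l=50, max_iters=100000, seed=0):
    """(1+1)-Genetic Algorithm on the counting-ones problem:
    bitwise mutation at rate 1/l plus selection of the best."""
    rng = random.Random(seed)
    parent = [rng.randint(0, 1) for _ in range(l)]
    p = 1.0 / l  # on average one bit flip per offspring
    for _ in range(max_iters):
        if sum(parent) == l:  # all-ones optimum reached
            break
        child = [b ^ (rng.random() < p) for b in parent]
        if sum(child) >= sum(parent):  # keep the better (or equal) string
            parent = child
    return parent

best = one_plus_one_ga()
print(sum(best))
```

On the unimodal counting-ones problem a fixed rate of 1/l works well; the paper's point is that for multimodal functions, or under an unfortunate encoding, varying this rate over the run can pay off.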
Adaptive and Self-Adaptive Evolutionary Computations
 Computational Intelligence: A Dynamic Systems Perspective
, 1995
Abstract

Cited by 82 (2 self)
This paper reviews the various studies that have introduced adaptive and self-adaptive parameters into Evolutionary Computations. A formal definition of an adaptive evolutionary computation is provided with an analysis of the types of adaptive and self-adaptive parameter update rules currently in use. Previous studies are reviewed and placed into a categorization that helps to illustrate their similarities and differences.

Introduction
Self-Adaptive Genetic Algorithms with Simulated Binary Crossover
 COMPLEX SYSTEMS
, 1999
Abstract

Cited by 66 (10 self)
Self-adaptation is an essential feature of natural evolution. However, in the context of function optimization, self-adaptation features of evolutionary search algorithms have been explored only with evolution strategy (ES) and evolutionary programming (EP). In this paper, we demonstrate the self-adaptive feature of real-parameter genetic algorithms (GAs) using the simulated binary crossover (SBX) operator and without any mutation operator. The connection between the working of self-adaptive ESs and real-parameter GAs with the SBX operator is also discussed. Thereafter, the self-adaptive behavior of real-parameter GAs is demonstrated on a number of test problems commonly used in the ES literature. The remarkable similarity in the working principle of real-parameter GAs and self-adaptive ESs shown in this study suggests the need for further studies on self-adaptive GAs.
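The SBX operator itself reduces to sampling a spread factor and blending the parents. The following sketch shows the commonly published single-variable form; the distribution index eta = 2 is an illustrative default, not a value prescribed by this paper.

```python
import random

def sbx(x1, x2, eta=2.0, rng=random):
    """Simulated binary crossover on one real variable: sample a spread
    factor beta from a polynomial distribution controlled by eta, then
    blend the parents symmetrically about their mean. Larger eta keeps
    children closer to their parents."""
    u = rng.random()
    if u <= 0.5:
        beta = (2.0 * u) ** (1.0 / (eta + 1.0))
    else:
        beta = (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta + 1.0))
    c1 = 0.5 * ((1.0 + beta) * x1 + (1.0 - beta) * x2)
    c2 = 0.5 * ((1.0 - beta) * x1 + (1.0 + beta) * x2)
    return c1, c2

rng = random.Random(5)
c1, c2 = sbx(1.0, 3.0, rng=rng)
print(c1, c2)  # children are symmetric about the parental mean: c1 + c2 == x1 + x2
```

Because offspring spread scales with the distance between parents, a converging population automatically makes smaller search steps, which is the self-adaptive behaviour the paper relates to ESs.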
Rigorous Hitting Times for Binary Mutations
, 1999
Abstract

Cited by 60 (2 self)
In the binary evolutionary optimization framework, two mutation operators are theoretically investigated. For both the standard mutation, in which all bits are flipped independently with the same probability, and the 1-bit-flip mutation, which flips exactly one bit per bit string, the statistical distribution of the first hitting times of the target is thoroughly computed (expectation and variance) up to terms of order l (the size of the bit strings) in two distinct situations: without any selection, or with the deterministic (1+1)-ES selection on the OneMax problem. In both cases, the convergence time of the 1-bit-flip mutation is smaller by a constant (in terms of l) multiplicative factor. These results extend to the case of multiple independent optimizers.

Keywords: evolutionary algorithms, stochastic analysis, binary mutations, Markov chains, hitting times

1 Introduction

One known drawback of Evolutionary Algorithms as function optimizers is the amount of computational effort they re...
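The two operators compared in the paper are easy to state in code. The sketch below gives both mutation rules together with an empirical first-hitting-time measurement under (1+1) elitist selection on OneMax; the function names and the all-zeros start are choices made here for illustration.

```python
import random

def standard_mutation(x, rng):
    """Flip every bit independently with probability 1/l."""
    p = 1.0 / len(x)
    return [b ^ (rng.random() < p) for b in x]

def one_bit_flip(x, rng):
    """Flip exactly one uniformly chosen bit."""
    y = x[:]
    y[rng.randrange(len(x))] ^= 1
    return y

def hitting_time(l, mutate, seed):
    """Steps until a (1+1) elitist search first reaches the all-ones
    string on OneMax, starting from the all-zeros string."""
    rng = random.Random(seed)
    x, t = [0] * l, 0
    while sum(x) < l:
        y = mutate(x, rng)
        if sum(y) >= sum(x):  # elitist acceptance
            x = y
        t += 1
    return t

print(hitting_time(20, one_bit_flip, 1), hitting_time(20, standard_mutation, 1))
```

Averaging such runs over many seeds reproduces the qualitative result stated in the abstract: the 1-bit-flip operator reaches the target faster by a constant factor.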
Finite Markov Chain Results in Evolutionary Computation: A Tour d'Horizon
, 1998
Abstract

Cited by 57 (2 self)
The theory of evolutionary computation has been enhanced rapidly during the last decade. This survey attempts to summarize the results regarding the limit and finite-time behavior of evolutionary algorithms with finite search spaces and a discrete time scale. Results on evolutionary algorithms beyond finite spaces and discrete time are also presented, but with reduced elaboration.

Keywords: evolutionary algorithms, limit behavior, finite time behavior

1. Introduction

The field of evolutionary computation is mainly engaged in the development of optimization algorithms whose design is inspired by principles of natural evolution. In most cases, the optimization task is of the following type: find an element x* ∈ X such that f(x*) ≥ f(x) for all x ∈ X, where f : X → ℝ is the objective function to be maximized and X the search set. In the terminology of evolutionary computation, an individual is represented by an element of the Cartesian product X × A, where A is a possibly...
Intelligent Mutation Rate Control in Canonical Genetic Algorithms
 Foundations of Intelligent Systems, 9th International Symposium, ISMIS '96
, 1996
Abstract

Cited by 54 (0 self)
The role of the mutation rate in canonical genetic algorithms is investigated by comparing a constant setting, a deterministically varying, time-dependent mutation rate schedule, and a self-adaptation mechanism for individual mutation rates following the principle of self-adaptation as used in evolution strategies. The power of the self-adaptation mechanism is illustrated by a time-varying optimization problem, where mutation rates have to adapt continuously in order to follow the optimum. The strengths of the proposed deterministic schedule and the self-adaptation method are demonstrated by a comparison of their performance on difficult combinatorial optimization problems (multiple knapsack, maximum cut and maximum independent set in graphs). Both methods are shown to perform significantly better than the canonical genetic algorithm, and the deterministic schedule yields the best results of all control mechanisms compared.

1 Introduction

Genetic Algorithms [11, 14] are the best kno...
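A deterministic, time-dependent schedule of the kind compared here decreases the mutation rate over the run. The sketch below follows the commonly cited form that falls from 1/2 at generation 0 to 1/l at the final generation T-1; the exact constants are an assumption for illustration and should be checked against the paper.

```python
def mutation_schedule(t, T, l):
    """Time-dependent mutation rate, decreasing from 1/2 at t = 0
    to 1/l at t = T - 1 (commonly cited form; constants assumed)."""
    return 1.0 / (2.0 + (l - 2.0) * t / (T - 1.0))

rates = [mutation_schedule(t, 100, 20) for t in range(100)]
print(rates[0], rates[-1])  # 0.5 at the start, 1/l = 0.05 at the end
```

Intuitively, large early rates favour exploration while the final rate of 1/l matches the constant setting usually recommended, so the schedule interpolates between the two regimes without any feedback from the search.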