Parameter control in evolutionary algorithms
- IEEE Transactions on Evolutionary Computation
"... Summary. The issue of setting the values of various parameters of an evolutionary algorithm is crucial for good performance. In this paper we discuss how to do this, beginning with the issue of whether these values are best set in advance or are best changed during evolution. We provide a classifica ..."
Abstract
-
Cited by 365 (42 self)
- Add to MetaCart
(Show Context)
The issue of setting the values of various parameters of an evolutionary algorithm is crucial for good performance. In this paper we discuss how to do this, beginning with the issue of whether these values are best set in advance or are best changed during evolution. We provide a classification of different approaches based on a number of complementary features, and pay special attention to setting parameters on-the-fly. This has the potential of adjusting the algorithm to the problem while solving the problem. This paper is intended to present a survey rather than a set of prescriptive details for implementing an EA for a particular type of problem. For this reason we have chosen to interleave a number of examples throughout the text. Thus we hope both to clarify the points we wish to raise as we present them, and to give the reader a feel for some of the many possibilities available for controlling different parameters.
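As an illustration of the distinction the survey draws between tuning parameters in advance and controlling them during the run, the sketch below adapts a Gaussian mutation step size on the fly with Rechenberg's well-known 1/5 success rule inside a (1+1)-ES loop; the rule and all constants are standard textbook choices, not prescriptions taken from this paper.

```python
import random

def one_fifth_rule_es(fitness, x0, sigma=1.0, generations=200, window=10, factor=0.85):
    """(1+1)-ES sketch: adapt the mutation step size sigma on the fly.

    Rechenberg's 1/5 success rule: if more than 1/5 of recent mutations
    improved the parent, enlarge sigma (explore); otherwise shrink it.
    """
    x = list(x0)
    best = fitness(x)
    successes = 0
    for g in range(1, generations + 1):
        child = [xi + random.gauss(0.0, sigma) for xi in x]
        f = fitness(child)
        if f < best:                      # minimization
            x, best = child, f
            successes += 1
        if g % window == 0:               # adapt sigma every `window` trials
            rate = successes / window
            sigma = sigma / factor if rate > 0.2 else sigma * factor
            successes = 0
    return x, best, sigma

# usage: minimize the sphere function
sphere = lambda v: sum(vi * vi for vi in v)
print(one_fifth_rule_es(sphere, [5.0, -3.0]))
```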
Differential evolution algorithm with strategy adaptation for global numerical optimization
- IEEE Transactions on Evolutionary Computation
, 2009
"... Abstract—Differential evolution (DE) is an efficient and powerful population-based stochastic search technique for solving optimiza-tion problems over continuous space, which has been widely applied in many scientific and engineering fields. However, the success of DE in solving a specific problem c ..."
Abstract
-
Cited by 125 (9 self)
- Add to MetaCart
(Show Context)
Differential evolution (DE) is an efficient and powerful population-based stochastic search technique for solving optimization problems over continuous space, which has been widely applied in many scientific and engineering fields. However, the success of DE in solving a specific problem crucially depends on appropriately choosing trial vector generation strategies and their associated control parameter values. Employing a trial-and-error scheme to search for the most suitable strategy and its associated parameter settings requires high computational costs. Moreover, at different stages of evolution, different strategies coupled with different parameter settings may be required in order to achieve the best performance. In this paper, we propose a self-adaptive DE (SaDE) algorithm, in which both trial vector generation strategies and their associated control parameter values are gradually self-adapted by learning from their previous experiences in generating promising solutions. Consequently, a more suitable generation strategy along with its parameter settings can be determined adaptively to match different phases of the search process/evolution. The performance of the SaDE algorithm is extensively evaluated (using codes available from P. N. Suganthan) on a suite of 26 bound-constrained numerical optimization problems and compares favorably with the conventional DE and several state-of-the-art parameter adaptive DE variants. Index Terms: Differential evolution (DE), global numerical optimization, parameter adaptation, self-adaptation, strategy adaptation.
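A rough sketch of the adaptive ingredient described here, picking among candidate trial-vector generation strategies with probabilities driven by how often each recently produced a surviving trial vector, might look as follows; the strategy names, pseudo-counts, and learning-period handling are illustrative assumptions rather than the paper's exact procedure.

```python
import random

class StrategySelector:
    """Roulette-wheel strategy selection driven by recent success/failure memory,
    loosely following the self-adaptive idea in SaDE (details are assumptions)."""

    def __init__(self, strategies, learning_period=50):
        self.strategies = strategies
        self.period = learning_period
        self.success = {s: 1 for s in strategies}   # uniform pseudo-counts to start
        self.failure = {s: 1 for s in strategies}
        self.trials = 0

    def pick(self):
        # selection probability of each strategy ~ its recent success rate
        rates = {s: self.success[s] / (self.success[s] + self.failure[s])
                 for s in self.strategies}
        total = sum(rates.values())
        r, acc = random.uniform(0, total), 0.0
        for s, w in rates.items():
            acc += w
            if r <= acc:
                return s
        return self.strategies[-1]

    def report(self, strategy, improved):
        (self.success if improved else self.failure)[strategy] += 1
        self.trials += 1
        if self.trials >= self.period:              # forget stale memory periodically
            self.success = {s: 1 for s in self.strategies}
            self.failure = {s: 1 for s in self.strategies}
            self.trials = 0

# usage sketch
sel = StrategySelector(["rand/1/bin", "rand-to-best/2/bin"])
s = sel.pick()
sel.report(s, improved=True)
```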
A Tutorial for Competent Memetic Algorithms: Model, Taxonomy, and Design Issues
- IEEE Transactions on Evolutionary Computation
, 2005
"... We recommend you cite the published version. ..."
Genetic Programming and Autoconstructive Evolution with the Push Programming Language
- Genetic Programming and Evolvable Machines
, 2002
"... Push is aprogxAI""1 langxA desigxA for the expression ofevolving proging within an evolutionary computation system. This article describes Push and illustrates some of the opportunities that it presents for evolutionary computation. Two evolutionary computation systems, PushGP and Push ..."
Abstract
-
Cited by 68 (18 self)
- Add to MetaCart
Push is a programming language designed for the expression of evolving programs within an evolutionary computation system. This article describes Push and illustrates some of the opportunities that it presents for evolutionary computation. Two evolutionary computation systems, PushGP and Pushpop, are described in detail. PushGP is a genetic programming system that evolves Push programs to solve computational problems. Pushpop, an "autoconstructive evolution" system, also evolves Push programs but does so while simultaneously evolving its own evolutionary mechanisms.
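To give a feel for what expressing evolving programs in a stack language means, here is a toy interpreter loosely inspired by Push; it supports only a single integer stack and three instructions, and is not the actual Push specification.

```python
def run_push_like(program):
    """Toy interpreter for a Push-flavored language with a single integer stack.
    Real Push keeps a separate stack per type (integer, float, code, exec, ...)."""
    integer = []
    ops = {
        "INTEGER.+": lambda: integer.append(integer.pop() + integer.pop()),
        "INTEGER.*": lambda: integer.append(integer.pop() * integer.pop()),
        "INTEGER.DUP": lambda: integer.append(integer[-1]),
    }
    arity = {"INTEGER.+": 2, "INTEGER.*": 2, "INTEGER.DUP": 1}
    for token in program:
        if isinstance(token, int):
            integer.append(token)                 # literals go straight onto their stack
        elif token in ops and len(integer) >= arity[token]:
            ops[token]()                          # under-supplied instructions act as no-ops
    return integer

# usage: (3 + 4) squared -> [49]
print(run_push_like([3, 4, "INTEGER.+", "INTEGER.DUP", "INTEGER.*"]))
```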
Adaptation in Evolutionary Computation: A Survey
- In Proceedings of the Fourth International Conference on Evolutionary Computation (ICEC 97)
, 1997
"... Abstract � Adaptation of parameters and operators is one of the most important and promising areas of research in evolutionary computation � it tunes the algorithm to the problem while solving the problem. In this paper we develop a classi�cation of adaptation on the basis of the mechanisms used � a ..."
Abstract
-
Cited by 66 (6 self)
- Add to MetaCart
Adaptation of parameters and operators is one of the most important and promising areas of research in evolutionary computation; it tunes the algorithm to the problem while solving the problem. In this paper we develop a classification of adaptation on the basis of the mechanisms used, and the level at which adaptation operates within the evolutionary algorithm. The classification covers all forms of adaptation in evolutionary computation and suggests further research.
Tracking extrema in dynamic environments
- Proc Evolutionary Programming IV
, 1998
"... Abstract. Typical applications of evolutionary optimization involve the off-line approximation of extrema of static multi-modal functions. Methods which use a vari-ety of techniques to self-adapt mutation parameters have been shown to be more suc-cessful than methods which do not use self-adaptation ..."
Abstract
-
Cited by 60 (0 self)
- Add to MetaCart
(Show Context)
Typical applications of evolutionary optimization involve the off-line approximation of extrema of static multi-modal functions. Methods which use a variety of techniques to self-adapt mutation parameters have been shown to be more successful than methods which do not use self-adaptation. For dynamic functions, the interest is not in locating the extrema but in following them as closely as possible. This paper compares the on-line extrema tracking performance of an evolutionary program without self-adaptation against an evolutionary program using a self-adaptive Gaussian update rule over a number of dynamics applied to a simple static function. The experiments demonstrate that for some dynamic functions self-adaptation is effective, while for others it is detrimental.
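The self-adaptive Gaussian update rule mentioned here typically follows the standard evolutionary-programming pattern of storing a step size with each individual and perturbing it log-normally before perturbing the object variables; the sketch below implements that standard rule, with learning-rate constants that are conventional choices rather than this paper's exact settings.

```python
import math
import random

def self_adaptive_mutation(x, sigma):
    """Mutate object variables together with their per-individual step sizes.

    Each individual carries (x, sigma); sigma is perturbed log-normally first,
    then used to perturb x, so step sizes evolve along with the solutions.
    """
    n = len(x)
    tau = 1.0 / math.sqrt(2.0 * math.sqrt(n))     # per-dimension learning rate
    tau_prime = 1.0 / math.sqrt(2.0 * n)          # global learning rate
    global_step = tau_prime * random.gauss(0, 1)
    new_sigma = [s * math.exp(global_step + tau * random.gauss(0, 1)) for s in sigma]
    new_x = [xi + si * random.gauss(0, 1) for xi, si in zip(x, new_sigma)]
    return new_x, new_sigma

# usage
x, sigma = self_adaptive_mutation([0.5, -1.2, 3.0], [0.1, 0.1, 0.1])
```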
The exploration/exploitation tradeoff in dynamic cellular genetic algorithms
- IEEE Transactions on Evolutionary Computation
, 2005
"... Abstract—This paper studies static and dynamic decentralized versions of the search model known as cellular genetic algorithm (cGA), in which individuals are located in a specific topology and interact only with their neighbors. Making changes in the shape of such topology or in the neighborhood may ..."
Abstract
-
Cited by 46 (8 self)
- Add to MetaCart
(Show Context)
This paper studies static and dynamic decentralized versions of the search model known as the cellular genetic algorithm (cGA), in which individuals are located in a specific topology and interact only with their neighbors. Making changes in the shape of such a topology or in the neighborhood may give birth to a high number of algorithmic variants. We perform these changes in a methodological way by tuning the concept of ratio. Since the relationship (ratio) between the topology and the neighborhood shape defines the search selection pressure, we propose to analyze in depth the influence of this ratio on the exploration/exploitation tradeoff. As we will see, it is difficult to decide which ratio is best suited for a given problem. Therefore, we introduce a preprogrammed change of this ratio during the evolution as a possible additional improvement that removes the need to specify a single ratio. A later refinement leads us to what is, to our knowledge, the first adaptive dynamic kind of cellular model. We conclude that these dynamic cGAs show the most desirable behavior, in terms of efficiency and accuracy, among all the variants evaluated; we validate our results on a set of seven different problems of considerable complexity in order to better sustain our conclusions. Index Terms: Cellular genetic algorithm (cGA), evolutionary algorithm (EA), dynamic adaptation, neighborhood-to-population ratio.
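The "ratio" discussed here relates the dispersion (radius) of the neighborhood to that of the whole population grid: narrow grids give a small ratio and weak selection pressure (exploration), square grids a larger ratio and stronger pressure (exploitation). The sketch below computes such a ratio for a rectangular grid and applies a simple preprogrammed switch; the radius definition mirrors the usual cGA literature, but the schedule and grid shapes are assumptions for illustration.

```python
import math

def radius(points):
    """Dispersion measure of a set of grid positions:
    square root of the mean squared distance to the centroid."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    return math.sqrt(sum((x - cx) ** 2 + (y - cy) ** 2 for x, y in points) / len(points))

def grid(width, height):
    return [(x, y) for x in range(width) for y in range(height)]

def linear5(x, y):
    """Von Neumann / Linear5 neighborhood around (x, y)."""
    return [(x, y), (x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)]

def ratio(width, height):
    return radius(linear5(0, 0)) / radius(grid(width, height))

def grid_shape(generation, max_generations):
    """Preprogrammed schedule (assumption): narrow grid first (low ratio,
    exploration), square grid later (higher ratio, exploitation)."""
    return (10, 40) if generation < max_generations // 2 else (20, 20)

# the square grid of the same 400 cells yields the larger ratio
print(ratio(10, 40), ratio(20, 20))
```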
Graph Coloring with Adaptive Evolutionary Algorithms
, 1998
"... This paper presents the results of an experimental investigation on solving graph coloring problems with Evolutionary Algorithms (EA). After testing different algorithm variants we conclude that the best option is an asexual EA using order-based representation and an adaptation mechanism that period ..."
Abstract
-
Cited by 45 (19 self)
- Add to MetaCart
This paper presents the results of an experimental investigation on solving graph coloring problems with evolutionary algorithms (EAs). After testing different algorithm variants we conclude that the best option is an asexual EA using an order-based representation and an adaptation mechanism that periodically changes the fitness function during the evolution. This adaptive EA is general, using no domain-specific knowledge except, of course, for the decoder (fitness function). We compare this adaptive EA to DSatur, a powerful traditional graph coloring technique, and to the Grouping GA on a wide range of problem instances with different sizes, topologies, and edge densities. The results show that the adaptive EA is superior to the Grouping GA and outperforms DSatur on the hardest problem instances. Furthermore, it scales up better with the problem size than the other two algorithms and indicates a linear computational complexity. Keywords: evolutionary algorithms, genetic algorithms, constraint satisfaction
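The order-based representation referred to above encodes a candidate as a permutation of the vertices, which a greedy decoder turns into a coloring by giving each vertex the lowest color not used by its already-colored neighbors; the first-fit decoder below is a standard sketch of this idea and may differ in detail from the one the authors use.

```python
def greedy_decode(order, adjacency):
    """Decode a vertex permutation into a coloring with greedy first-fit.

    order     : list of vertices (the EA's order-based chromosome)
    adjacency : dict vertex -> set of neighboring vertices
    returns   : (dict vertex -> color index, number of colors used)
    """
    color = {}
    for v in order:
        used = {color[u] for u in adjacency[v] if u in color}
        c = 0
        while c in used:                 # first color not used by colored neighbors
            c += 1
        color[v] = c
    return color, 1 + max(color.values())

# usage: a triangle plus a pendant vertex; the triangle forces 3 colors
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
print(greedy_decode([0, 1, 2, 3], adj))
```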