Results 1–10 of 40
Evolutionary computation: Comments on the history and current state
 IEEE Transactions on Evolutionary Computation
, 1997
Abstract

Cited by 217 (0 self)
Evolutionary computation has started to receive significant attention during the last decade, although the origins can be traced back to the late 1950’s. This article surveys the history as well as the current state of this rapidly growing field. We describe the purpose, the general structure, and the working principles of different approaches, including genetic algorithms (GA) [with links to genetic programming (GP) and classifier systems (CS)], evolution strategies (ES), and evolutionary programming (EP) by analysis and comparison of their most important constituents (i.e., representations, variation operators, reproduction, and selection mechanism). Finally, we give a brief overview on the manifold of application domains, although this necessarily must remain incomplete.
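The constituents the survey compares (representation, variation operators, and selection) can be illustrated with a deliberately minimal genetic algorithm. The OneMax toy problem, the function name, and all parameter values below are this sketch's own illustrative choices, not taken from the paper:

```python
import random

def one_max_ga(n_bits=20, pop_size=30, generations=60, rng=random):
    """Minimal generational GA on the OneMax toy problem (maximize the
    number of 1-bits): bit-string representation, per-bit mutation as
    the variation operator, and elitist truncation selection."""
    fitness = sum  # OneMax: fitness is the count of 1-bits
    pop = [[rng.randrange(2) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        # variation: flip each bit independently with probability 1/n_bits
        offspring = [[b ^ (rng.random() < 1.0 / n_bits) for b in ind]
                     for ind in pop]
        # selection: keep the best pop_size of parents plus offspring
        pop = sorted(pop + offspring, key=fitness, reverse=True)[:pop_size]
    return pop[0]
```

Swapping the representation, the variation step, or the selection rule yields the GA/ES/EP variants the survey contrasts.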
On Evolution, Search, Optimization, Genetic Algorithms and Martial Arts: Towards Memetic Algorithms
, 1989
Abstract

Cited by 186 (10 self)
Short abstract, isn't it? P.A.C.S. numbers 05.20, 02.50, 87.10 1 Introduction Large Numbers "...the optimal tour displayed (see Figure 6) is the possible unique tour having one arc fixed from among 10^655 tours that are possible among 318 points and have one arc fixed. Assuming that one could possibly enumerate 10^9 tours per second on a computer it would thus take roughly 10^639 years of computing to establish the optimality of this tour by exhaustive enumeration." This quote shows the real difficulty of a combinatorial optimization problem. The huge number of configurations is the primary difficulty when dealing with one of these problems. The quote belongs to M.W. Padberg and M. Grötschel, Chap. 9, "Polyhedral Computations", from the book The Traveling Salesman Problem: A Guided Tour of Combinatorial Optimization [124]. It is interesting to compare the number of configurations of real-world problems in combinatorial optimization with those large numbers arising in Cosmol...
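The combinatorial explosion the quote describes is easy to reproduce. The short sketch below (not from the paper) computes the base-10 logarithm of the number of distinct symmetric-TSP tours, (n-1)!/2, using `lgamma` to avoid overflow:

```python
import math

def log10_tour_count(n):
    """log10 of the number of distinct tours on n cities for a
    symmetric TSP: (n-1)!/2, i.e. tours counted up to rotation
    and direction. lgamma(n) = ln((n-1)!)."""
    return math.lgamma(n) / math.log(10) - math.log10(2)
```

For n = 318 this gives a value on the order of 10^656, matching the astronomical scale quoted above (the quote's slightly smaller count fixes one arc).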
Genetic Algorithms for the Travelling Salesman Problem: A Review of Representations and Operators
 Artificial Intelligence Review
, 1999
Abstract

Cited by 66 (2 self)
This paper is the result of a literature study carried out by the authors. It is a review of the different attempts made to solve the Travelling Salesman Problem with Genetic Algorithms. We present crossover and mutation operators developed to tackle the Travelling Salesman Problem with Genetic Algorithms with different representations such as: binary representation, path representation, adjacency representation, ordinal representation and matrix representation. Likewise, we show the experimental results obtained with different standard examples using combinations of crossover and mutation operators in relation with path representation. Keywords: Travelling Salesman Problem; Genetic Algorithms; Binary representation; Path representation; Adjacency representation; Ordinal representation; Matrix representation; Hybridization. 1 Introduction In nature, there exist many processes which seek a stable state. These processes can be seen as natural optimization processes. Over the last...
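As one concrete example of the kind of path-representation crossover the review surveys, here is a simplified variant of order crossover (OX). The function name and the left-to-right refill are this sketch's own choices; canonical OX fills starting after the copied slice:

```python
def order_crossover(p1, p2, i, j):
    """Simplified order crossover (OX) on the path representation:
    copy the slice p1[i:j] into the child, then fill the remaining
    positions left to right with p2's cities in p2's order, skipping
    cities already copied. Always yields a valid permutation."""
    child = [None] * len(p1)
    child[i:j] = p1[i:j]
    copied = set(p1[i:j])
    fill = iter(c for c in p2 if c not in copied)
    for k in range(len(p1)):
        if child[k] is None:
            child[k] = next(fill)
    return child
```

For example, crossing [1,2,3,4,5,6,7,8] with [8,7,6,5,4,3,2,1] over slice [2:5] keeps cities 3, 4, 5 in place and fills the rest from the second parent.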
Memetic Algorithms for Combinatorial Optimization Problems: Fitness Landscapes and Effective Search Strategies
, 2001
Applying evolutionary programming to selected traveling salesman problems
 Cybernetics & Systems
, 1993
Abstract

Cited by 42 (0 self)
Evolutionary programming is a stochastic optimization procedure that can be applied to difficult combinatorial problems. Experiments are conducted with three standard optimal control problems (linear-quadratic, harvest, and push-cart). The results are compared to those obtained with genetic algorithms and the General Algebraic Modeling System (GAMS), a numerical optimization software package. The results indicate that evolutionary programming generally outperforms genetic algorithms. Evolutionary programming also compares well with GAMS on certain problems for which GAMS is specifically designed and outperforms GAMS on other problems. The computational requirements for each procedure are briefly discussed.
An Empirical Comparison of Selection Methods in Evolutionary Algorithms
, 1994
Abstract

Cited by 28 (0 self)
Selection methods in Evolutionary Algorithms, including Genetic Algorithms, Evolution Strategies (ES) and Evolutionary Programming (EP), are compared by observing the rate of convergence on three idealised problems. The first considers selection only, the second introduces mutation as a source of variation, and the third also adds evaluation noise. Fitness-proportionate selection suffers from scaling problems: a number of techniques to reduce these are illustrated. The sampling errors caused by roulette-wheel and tournament selection are demonstrated. The EP selection model is shown to be almost equivalent to an ES model in one form, and surprisingly similar to fitness-proportionate selection in another. Generational models are shown to be remarkably immune to evaluation noise; models that retain parents are much less so. 1 Introduction Selection provides the driving force in an evolutionary algorithm (EA) and the selection pressure is a critical parameter. Too much, and the search will te...
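The two sampling schemes whose errors the paper compares can be sketched as follows. Both functions are illustrative, assuming maximization with non-negative fitness values; the names are this sketch's own:

```python
import random

def roulette(fitnesses, rng=random):
    """Fitness-proportionate (roulette-wheel) selection: individual i
    is chosen with probability fitnesses[i] / sum(fitnesses)."""
    r = rng.uniform(0, sum(fitnesses))
    acc = 0.0
    for i, f in enumerate(fitnesses):
        acc += f
        if r <= acc:
            return i
    return len(fitnesses) - 1  # guard against rounding at the top end

def tournament(fitnesses, k=2, rng=random):
    """Tournament selection: draw k distinct individuals uniformly at
    random and keep the fittest; selection pressure grows with k."""
    contenders = rng.sample(range(len(fitnesses)), k)
    return max(contenders, key=lambda i: fitnesses[i])
```

Roulette selection depends on absolute fitness values (hence the scaling problems the paper notes), while tournament selection depends only on rank.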
An evolutionary programming approach to self-adaptation on finite state machines
 Proceedings of the 4th Annual Conference on Evolutionary Programming
, 1995
Abstract

Cited by 25 (2 self)
Evolutionary programming was first offered as an alternative method for generating artificial intelligence. Experiments were offered in which finite state machines were used to predict time series with respect to an arbitrary payoff function. Mutations were imposed on the evolving machines such that each of the possible modes of variation was given equal probability. The current study investigates the use of self-adaptive methods of evolutionary programming on finite state machines. Each machine incorporates a coding for its structure and an additional set of parameters that determine in part how it will distribute new trials. Two methods for accomplishing this self-adaptation are implemented and tested on two simple prediction problems. The results appear to favor the use of such self-adaptive methods.
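The paper's self-adaptation is defined on finite state machines; the generic ES/EP-style log-normal scheme below conveys the same idea on real-valued parameters and is only an analogy, not the paper's operator:

```python
import math
import random

def self_adaptive_mutate(x, sigma, tau=None, rng=random):
    """Log-normal self-adaptation in the ES/EP style: the individual
    carries its own step sizes sigma alongside the object variables x.
    The strategy parameters are mutated first, and the new step sizes
    then perturb the object variables. tau defaults to 1/sqrt(2n)."""
    n = len(x)
    tau = tau if tau is not None else 1.0 / math.sqrt(2.0 * n)
    new_sigma = [s * math.exp(tau * rng.gauss(0.0, 1.0)) for s in sigma]
    new_x = [xi + si * rng.gauss(0.0, 1.0) for xi, si in zip(x, new_sigma)]
    return new_x, new_sigma
```

Because selection acts on the whole individual, step sizes that produce good offspring are inherited along with them, which is the mechanism the paper tests on machine structures.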
Memetic Algorithms for the Traveling Salesman Problem
 Complex Systems
, 1997
Abstract

Cited by 25 (7 self)
In this paper, the fitness landscapes of several instances of the traveling salesman problem (TSP) are investigated to illustrate why MAs are well-suited for finding near-optimum tours for the TSP. It is shown that recombination-based MAs can exploit the correlation structure of the landscape. A comparison of several recombination operators, including a new generic recombination operator, reveals that when using the sophisticated Lin-Kernighan local search, the performance difference of the MAs is small. However, the most important property of effective recombination operators is shown to be respectfulness. In experiments it is shown that our MAs with generic recombination are among the best evolutionary algorithms for the TSP. In particular, optimum solutions could be found up to a problem size of 3795, and for large instances up to 85,900 cities, near-optimum solutions could be found in a reasonable amount of time.
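The local-search component of such a memetic algorithm can be illustrated with plain 2-opt, a much weaker stand-in for the Lin-Kernighan heuristic the paper actually uses:

```python
import math

def two_opt(tour, dist):
    """First-improvement 2-opt local search on a cyclic tour: while any
    pair of edges (a,b),(c,d) can be replaced by (a,c),(b,d) with a
    shorter total, reverse the segment between them."""
    n = len(tour)
    improved = True
    while improved:
        improved = False
        for i in range(1, n - 1):
            for j in range(i + 1, n):
                a, b = tour[i - 1], tour[i]
                c, d = tour[j], tour[(j + 1) % n]
                if dist[a][b] + dist[c][d] > dist[a][c] + dist[b][d] + 1e-12:
                    tour[i:j + 1] = reversed(tour[i:j + 1])
                    improved = True
    return tour
```

In an MA, every offspring produced by recombination is handed to such a local search before selection, so the population consists only of local optima.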
An Empirical Study of Genetic Operators in Genetic Algorithms
, 1993
Abstract

Cited by 19 (15 self)
Genetic algorithms are multiagent search strategies applicable to a wide range of problems. However, it is often very difficult in practice to design an optimal set of genetic operators for a problem, because a genetic operator which works well for a problem might not work well for another problem. This paper tries to understand when and why some genetic operators are useful and how we can combine them together to improve the performance of genetic algorithms. Experiments have been carried out to analyse the role of crossover and mutation as well as selection mechanisms. Several genetic algorithms with different genetic operators and selection mechanisms are used in the empirical study. The study suggests that "greedy" crossover and "hard" selection with a low mutation rate often give genetic algorithms better performance.
Inver-over Operator for the TSP
 in Proc. PPSN
, 1998
Abstract

Cited by 17 (0 self)
In this paper we investigate the usefulness of a new operator, inver-over, for an evolutionary algorithm for the TSP. Inver-over is based on simple inversion; however, knowledge taken from other individuals in the population influences its action. Thus, on the one hand, the proposed operator is unary, since the inversion is applied to a segment of a single individual; however, the selection of a segment to be inverted is population driven, thus the operator displays some characteristics of recombination. This operator outperforms all other "genetic" operators, whether unary or binary, which have been proposed in the past for the TSP in connection with evolutionary systems, and the resulting evolutionary algorithm is very fast. For test cases where the number of cities is around 100, the algorithm reaches the optimum in every execution in a couple of seconds. For larger instances (e.g., 10,000 cities), the results stay within 3% of the estimated optimum.
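A sketch of the inver-over operator as described: segment inversions guided by another individual, with a small probability p of a purely random inversion. The wrap-around handling and the iteration cap are simplifications of this sketch, not part of the published operator:

```python
import random

def inver_over(pop, idx, p=0.02, rng=random):
    """Inver-over on a path-represented tour: repeatedly invert the
    segment between the city after c and a city c2, where c2 is
    usually the successor of c in another individual (population-
    driven inversion). Stops once c2 is already adjacent to c."""
    s = pop[idx][:]           # work on a copy; the parent is untouched
    c = rng.choice(s)
    for _ in range(10 * len(s)):  # safety cap for this sketch
        if rng.random() < p or len(pop) == 1:
            c2 = rng.choice([x for x in s if x != c])      # rare random inversion
        else:
            other = pop[rng.randrange(len(pop))]           # guiding individual
            c2 = other[(other.index(c) + 1) % len(other)]  # successor of c there
        i = s.index(c)
        if s[(i + 1) % len(s)] == c2 or s[i - 1] == c2:
            break  # c2 already adjacent to c in s: done
        i1 = (i + 1) % len(s)
        j = s.index(c2)
        if i1 <= j:
            s[i1:j + 1] = reversed(s[i1:j + 1])
        else:
            # segment wraps around: rotate the cyclic tour so it is contiguous
            s = s[i1:] + s[:i1]
            j = s.index(c2)
            s[:j + 1] = reversed(s[:j + 1])
        c = c2
    return s
```

Each inversion makes c2 adjacent to c, so successive steps pull edges of the guiding tours into the offspring, which is why the unary operator behaves partly like recombination.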