Results 1–10 of 30
Evolutionary computation: Comments on the history and current state
IEEE Transactions on Evolutionary Computation, 1997
Cited by 274 (0 self)
Abstract
Evolutionary computation has started to receive significant attention during the last decade, although the origins can be traced back to the late 1950’s. This article surveys the history as well as the current state of this rapidly growing field. We describe the purpose, the general structure, and the working principles of different approaches, including genetic algorithms (GA) [with links to genetic programming (GP) and classifier systems (CS)], evolution strategies (ES), and evolutionary programming (EP) by analysis and comparison of their most important constituents (i.e., representations, variation operators, reproduction, and selection mechanism). Finally, we give a brief overview on the manifold of application domains, although this necessarily must remain incomplete.
Self-Adaptive Genetic Algorithms with Simulated Binary Crossover
Complex Systems, 1999
Cited by 84 (12 self)
Abstract
Self-adaptation is an essential feature of natural evolution. However, in the context of function optimization, self-adaptation features of evolutionary search algorithms have been explored only with evolution strategies (ES) and evolutionary programming (EP). In this paper, we demonstrate the self-adaptive feature of real-parameter genetic algorithms (GAs) using the simulated binary crossover (SBX) operator and without any mutation operator. The connection between the working of self-adaptive ESs and real-parameter GAs with the SBX operator is also discussed. Thereafter, the self-adaptive behavior of real-parameter GAs is demonstrated on a number of test problems commonly used in the ES literature. The remarkable similarity in the working principle of real-parameter GAs and self-adaptive ESs shown in this study suggests the need for further studies on self-adaptive GAs.
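The SBX operator referred to in this abstract can be sketched in a few lines. This is a generic illustration of simulated binary crossover, not the paper's exact setup; the distribution index `eta` and the mean-preserving form are the standard textbook version, and the default parameter value is illustrative:

```python
import random

def sbx_crossover(p1, p2, eta=2.0, rng=None):
    """Simulated binary crossover (SBX) on two real-valued parent vectors.

    A spread factor beta is drawn per coordinate from a polynomial
    distribution shaped by the distribution index eta: larger eta keeps
    children closer to their parents, which is what gives real-parameter
    GAs with SBX their ES-like, self-adaptive search behaviour."""
    rng = rng or random.Random()
    c1, c2 = [], []
    for x1, x2 in zip(p1, p2):
        u = rng.random()
        if u <= 0.5:
            beta = (2.0 * u) ** (1.0 / (eta + 1.0))
        else:
            beta = (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta + 1.0))
        c1.append(0.5 * ((1.0 + beta) * x1 + (1.0 - beta) * x2))
        c2.append(0.5 * ((1.0 - beta) * x1 + (1.0 + beta) * x2))
    return c1, c2
```

By construction `c1[i] + c2[i] == p1[i] + p2[i]`: SBX preserves the parents' mean and only randomizes the spread around it.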
Contemporary Evolution Strategies
1995
Cited by 71 (2 self)
Abstract
After an outline of the history of evolutionary algorithms, a new (μ, κ, λ, ρ) variant of the evolution strategies is introduced formally. Though not comprising all degrees of freedom, it is richer in features than the older (μ, λ) and (μ+λ) versions. Finally, all important theoretically proven facts about evolution strategies are briefly summarized, and some of the many open questions concerning evolutionary algorithms in general are pointed out.
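For readers less familiar with the notation, the two classical selection schemes of evolution strategies, (μ, λ) and (μ+λ), differ only in whether parents compete with their offspring. A minimal sketch of that distinction (illustrative only; the paper's richer variant adds further parameters such as a maximum lifespan):

```python
def es_select(parents, offspring, mu, plus=False):
    """One selection step of an evolution strategy, minimising fitness.

    Individuals are (fitness, solution) pairs.  plus=False gives
    (mu, lambda) selection: only the offspring survive, so even the
    best parent can be lost.  plus=True gives (mu + lambda) selection:
    parents and offspring compete together, which is elitist."""
    pool = parents + offspring if plus else offspring
    return sorted(pool, key=lambda ind: ind[0])[:mu]
```

With `parents = [(1.0, "p")]` and `offspring = [(2.0, "a"), (3.0, "b")]`, plus selection keeps the fitter parent while comma selection is forced to accept the best offspring instead.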
Evolution strategies for mixed-integer optimization of optical multilayer systems
Evolutionary Programming IV – Proceedings of the Fourth Annual Conference on Evolutionary Programming (EP95), 1995
Cited by 29 (2 self)
Abstract
An extension of the evolution strategy for mixed-integer optimization problems is introduced. The resulting generalized evolution strategy is applied to the problem of optical multilayer coating design, and the results are compared with results obtained by standard methods. The generalized evolution strategy as a synthesis method does not require the existence of a starting design, and it competes well with refinement methods for the optimization of starting designs. The results are very encouraging and indicate that this method is a robust and helpful algorithm for optical multilayer design. Furthermore, the generalized evolution strategy is not a tailored heuristic but can be used for arbitrary mixed-integer optimization problems.
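One common way such a generalized, mixed-integer ES mutates a candidate is to take Gaussian steps on the continuous variables and, on the integer variables, to add the difference of two geometrically distributed variables, which yields a symmetric integer-valued step centred at zero. A hedged sketch of that operator shape (the names `sigma` and `p_geo` and their values are illustrative, not the paper's settings):

```python
import random

def mutate_mixed(x_real, x_int, sigma=0.3, p_geo=0.5, rng=None):
    """Mutate a mixed-integer candidate: Gaussian perturbation on the
    real-valued part, and on the integer part a step drawn as the
    difference of two geometric variables (symmetric around 0, so the
    mutation is unbiased and stays integer-valued)."""
    rng = rng or random.Random()
    reals = [x + rng.gauss(0.0, sigma) for x in x_real]

    def geo():
        # number of failures before the first success, support {0, 1, ...}
        k = 0
        while rng.random() >= p_geo:
            k += 1
        return k

    ints = [z + geo() - geo() for z in x_int]
    return reals, ints
```

Smaller `p_geo` produces larger typical integer steps, playing a role analogous to `sigma` on the continuous part.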
Particle Swarm Optimization for Integer Programming
In Proceedings of the IEEE 2002 Congress on Evolutionary Computation, 2002
Cited by 27 (9 self)
Abstract
The investigation of the performance of the Particle Swarm Optimization (PSO) method on Integer Programming problems is the main theme of the present paper. Three variants of PSO are compared with the widely used Branch and Bound technique on several Integer Programming test problems. Results indicate that PSO handles such problems efficiently, and in most cases it outperforms the Branch and Bound technique.
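The simplest way PSO is typically adapted to integer problems, and the basic idea behind the variants compared here, is to run a standard PSO in real space and round each position to the nearest integer before evaluating the objective. A sketch of that idea (a generic global-best PSO with commonly used coefficient values; not the paper's exact parameterisation):

```python
import random

def pso_integer(f, dim, bounds, n_particles=20, iters=100, seed=0):
    """Minimise f over integer vectors with a basic global-best PSO.

    Particles move in continuous space; candidate solutions are obtained
    by rounding each coordinate to the nearest integer before calling f."""
    rng = rng_ = random.Random(seed)
    lo, hi = bounds
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f([round(x) for x in p]) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f([round(x) for x in pos[i]])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return [round(x) for x in gbest], gbest_val
```

For example, minimising the sphere function `f(x) = sum(v*v for v in x)` over three integer variables in [-10, 10] returns an integer vector; the best value found can only improve monotonically over the iterations.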
Memetic particle swarm optimization
2007
Cited by 21 (1 self)
Abstract
We propose a new Memetic Particle Swarm Optimization scheme that incorporates local search techniques into the standard Particle Swarm Optimization algorithm, resulting in an efficient and effective optimization method, which is analyzed theoretically. The proposed algorithm is applied to different unconstrained, constrained, minimax, and integer programming problems, and the obtained results are compared to those of the global and local variants of Particle Swarm Optimization, demonstrating the superiority of the memetic approach.
Effects of scale-free and small-world topologies on binary coded self-adaptive
2006
Cited by 13 (2 self)
Abstract
In this paper we investigate the properties of cellular evolutionary algorithms (CEAs) with populations structured as Watts–Strogatz small-world graphs and Albert–Barabási scale-free graphs as problem solvers, using several standard discrete optimization problems as a benchmark. The EA variants employed include self-adaptation of mutation rates. Results are compared with the corresponding classical panmictic EA, showing that topology together with self-adaptation drastically influences the search.
A Partial Order Approach to Noisy Fitness Functions
In Congress on Evolutionary Computation, Seoul, Korea, 2001
Cited by 12 (1 self)
Abstract
The Gaussian distribution is the predominant choice for modeling the noise frequently observable in measurements of various kinds. Here, we hold the view that a noise distribution with unbounded support (like the Gaussian, Cauchy, Laplace, Logistic, and others) may be quite unrealistic. Actually, it is at least equally plausible to assume that the noise cannot exceed certain limits due to technical characteristics of the measurement unit involved. Even if a distributional shape close to a Gaussian appears reasonable, we can resort to a symmetrical Beta distribution, which can converge weakly to a Gaussian distribution under continuously increasing but bounded support (see e.g. Evans et al. 1993, p. 36). This assumption has significant theoretical and practical impacts on the evolutionary algorithms (EAs) considered here. Traditional measures for coping with noisy fitness functions in evolutionary algorithms include the resampling of the random fitness value with averaging ...
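The bounded-noise model advocated here is easy to state concretely: rescale a symmetric Beta(b, b) variable from (0, 1) to (-eps, +eps) and add it to the true fitness. A small sketch under that assumption (the shape parameter `b` and bound `eps` are illustrative; for growing `b` with widening support the distribution approaches a Gaussian shape, as the text notes):

```python
import random

def noisy_fitness(value, eps=1.0, b=5.0, rng=None):
    """Return value perturbed by symmetric, *bounded* noise.

    The noise is a Beta(b, b) variable rescaled from (0, 1) to
    (-eps, +eps); unlike Gaussian noise it can never exceed +/- eps,
    mimicking a measurement device with hard technical limits."""
    rng = rng or random.Random()
    u = rng.betavariate(b, b)            # symmetric around 0.5
    return value + eps * (2.0 * u - 1.0) # mapped to [-eps, +eps]
```

Every noisy observation therefore lies in `[value - eps, value + eps]`, and the noise has mean zero, so resampling-and-averaging still estimates the true fitness.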
Mixed-Integer Evolution Strategy for Chemical Plant Optimization with Simulators
2000
Cited by 9 (3 self)
Abstract
The optimization of chemical engineering plants is still a challenging task. Economical evaluations of a process flowsheet using rigorous simulation models are very time consuming. Furthermore, many different types of parameters can be involved in the optimization procedure, resulting in highly restricted mixed-integer nonlinear objective functions. Evolution Strategies (ES) are a promising, robust, and flexible optimization technique for such problems. Motivated by a typical chemical process optimization problem, in this paper a non-standard ES is presented which deals with nominal discrete, metric integer, and metric continuous parameters taken from limited domains. Genetic operators from the literature are combined and adapted. Experimental results on test functions and an application example, the parameter optimization of an HDA process, show the robust convergence behaviour of the algorithm even for small population sizes.
On Representation and Genetic Operators in Evolutionary Algorithms
IEEE Transactions on Evolutionary Computation, 1998
Cited by 7 (1 self)
Abstract
The application of evolutionary algorithms (EAs) requires as a basic design decision the choice of a suitable representation of the variable space and appropriate genetic operators. In practice, mainly problem-specific representations with specific genetic operators and miscellaneous extensions can be observed. In this connection, it is striking that hardly any formal requirements on the genetic operators are stated. In this article we first formalize the representation problem and then propose a package of requirements to guide the design of genetic operators. By the definition of distance measures on the geno- and phenotype spaces it is possible to integrate problem-specific knowledge into the genetic operators. As an example we show how this package of requirements can be used to design a genetic programming (GP) system for finding Boolean functions.