Results 1-10 of 256
Evolving Artificial Neural Networks
, 1999
Abstract

Cited by 411 (6 self)
This paper: 1) reviews different combinations of ANNs and evolutionary algorithms (EAs), including using EAs to evolve ANN connection weights, architectures, learning rules, and input features; 2) discusses different search operators that have been used in various EAs; and 3) points out possible future research directions. It is shown, through a considerable literature review, that combinations of ANNs and EAs can lead to significantly better intelligent systems than relying on ANNs or EAs alone.
Evolutionary Programming Made Faster
 IEEE Transactions on Evolutionary Computation
, 1999
Abstract

Cited by 206 (36 self)
Evolutionary programming (EP) has been applied with success to many numerical and combinatorial optimization problems in recent years. EP has rather slow convergence rates, however, on some function optimization problems. In this paper, a "fast EP" (FEP) is proposed which uses Cauchy rather than Gaussian mutation as the primary search operator. The relationship between FEP and classical EP (CEP) is similar to that between fast simulated annealing and the classical version. Both analytical and empirical studies have been carried out to evaluate the performance of FEP and CEP on different function optimization problems. This paper shows that FEP is very good at searching a large neighborhood, while CEP is better at searching a small local neighborhood. For a suite of 23 benchmark problems, FEP performs much better than CEP on multimodal functions with many local minima while being comparable to CEP in performance on unimodal and multimodal functions with only a few local minima. This paper also shows the relationship between the search step size and the probability of finding a global optimum, and thus explains why FEP performs better than CEP on some functions but not on others. In addition, the importance of the neighborhood size and its relationship to the probability of finding a near-optimum is investigated. Based on these analyses, an improved FEP (IFEP) is proposed and tested empirically. This technique mixes different search operators (mutations). The experimental results show that IFEP performs better than, or as well as, the better of FEP and CEP on most benchmark problems tested.
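As a toy illustration of the operator difference this abstract describes, the sketch below plugs Gaussian (CEP-style) and Cauchy (FEP-style) mutation into the same minimal (1+1) hill climber on a sphere function. This is a stand-in for the paper's full (mu, lambda) EP with self-adapted step sizes; the function names and parameter values are illustrative assumptions.

```python
import math
import random

def gaussian_mutation(x, eta, rng):
    # CEP-style operator: Gaussian perturbation with fixed step size eta
    return [xi + eta * rng.gauss(0.0, 1.0) for xi in x]

def cauchy_mutation(x, eta, rng):
    # FEP-style operator: standard Cauchy perturbation via the inverse CDF;
    # the heavier tails make long jumps out of poor regions far more likely
    return [xi + eta * math.tan(math.pi * (rng.random() - 0.5)) for xi in x]

def minimize(mutate, f, dim=10, eta=0.5, generations=2000, seed=0):
    # Toy (1+1) scheme: keep the offspring only if it improves f
    rng = random.Random(seed)
    x = [rng.uniform(-5.0, 5.0) for _ in range(dim)]
    fx = f(x)
    for _ in range(generations):
        y = mutate(x, eta, rng)
        fy = f(y)
        if fy < fx:
            x, fx = y, fy
    return fx

sphere = lambda x: sum(xi * xi for xi in x)
print(minimize(gaussian_mutation, sphere))
print(minimize(cauchy_mutation, sphere))
```

On a unimodal sphere both operators descend; the paper's point is that the Cauchy operator's advantage shows up on multimodal landscapes with many local minima.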
A cooperative coevolutionary approach to function optimization
, 1994
Abstract

Cited by 161 (10 self)
A general model for the coevolution of cooperating species is presented. This model is instantiated and tested in the domain of function optimization and compared with a traditional GA-based function optimizer. The results are encouraging in two respects: they suggest ways in which the performance of GA- and other EA-based optimizers can be improved, and they suggest a new approach to evolving complex structures such as neural networks and rule sets.
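The collaboration idea can be sketched as follows. The one-species-per-variable decomposition follows the abstract; the greedy "best collaborator" credit assignment and the simple hill-climbing update inside each species are simplifying assumptions, not the paper's exact algorithm.

```python
import random

def ccga_minimize(f, dim=4, pop_size=20, generations=150, seed=1):
    # Cooperative coevolution sketch: each species evolves one decision
    # variable, and an individual is evaluated by splicing it into a
    # composite solution built from the current best member of every
    # other species.
    rng = random.Random(seed)
    pops = [[rng.uniform(-5.0, 5.0) for _ in range(pop_size)] for _ in range(dim)]
    best = [pop[0] for pop in pops]

    def fitness(i, v):
        trial = best[:]
        trial[i] = v          # substitute candidate v into position i
        return f(trial)

    for _ in range(generations):
        for i in range(dim):
            for j in range(pop_size):
                child = pops[i][j] + rng.gauss(0.0, 0.3)
                if fitness(i, child) < fitness(i, pops[i][j]):
                    pops[i][j] = child
            best[i] = min(pops[i], key=lambda v: fitness(i, v))
    return best, f(best)

solution, value = ccga_minimize(lambda x: sum(xi * xi for xi in x))
print(solution, value)
```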
On the Adaptation of Arbitrary Normal Mutation Distributions in Evolution Strategies: The Generating Set Adaptation
, 1995
Abstract

Cited by 155 (23 self)
A new adaptation scheme for adapting arbitrary normal mutation distributions in evolution strategies is introduced. It can adapt correct scaling and correlations between object parameters. Furthermore, it is independent of any rotation of the objective function and reliably adapts mutation distributions corresponding to hyperellipsoids with high axis ratios. In simulations, the generating set adaptation is compared with two other schemes that can also produce non-axis-parallel mutation ellipsoids. It turns out to be the only adaptation scheme that is completely independent of the chosen coordinate system.
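The core sampling idea, building an arbitrary normal mutation from a set of generating vectors, can be sketched as below. The `adapt` update is a loose illustration of adapting the set toward successful steps, not the paper's actual generating set adaptation rule.

```python
import random

def sample_step(generators, rng):
    # A normal mutation vector is a sum of the generating vectors, each
    # weighted by an independent N(0,1) coefficient; the generator set
    # therefore encodes scaling and correlations without reference to
    # any particular coordinate system.
    n = len(generators[0])
    coeffs = [rng.gauss(0.0, 1.0) for _ in generators]
    return [sum(c * g[k] for c, g in zip(coeffs, generators)) for k in range(n)]

def adapt(generators, successful_step, shrink=0.9, weight=0.3):
    # Crude adaptation sketch (an assumption, not the paper's update rule):
    # shrink the remaining generators, drop the oldest, and append the
    # latest successful step, so the sampled distribution elongates along
    # directions that recently produced improvements.
    new = [[shrink * x for x in g] for g in generators[1:]]
    new.append([weight * s for s in successful_step])
    return new

rng = random.Random(0)
basis = [[1.0, 0.0], [0.0, 1.0]]
step = sample_step(basis, rng)
print(step, adapt(basis, step))
```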
Multiobjective Optimization and Multiple Constraint Handling with Evolutionary Algorithms, Part I: A Unified Formulation
 IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans
, 1998
Abstract

Cited by 153 (8 self)
In optimization, multiple objectives and constraints cannot be handled independently of the underlying optimizer. Requirements such as continuity and differentiability of the cost surface add yet another conflicting element to the decision process. While "better" solutions should be rated higher than "worse" ones, the resulting cost landscape must also comply with such requirements. Evolutionary algorithms (EAs), which have found application in many areas not amenable to optimization by other methods, possess many characteristics desirable in a multiobjective optimizer, most notably the concerted handling of multiple candidate solutions. However, EAs are essentially unconstrained search techniques which require the assignment of a scalar measure of quality, or fitness, to such candidate solutions. After reviewing current evolutionary approaches to multiobjective and constrained optimization, the paper proposes that fitness assignment be interpreted as, or at least related to, a multicriterion decision process. A suitable decision-making framework based on goals and priorities is subsequently formulated in terms of a relational operator, characterized, and shown to encompass a number of simpler decision strategies. Finally, the ranking of an arbitrary number of candidates is considered. The effect of preference changes on the cost surface seen by an EA is illustrated graphically for a simple problem. The paper concludes with the formulation of a multiobjective genetic algorithm based on the proposed decision strategy. Niche formation techniques are used to promote diversity among preferable candidates, and progressive articulation of preferences is shown to be possible as long as the genetic algorithm can recover from abrupt changes in the cost landscape.
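A minimal piece of the machinery involved is Pareto dominance and rank-by-dominance-count fitness assignment. The sketch below shows only that piece; the goal and priority information that the paper's relational operator adds on top is omitted.

```python
def dominates(a, b):
    # a Pareto-dominates b (minimization assumed): a is no worse in
    # every objective and strictly better in at least one
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def rank(points):
    # Dominance-count ranking: each point's rank is the number of
    # individuals that dominate it (rank 0 = nondominated)
    return [sum(dominates(q, p) for q in points) for p in points]

pts = [(1, 4), (2, 2), (3, 3), (4, 1)]
print(rank(pts))  # [0, 0, 1, 0]: only (3, 3) is dominated, by (2, 2)
```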
Self-Adaptation in Genetic Algorithms
 Proceedings of the First European Conference on Artificial Life
, 1992
Abstract

Cited by 115 (2 self)
Within Genetic Algorithms (GAs) the mutation rate is mostly handled as a global, external parameter which is constant over time or exogenously changed over time. In this paper a new approach is presented which transfers a basic idea from Evolution Strategies (ESs) to GAs: mutation rates become endogenous parameters that adapt during the search process. First experimental results are presented which indicate that environment-dependent self-adaptation of appropriate settings for the mutation rate is possible even for GAs. Furthermore, the reduction of the number of external parameters of a GA is seen as a first step towards achieving a problem-dependent self-adaptation of the algorithm.

Introduction
Natural evolution has proven to be a powerful mechanism for the emergence and improvement of the living beings on our planet by performing a randomized search in the space of possible DNA sequences. Due to this knowledge about the qualities of natural evolution, some resea...
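A minimal sketch of the mechanism, assuming an ES-style log-normal update of the rate; the paper itself encodes the mutation rate in extra bits of the genome, so the update rule below is an illustrative substitution.

```python
import math
import random

def self_adaptive_step(bits, rate, rng, tau=0.2):
    # The mutation rate travels with the individual: it is mutated first
    # (log-normally here, an ES-style assumption) and the NEW rate is
    # then applied to flip the object bits, so good rates hitch-hike
    # with the good solutions they produce.
    new_rate = min(0.5, max(0.001, rate * math.exp(tau * rng.gauss(0.0, 1.0))))
    new_bits = [b ^ (rng.random() < new_rate) for b in bits]
    return new_bits, new_rate

rng = random.Random(0)
child, r = self_adaptive_step([0] * 20, 0.05, rng)
print(child, r)
```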
Optimal Mutation Rates in Genetic Search
Abstract

Cited by 114 (0 self)
The optimization of a single bit string by means of iterated mutation and selection of the best (a (1+1) Genetic Algorithm) is discussed with respect to three simple fitness functions: the counting-ones problem, a standard binary-encoded integer, and a Gray-coded integer optimization problem. A mutation rate schedule that is optimal with respect to the success probability of mutation is presented for each of the objective functions, and it turns out that the standard binary code can hamper the search process even in the case of unimodal objective functions. While normally a mutation rate of 1/l (where l denotes the bit-string length) is recommended, our results indicate that a variation of the mutation rate is useful in cases where the fitness function is a multimodal pseudo-boolean function, where multimodality may be caused by the objective function as well as by the encoding mechanism.
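The recommended fixed rate of 1/l is easy to exercise on the counting-ones problem with a (1+1) scheme; a minimal sketch (parameter values illustrative):

```python
import random

def one_plus_one_ga(l=50, max_gens=20000, seed=3):
    # (1+1)-GA on counting ones with per-bit mutation rate 1/l;
    # returns the generation at which the all-ones optimum is reached
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(l)]
    for gen in range(max_gens):
        y = [b ^ (rng.random() < 1.0 / l) for b in x]
        if sum(y) >= sum(x):   # accept ties so the search can drift
            x = y
        if sum(x) == l:
            return gen
    return max_gens

print(one_plus_one_ga())
```

With l = 50 the optimum is typically reached after a few hundred generations, on the order of l log l, which is the well-known expected hitting time for this problem.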
Toward the Evolution of Dynamical Neural Networks for Minimally Cognitive Behavior
, 1996
Abstract

Cited by 106 (9 self)
Current debates regarding the possible cognitive implications of ideas from adaptive behavior research and dynamical systems theory would benefit greatly from a careful study of simple model agents that exhibit minimally cognitive behavior. This paper sketches one such agent and presents the results of preliminary experiments on the evolution of dynamical neural networks for visually-guided orientation, object discrimination, and accurate pointing with a simple manipulator at objects appearing in its field of view.

1 Introduction
Many of the key ideas emphasized in adaptive behavior research are beginning to have a significant impact on cognitive science. For example, adaptive behavior research in general, and the dynamical perspective on adaptive behavior that is often taken in such research in particular, have begun to significantly influence the growing debates concerning the nature and necessity of notions of representation and computation in explaining cognitive behavio...
The Equation for the Response to Selection and Its Use for Prediction
, 1997
Abstract

Cited by 103 (15 self)
The Breeder Genetic Algorithm (BGA) was designed according to the theories and methods used in the science of livestock breeding. The prediction of a breeding experiment is based on the response to selection (RS) equation. This equation relates the change in a population's fitness to the standard deviation of its fitness, as well as to the parameters selection intensity and realized heritability. In this paper the exact RS equation is derived for proportionate selection given an infinite population in linkage equilibrium. In linkage equilibrium the genotype frequencies are the product of the univariate marginal frequencies. The equation contains Fisher's fundamental theorem of natural selection as an approximation. The theorem shows that the response is approximately equal to the quotient of a quantity called the additive genetic variance, V_A, and the average fitness. We compare Mendelian two-parent recombination with gene-pool recombination, which belongs to a special class of genetic ...
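In breeders' notation, the response-to-selection relationship the abstract describes can be summarized as follows; this is an interpretive sketch with symbols as commonly defined in the quantitative-genetics literature, not formulas quoted from the paper:

```latex
% Response to selection per generation:
%   R(t): change in mean fitness,  I: selection intensity,
%   b(t): realized heritability,   \sigma_p(t): fitness standard deviation
R(t) = b(t)\, I\, \sigma_p(t)

% Fisher's fundamental theorem as an approximation (proportionate selection):
%   V_A(t): additive genetic variance,  \bar{f}(t): mean fitness
R(t) \approx \frac{V_A(t)}{\bar{f}(t)}
```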
The Science of Breeding and its Application to the Breeder Genetic Algorithm (BGA)
 EVOLUTIONARY COMPUTATION
, 1994
Abstract

Cited by 100 (23 self)
The Breeder Genetic Algorithm (BGA) models artificial selection as performed by human breeders. The science of breeding is based on advanced statistical methods. In this paper a connection between genetic algorithm theory and the science of breeding is made. We show how the response to selection equation and the concept of heritability can be applied to predict the behavior of the BGA. Selection, recombination, and mutation are analyzed within this framework. It is shown that recombination and mutation are complementary search operators. The theoretical results are obtained under the assumption of additive gene effects. For general fitness landscapes, regression techniques for estimating the heritability are used to analyze and control the BGA. The method of decomposing the genetic variance into an additive and a non-additive part connects the case of additive fitness functions with the general case.