Results 1–10 of 11
The Advantages of Evolutionary Computation
, 1997
"... Evolutionary computation is becoming common in the solution of difficult, realworld problems in industry, medicine, and defense. This paper reviews some of the practical advantages to using evolutionary algorithms as compared with classic methods of optimization or artificial intelligence. Specific ..."
Abstract

Cited by 398 (5 self)
 Add to MetaCart
Evolutionary computation is becoming common in the solution of difficult, real-world problems in industry, medicine, and defense. This paper reviews some of the practical advantages to using evolutionary algorithms as compared with classic methods of optimization or artificial intelligence. Specific advantages include the flexibility of the procedures, as well as the ability to self-adapt the search for optimum solutions on the fly. As desktop computers increase in speed, the application of evolutionary algorithms will become routine.
1 Introduction
Darwinian evolution is intrinsically a robust search and optimization mechanism. Evolved biota demonstrate optimized complex behavior at every level: the cell, the organ, the individual, and the population. The problems that biological species have solved are typified by chaos, chance, temporality, and nonlinear interactivities. These are also characteristics of problems that have proved to be especially intractable to classic methods of o...
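The abstract above describes evolutionary algorithms as robust, self-adapting optimizers. As a rough illustration (not taken from the paper), a minimal elitist evolutionary loop with bit-flip mutation on a toy OneMax objective might look like the sketch below; all names and parameter choices are illustrative assumptions:

```python
import random

def evolve(objective, n_genes=10, pop_size=20, generations=100, seed=0):
    """Minimal (mu + lambda)-style evolutionary loop maximizing `objective`
    over fixed-length bitstrings, with bit-flip mutation as the only
    variation operator and truncation (elitist) selection."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(generations):
        # Each parent produces one offspring by per-bit mutation (rate 1/n_genes).
        offspring = [[b ^ (rng.random() < 1.0 / n_genes) for b in parent]
                     for parent in pop]
        # Truncation selection: keep the best pop_size of parents + offspring.
        pop = sorted(pop + offspring, key=objective, reverse=True)[:pop_size]
    return max(pop, key=objective)

onemax = sum  # toy objective: number of 1-bits in the genome
best = evolve(onemax)
```

With elitist selection the best fitness never decreases, so on OneMax the loop quickly converges to the all-ones string.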
General Schema Theory for Genetic Programming with Subtree-Swapping Crossover
 In Genetic Programming, Proceedings of EuroGP 2001, LNCS
, 2001
"... In this paper a new, general and exact schema theory for genetic programming is presented. The theory includes a microscopic schema theorem applicable to crossover operators which replace a subtree in one parent with a subtree from the other parent to produce the offspring. A more macroscopic schema ..."
Abstract

Cited by 45 (28 self)
 Add to MetaCart
In this paper a new, general and exact schema theory for genetic programming is presented. The theory includes a microscopic schema theorem applicable to crossover operators which replace a subtree in one parent with a subtree from the other parent to produce the offspring. A more macroscopic schema theorem is also provided which is valid for crossover operators in which the probability of selecting any two crossover points in the parents depends only on their size and shape. The theory is based on the notions of Cartesian node reference systems and variable-arity hyperschemata, both introduced here for the first time. In the paper we provide examples which show how the theory can be specialised to specific crossover operators and how it can be used to derive an exact definition of effective fitness and a size-evolution equation for GP.
Exact Schema Theory for Genetic Programming and Variable-length Genetic Algorithms with One-Point Crossover
, 2001
"... A few schema theorems for Genetic Programming (GP) have been proposed in the literature in the last few years. Since they consider schema survival and disruption only, they can only provide a lower bound for the expected value of the number of instances of a given schema at the next generation rathe ..."
Abstract

Cited by 30 (16 self)
 Add to MetaCart
A few schema theorems for Genetic Programming (GP) have been proposed in the literature in the last few years. Since they consider schema survival and disruption only, they can only provide a lower bound for the expected value of the number of instances of a given schema at the next generation rather than an exact value. This paper presents theoretical results for GP with onepoint crossover which overcome this problem. Firstly, we give an exact formulation for the expected number of instances of a schema at the next generation in terms of microscopic quantities. Thanks to this formulation we are then able to provide an improved version of an earlier GP schema theorem in which some (but not all) schema creation events are accounted for. Then, we extend this result to obtain an exact formulation in terms of macroscopic quantities which makes all the mechanisms of schema creation explicit. This theorem allows the exact formulation of the notion of effective fitness in GP and opens the way to future work on GP convergence, population sizing, operator biases, and bloat, to mention only some of the possibilities.
Hyperschema Theory for GP with One-Point Crossover, Building Blocks, and Some New Results in GA Theory
 Genetic Programming, Proceedings of EuroGP 2000
, 2000
"... Two main weaknesses of GA and GP schema theorems axe that they provide only information on the expected value of the number of instances of a given schema at the next generation E[m(H,t + 1)], and they can only give a lower bound for such a quantity. This paper presents new theoretical results o ..."
Abstract

Cited by 23 (17 self)
 Add to MetaCart
Two main weaknesses of GA and GP schema theorems are that they provide only information on the expected value of the number of instances of a given schema at the next generation E[m(H, t + 1)], and they can only give a lower bound for such a quantity. This paper presents new theoretical results on GP and GA schemata which largely overcome these weaknesses. Firstly, unlike previous results which concentrated on schema survival and disruption, our results extend to GP recent work on GA theory by Stephens and Waelbroeck, and make the effects and the mechanisms of schema creation explicit. This allows us to give an exact formulation (rather than a lower bound) for the expected number of instances of a schema at the next generation. Thanks to this formulation we are then able to provide an improved version of an earlier GP schema theorem in which some schema creation events are accounted for, thus obtaining a tighter bound for E[m(H, t + 1)]. This bound is a function of the selection probabilities of the schema itself and of a set of lower-order schemata which one-point crossover uses to build instances of the schema. This result supports the existence of building blocks in GP which, however, are not necessarily all short, low-order or highly fit. Building on earlier work, we show how Stephens and Waelbroeck's GA results and the new GP results described in the paper can be used to evaluate schema variance, signal-to-noise ratio and, in general, the probability distribution of m(H, t + 1). In addition, we show how the expectation operator can be removed from the schema theorem so as to predict with a known probability whether m(H, t + 1) (rather than E[m(H, t + 1)]) is going to be above a given threshold.
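For reference, the classic lower-bound result that the papers above tighten is Holland's schema theorem, stated here in its standard textbook form (reproduced for context, not quoted from these papers):

```latex
E[m(H,t+1)] \;\ge\; m(H,t)\,\frac{f(H,t)}{\bar{f}(t)}
\left[\,1 \;-\; p_c\,\frac{\delta(H)}{l-1} \;-\; o(H)\,p_m\,\right]
```

where m(H,t) is the number of instances of schema H at generation t, f(H,t) its mean fitness, \bar{f}(t) the population mean fitness, \delta(H) and o(H) the defining length and order of H, l the string length, and p_c, p_m the crossover and mutation probabilities. Because only schema survival and disruption are modeled, the right-hand side is a lower bound; the exact schema theorems above replace it with formulations that also account for schema creation.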
Recursive Conditional Schema Theorem, Convergence and Population Sizing in Genetic Algorithms
 Proceedings of the Foundations of Genetic Algorithms Workshop (FOGA 6)
, 2000
"... In this paper we start by presenting two forms of schema theorem in which expectations are not present. These theorems allow one to predict with a known probability whether the number of instances of a schema at the next generation will be above a given threshold. Then we clarify that in the presenc ..."
Abstract

Cited by 11 (5 self)
 Add to MetaCart
In this paper we start by presenting two forms of the schema theorem in which expectations are not present. These theorems allow one to predict with a known probability whether the number of instances of a schema at the next generation will be above a given threshold. Then we clarify that in the presence of stochasticity schema theorems should be interpreted as conditional statements, and we use a conditional version of the schema theorem backwards to predict the past from the future. Assuming that at least x instances of a schema are present in one generation, this allows us to find the conditions (at the previous generation) under which such x instances will indeed be present with a given probability. This suggests a possible strategy to study GA convergence based on schemata. We use this strategy to obtain a recursive version of the schema theorem. Among other uses, this schema theorem allows one to find under which conditions on the initial generation a GA will converge to a solut...
Probabilistic Schema Theorems without Expectation, Recursive Conditional Schema Theorem, Convergence and Population Sizing in Genetic Algorithms
, 1999
"... In this paper we first develop a new form of schema theorem in which expectations are not present. This theorem allows one to predict with a known probability whether the number of instances of a schema at the next generation will be above a given threshold. Then we use this version of the schema ..."
Abstract

Cited by 9 (8 self)
 Add to MetaCart
In this paper we first develop a new form of schema theorem in which expectations are not present. This theorem allows one to predict with a known probability whether the number of instances of a schema at the next generation will be above a given threshold. Then we use this version of the schema theorem backwards, i.e. to predict the past from the future. Assuming that at least one solution is found at one generation, this allows us to find the conditions (at the previous generation) under which such a solution will indeed be found with a given probability. This allows us to obtain a recursive version of the schema theorem. This schema theorem allows one to find under which conditions on the initial generation the GA will converge to a solution on the hypothesis that building block and population fitnesses are known. These results are important because for the first time they make explicit the relation between population size, schema fitness and probability of convergence ...
Why the Schema Theorem is Correct also in the Presence of Stochastic Effects
, 2000
"... Holland's schema theorem has been criticised in (Fogel and Ghozeil 1997, Fogel and Ghozeil 1998, Fogel and Ghozeil 1999) for not being able to estimate correctly the expected proportion of a schema in the population when fitness proportionate selection is used in the presence of noise or other stoch ..."
Abstract

Cited by 6 (4 self)
 Add to MetaCart
Holland's schema theorem has been criticised in (Fogel and Ghozeil 1997, Fogel and Ghozeil 1998, Fogel and Ghozeil 1999) for not being able to estimate correctly the expected proportion of a schema in the population when fitness proportionate selection is used in the presence of noise or other stochastic effects. This is incorrect for two reasons. Firstly, the theorem in its original form is not applicable to this case. As clarified in the paper, if the quantities involved in schema theorems are random variables, the theorems must be interpreted as conditional statements. Secondly, the conditional versions of Holland and other researchers' schema theorems are indeed very useful to model the sampling of schemata in the presence of stochasticity. In the paper I show how one can calculate the correct expected proportion of a schema in the presence of stochastic effects when selection only is present, using a conditional interpretation of Holland's schema theorem. In addition, I generalise this result (again using schema theorems) to the case in which crossover, mutation, and selection with replacement are used. This can be considered as an exact schema theorem applicable both in the presence and in the absence of stochastic effects.
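To make the "expected proportion of a schema under fitness-proportionate selection" concrete: with selection alone, the exact expected proportion of a schema H is the fitness mass of its members divided by the total population fitness. The sketch below checks this against a Monte Carlo simulation of the selection step; the toy population, fitness function, and schema are illustrative assumptions, not taken from the paper:

```python
import random

def expected_schema_proportion(pop, fitness, in_schema):
    """Exact expected proportion of schema members after one round of
    fitness-proportionate selection (no crossover or mutation):
    E[p(H)] = sum_{x in H} f(x) / sum_x f(x)."""
    total = sum(fitness(x) for x in pop)
    return sum(fitness(x) for x in pop if in_schema(x)) / total

def monte_carlo_proportion(pop, fitness, in_schema, trials=20000, seed=0):
    """Estimate the same quantity by repeatedly sampling a new population
    with fitness-proportionate selection (with replacement)."""
    rng = random.Random(seed)
    weights = [fitness(x) for x in pop]
    n = len(pop)
    hits = 0
    for _ in range(trials):
        picked = rng.choices(pop, weights=weights, k=n)
        hits += sum(in_schema(x) for x in picked)
    return hits / (trials * n)

pop = ["110", "101", "011", "000"]
fitness = lambda s: 1 + s.count("1")   # toy OneMax-style fitness
in_schema = lambda s: s[0] == "1"      # schema H = 1**

exact = expected_schema_proportion(pop, fitness, in_schema)   # 6/10 = 0.6
approx = monte_carlo_proportion(pop, fitness, in_schema)
```

For this toy population the schema members have fitness 3 + 3 out of a total of 10, so the exact expectation is 0.6, which the simulation approximates.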
New Results in the Schema Theory for GP with One-Point Crossover which Account for Schema Creation, Survival and Disruption
, 1999
"... Two main weaknesses of GA and GP schema theorems are that they provide only information on the expected value of the number of instances of a given schema at the next generation E[m(H; t + 1)], and they can only give a lower bound for such a quantity. This paper presents new theoretical results o ..."
Abstract

Cited by 1 (1 self)
 Add to MetaCart
Two main weaknesses of GA and GP schema theorems are that they provide only information on the expected value of the number of instances of a given schema at the next generation E[m(H; t + 1)], and they can only give a lower bound for such a quantity. This paper presents new theoretical results on GP and GA schemata which largely overcome these weaknesses. Firstly, unlike previous results which concentrated on schema survival and disruption, our results extend to GP recent work on GA theory by Stephens and Waelbroeck, and make the effects and the mechanisms of schema creation explicit. This allows us to give an exact formulation (rather than a lower bound) for the expected number of instances of a schema at the next generation. Thanks to this formulation we are then able to provide an improved version of an earlier GP schema theorem in which some schema creation events are accounted for, thus obtaining a tighter bound for E[m(H; t + 1)]. This bound is a function of the sele...
Foundations of evolutionary computation
 Proceedings of the SPIE, Volume 6228
, 2006
"... Evolutionary computation is a rapidly expanding field of research with a long history. Much of that history remains unknown to most practitioners and researchers. This paper offers a review of selected foundational efforts in evolutionary computation. A brief initial overview of the essential compon ..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
Evolutionary computation is a rapidly expanding field of research with a long history. Much of that history remains unknown to most practitioners and researchers. This paper offers a review of selected foundational efforts in evolutionary computation. A brief initial overview of the essential components of evolutionary algorithms is presented, followed by a review of early research in artificial life, evolving programs, and evolvable hardware. Comments on theoretical and future developments conclude the review.
A Methodology for the Statistical Characterization of Genetic Algorithms
"... Abstract. The inherent complexity of the Genetic Algorithms (GAs) has led to various theoretical an experimental approaches whose ultimate goal is to better understand the dynamics of such algorithms. Through such understanding, it is hoped, we will be able to improve their efficiency. Experiments, ..."
Abstract

Cited by 1 (1 self)
 Add to MetaCart
The inherent complexity of Genetic Algorithms (GAs) has led to various theoretical and experimental approaches whose ultimate goal is to better understand the dynamics of such algorithms. Through such understanding, it is hoped, we will be able to improve their efficiency. Experiments typically explore a GA's behavior by testing it against a set of functions with characteristics deemed adequate. In this paper we present a methodology which aims at achieving a solid relative evaluation of alternative GAs by resorting to statistical arguments. With it we may categorize any iterative optimization algorithm by statistically finding the basic parameters of the probability distribution of the GA's optimum values without resorting to a priori functions. We analyze the behavior of 6 algorithms (5 variations of a GA and a hill climber) which we characterize and compare. We make some remarks regarding the relation between statistical studies such as ours and the well known "No Free Lunch Theorem".
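A minimal sketch of the kind of statistical characterization the abstract describes, assuming a simple bit-flip hill climber and a OneMax objective as stand-ins for the paper's six algorithms: run the optimizer many independent times and summarize the empirical distribution of its final best values.

```python
import random
import statistics

def hill_climber(f, n_bits=20, iters=300, rng=None):
    """Single-solution bit-flip hill climber maximizing f over a bitstring."""
    rng = rng or random.Random()
    x = [rng.randint(0, 1) for _ in range(n_bits)]
    best = f(x)
    for _ in range(iters):
        i = rng.randrange(n_bits)
        x[i] ^= 1                  # flip one bit
        v = f(x)
        if v >= best:
            best = v               # keep the move if it is not worse
        else:
            x[i] ^= 1              # otherwise undo the flip
    return best

def characterize(algorithm, f, runs=50, seed=0):
    """Run `algorithm` independently `runs` times and estimate the mean and
    standard deviation of the distribution of its final best values."""
    master = random.Random(seed)
    finals = [algorithm(f, rng=random.Random(master.getrandbits(32)))
              for _ in range(runs)]
    return statistics.mean(finals), statistics.stdev(finals)

mean_best, std_best = characterize(hill_climber, sum)   # OneMax objective
```

The same `characterize` harness can be applied unchanged to any optimizer with this call signature, which is the point of a statistical, function-agnostic comparison.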