Results 1–10 of 52
Population-Based Incremental Learning: A Method for Integrating Genetic Search Based Function Optimization and Competitive Learning
, 1994
Cited by 298 (11 self)
Genetic algorithms (GAs) are biologically motivated adaptive systems which have been used, with varying degrees of success, for function optimization. In this study, an abstraction of the basic genetic algorithm, the Equilibrium Genetic Algorithm (EGA), and the GA in turn, are reconsidered within the framework of competitive learning. This new perspective reveals a number of different possibilities for performance improvements. This paper explores population-based incremental learning (PBIL), a method of combining the mechanisms of a generational genetic algorithm with simple competitive learning. The combination of these two methods yields a tool which is far simpler than a GA, and which outperforms a GA on a large set of optimization problems in terms of both speed and accuracy. This paper presents an empirical analysis of where the proposed technique will outperform genetic algorithms, and describes a class of problems in which a genetic algorithm may be able to perform better. Extensions to this algorithm are discussed and analyzed. PBIL and extensions are compared with a standard GA on twelve problems, including standard numerical optimization functions, traditional GA test suite problems, and NP-complete problems.
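The core idea the abstract describes — replacing a GA population with statistics updated by a competitive-learning rule — can be sketched as follows. This is an illustrative reading of PBIL, not the paper's implementation; all parameter names and values (learning rate, population size, generation count) are assumptions.

```python
import random

def pbil(fitness, n_bits, pop_size=20, lr=0.1, n_gens=100):
    """Minimal PBIL sketch: a probability vector stands in for the population.

    Each generation: sample candidates from the vector, then nudge the
    vector toward the best sample (a simple competitive-learning update).
    """
    p = [0.5] * n_bits  # probability that each bit is 1
    best = None
    for _ in range(n_gens):
        pop = [[1 if random.random() < p[i] else 0 for i in range(n_bits)]
               for _ in range(pop_size)]
        winner = max(pop, key=fitness)
        if best is None or fitness(winner) > fitness(best):
            best = winner
        # move the probability vector toward the winning sample
        p = [(1 - lr) * p[i] + lr * winner[i] for i in range(n_bits)]
    return best

# Toy usage: maximize the number of 1-bits (OneMax)
random.seed(0)
solution = pbil(fitness=sum, n_bits=16)
```

On OneMax the probability vector converges rapidly toward the all-ones string, which is the sense in which PBIL replaces both crossover and an explicit population with incrementally updated statistics.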
Evolutionary computation: Comments on the history and current state
 IEEE Transactions on Evolutionary Computation
, 1997
Cited by 207 (0 self)
Abstract — Evolutionary computation has started to receive significant attention during the last decade, although the origins can be traced back to the late 1950’s. This article surveys the history as well as the current state of this rapidly growing field. We describe the purpose, the general structure, and the working principles of different approaches, including genetic algorithms (GA) [with links to genetic programming (GP) and classifier systems (CS)], evolution strategies (ES), and evolutionary programming (EP) by analysis and comparison of their most important constituents (i.e., representations, variation operators, reproduction, and selection mechanism). Finally, we give a brief overview on the manifold of application domains, although this necessarily must remain incomplete. Index Terms — Classifier systems, evolution strategies, evolutionary computation, evolutionary programming, genetic algorithms,
Removing the Genetics from the Standard Genetic Algorithm
, 1995
Cited by 178 (10 self)
We present an abstraction of the genetic algorithm (GA), termed population-based incremental learning (PBIL), that explicitly maintains the statistics contained in a GA's population, but which abstracts away the crossover operator and redefines the role of the population. This results in PBIL being simpler, both computationally and theoretically, than the GA. Empirical results reported elsewhere show that PBIL is faster and more effective than the GA on a large set of commonly used benchmark problems. Here we present results on a problem custom designed to benefit both from the GA's crossover operator and from its use of a population. The results show that PBIL performs as well as, or better than, GAs carefully tuned to do well on this problem. This suggests that even on problems custom designed for GAs, much of the power of the GA may derive from the statistics maintained implicitly in its population, and not from the population itself nor from the crossover operator.
The Equation for the Response to Selection and Its Use for Prediction
, 1997
Cited by 103 (15 self)
The Breeder Genetic Algorithm (BGA) was designed according to the theories and methods used in the science of livestock breeding. The prediction of a breeding experiment is based on the response to selection (RS) equation. This equation relates the change in a population's fitness to the standard deviation of its fitness, as well as to the parameters selection intensity and realized heritability. In this paper the exact RS equation is derived for proportionate selection given an infinite population in linkage equilibrium. In linkage equilibrium the genotype frequencies are the product of the univariate marginal frequencies. The equation contains Fisher's fundamental theorem of natural selection as an approximation. The theorem shows that the response is approximately equal to the quotient of a quantity called additive genetic variance, V_A, and the average fitness. We compare Mendelian two-parent recombination with gene-pool recombination, which belongs to a special class of genetic ...
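The response-to-selection equation the abstract refers to is commonly written in the following breeder's-equation form (the notation here follows standard usage and may differ from the paper's):

```latex
R(t) = b(t)\, I\, \sigma_p(t)
```

where $R(t)$ is the change in the population's mean fitness between generations, $I$ the selection intensity, $b(t)$ the realized heritability, and $\sigma_p(t)$ the standard deviation of the population fitness. The Fisher approximation mentioned in the abstract then reads $R(t) \approx V_A / \bar{f}$, with $V_A$ the additive genetic variance and $\bar{f}$ the average fitness.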
An Empirical Comparison of Seven Iterative and Evolutionary Function Optimization Heuristics
, 1995
Cited by 50 (7 self)
This report is a repository for the results obtained from a large-scale empirical comparison of seven iterative and evolution-based optimization heuristics. Twenty-seven static optimization problems, spanning six sets of problem classes which are commonly explored in the genetic algorithm literature, are examined. The problem sets include job-shop scheduling, traveling salesman, knapsack, bin-packing, neural network weight optimization, and standard numerical optimization. The search spaces in these problems range from 2^368 to 2^2040. The results indicate that using genetic algorithms for the optimization of static functions does not yield a benefit, in terms of the final answer obtained, over simpler optimization heuristics. The algorithms tested and the problem encodings are described in detail for reproducibility.
Diversity in Genetic Programming: An Analysis of Measures and Correlation with Fitness
, 2004
Cited by 40 (4 self)
This paper examines measures of diversity in genetic programming. The goal is to understand the importance of such measures and their relationship with fitness. Diversity methods and measures from the literature are surveyed and a selected set of measures are applied to common standard problem instances in an experimental study. Results show the varying definitions and behaviours of diversity and the varying correlation between diversity and fitness during different stages of the evolutionary process. Populations in the genetic programming algorithm are shown to become structurally similar while maintaining a high amount of behavioural differences. Conclusions describe what measures are likely to be important for understanding and improving the search process and why diversity might have different meaning for different problem domains.
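The distinction the abstract draws — populations becoming structurally similar while staying behaviourally different — can be illustrated with two crude diversity proxies. These helper functions and the serialized-program representation are hypothetical, chosen only to make the contrast concrete; the paper surveys far richer measures.

```python
def structural_diversity(population):
    """Fraction of unique program structures.

    Assumes programs are serialized to hashable strings (illustrative).
    """
    return len(set(population)) / len(population)

def behavioural_diversity(fitnesses):
    """Fraction of distinct fitness values — a crude behavioural proxy."""
    return len(set(fitnesses)) / len(fitnesses)

# Toy usage: three copies of one structure, but four distinct behaviours
pop = ["(+ x 1)", "(+ x 1)", "(* x x)", "(+ x 1)"]
fits = [0.5, 0.7, 0.9, 0.2]
s = structural_diversity(pop)    # 2 unique structures out of 4
b = behavioural_diversity(fits)  # 4 distinct fitness values out of 4
```

A population can thus score low on the structural measure while scoring high on the behavioural one, which is the pattern the paper reports during late stages of GP runs.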
Gene Pool Recombination in Genetic Algorithms
 Metaheuristics: Theory and Applications
, 1995
Cited by 40 (10 self)
A new recombination operator, called Gene Pool Recombination (GPR), is introduced. In GPR, the genes are randomly picked from the gene pool defined by the selected parents. The mathematical analysis of GPR is easier than for two-parent recombination (TPR), normally used in genetic algorithms. There are n difference equations for the marginal gene frequencies that describe the evolution of a population for a fitness function of size n. For simple fitness functions TPR and GPR perform similarly, with a slight advantage for GPR. Furthermore, the mathematical analysis shows that a genetic algorithm with only selection and recombination is not a global optimization method, in contrast to popular belief.

Keywords: difference equations, genetic algorithms, Hardy-Weinberg equilibrium, recombination.

1. Introduction. Genetic algorithms (GAs) use at least three different components for guiding the search to an optimum: selection, mutation and recombination. Understanding the evolution of g...
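The operator described above can be sketched in a few lines: for each locus, the offspring's gene is drawn independently from the pool formed by all selected parents, rather than from a single pair. This is an illustrative reading for binary genes; the helper name and example data are assumptions, not from the paper.

```python
import random

def gene_pool_recombination(parents):
    """GPR sketch: each offspring gene is picked independently from the
    gene pool of the selected parents, i.e. locus i comes from a randomly
    chosen parent's locus i (equivalently, from the marginal frequencies)."""
    n = len(parents[0])
    return [random.choice(parents)[i] for i in range(n)]

# Toy usage: three selected parents with 4 binary genes each
random.seed(1)
parents = [[1, 0, 1, 1],
           [0, 0, 1, 0],
           [1, 1, 1, 0]]
child = gene_pool_recombination(parents)
```

Note that a locus fixed in the pool (here locus 2, where every parent carries a 1) is necessarily fixed in the offspring — the property that makes the marginal-frequency difference equations tractable.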
Causality in Genetic Programming
 Genetic Algorithms: Proceedings of the Sixth International Conference (ICGA95
, 1995
Cited by 38 (6 self)
Causality relates changes in the structure of an object with the effects of such changes, that is, changes in the properties or behavior of the object. This paper analyzes the concept of causality in Genetic Programming (GP) and suggests how it can be used in adapting control parameters for speeding up GP search. We first analyze the effects of crossover to show the weak causality of the GP representation and operators. Hierarchical GP approaches based on the discovery and evolution of functions amplify this phenomenon. However, selection gradually retains strongly causal changes. Causality is correlated to search space exploitation and is discussed in the context of the exploration-exploitation tradeoff. The results described argue for a bottom-up GP evolutionary thesis. Finally, new developments based on the idea of GP architecture evolution (Koza, 1994a) are discussed from the causality perspective.
Hierarchical Learning with Procedural Abstraction Mechanisms
, 1997
Cited by 33 (2 self)
Evolutionary computation (EC) consists of the design and analysis of probabilistic algorithms inspired by the principles of natural selection and variation. Genetic Programming (GP) is one subfield of EC that emphasizes desirable features such as the use of procedural representations, the capability to discover and exploit intrinsic characteristics of the application domain, and the flexibility to adapt the shape and complexity of learned models. Approaches that learn monolithic representations are considerably less likely to be effective for complex problems, and standard GP is no exception. The main goal of this dissertation is to extend GP capabilities with automatic mechanisms to cope with problems of increasing complexity. Humans succeed here by skillfully using hierarchical decomposition and abstraction mechanisms. The translation of such mechanisms into a general computer implementation is a tremendous challenge, which requires a firm understanding of the interplay between repr...
Genetic Algorithms
, 1997
Cited by 33 (0 self)
this paper. Bremermann's algorithm contained most of the ingredients of a good evolutionary algorithm. But because of limited computer experiments and a missing theory, he did not find a good combination of the ingredients. In the 70's two different evolutionary algorithms independently emerged: the genetic algorithm (GA) of Holland [1975] and the evolution strategies of Rechenberg [1973] and Schwefel [1981]. Holland was not so much interested in optimization, but in adaptation. He investigated the genetic algorithm with decision theory for discrete domains. Holland emphasized the importance of recombination in large populations, whereas Rechenberg and Schwefel mainly investigated mutation in very small populations for continuous parameter optimization.