Results 1–10 of 20
Genetic Algorithms, Noise, and the Sizing of Populations
Complex Systems, 1991
Abstract

Cited by 239 (85 self)
This paper considers the effect of stochasticity on the quality of convergence of genetic algorithms (GAs). In many problems, the variance of building-block fitness, or so-called collateral noise, is the major source of variance, and a population-sizing equation is derived to ensure that average signal-to-collateral-noise ratios are favorable to the discrimination of the best building blocks required to solve a problem of bounded deception. The sizing relation is modified to permit the inclusion of other sources of stochasticity, such as the noise of selection, the noise of genetic operators, and the explicit noise or nondeterminism of the objective function. In a test suite of five functions, the sizing relation proves to be a conservative predictor of average correct convergence, as long as all major sources of noise are considered in the sizing calculation. These results suggest how the sizing equation may be viewed as a coarse delineation of a boundary between what a physicist might call two distinct phases of GA behavior. At low population sizes the GA makes many errors of decision, and the quality of convergence is largely left to the vagaries of chance or the serial fixup of flawed results through mutation or other serial injection of diversity. At large population sizes, GAs can reliably discriminate between good and bad building blocks, and parallel processing and recombination of building blocks lead to quick solution of even difficult deceptive problems. Additionally, the paper outlines a number of extensions to this work, including the development of more refined models of the relation between generational average error and ultimate convergence quality, the development of online methods for sizing populations via the estimation of populations...
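The abstract's core idea, that collateral noise from other partitions can drown out a building block's fitness advantage unless the population supplies enough samples, can be illustrated with a small Monte Carlo sketch. All names, parameter values, and the Gaussian noise model here are illustrative assumptions, not the paper's actual sizing equation:

```python
import random

def decision_error_rate(d=1.0, sigma=3.0, n=50, trials=2000, seed=0):
    """Estimate how often the better of two competing building blocks
    loses a comparison of sample means, given n fitness samples of each.

    d      -- true fitness advantage of the better block (the "signal")
    sigma  -- std. dev. of collateral noise from the other partitions
    n      -- samples per building block (a stand-in for population size)
    """
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        good = sum(rng.gauss(d, sigma) for _ in range(n)) / n
        bad = sum(rng.gauss(0.0, sigma) for _ in range(n)) / n
        if bad >= good:  # the worse block wins: a decision error
            errors += 1
    return errors / trials
```

With few samples the decision error rate is substantial; with many it collapses toward zero, mirroring the two "phases" of GA behavior the abstract describes.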
A Genetic Algorithm Tutorial
Statistics and Computing, 1994
Abstract

Cited by 231 (5 self)
This tutorial covers the canonical genetic algorithm as well as more experimental forms of genetic algorithms, including parallel island models and parallel cellular genetic algorithms. The tutorial also illustrates genetic search by hyperplane sampling. The theoretical foundations of genetic algorithms are reviewed, including the schema theorem as well as recently developed exact models of the canonical genetic algorithm.
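A minimal sketch of the canonical GA the tutorial describes (fitness-proportionate selection, one-point crossover, bitwise mutation), here applied to the OneMax problem as a test case of my choosing; parameter values are illustrative, not taken from the tutorial:

```python
import random

def canonical_ga(length=20, pop_size=40, generations=60,
                 p_cross=0.7, p_mut=0.01, seed=1):
    """Canonical GA on OneMax (maximize the number of 1 bits).
    Returns the best fitness found in the final population."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        fits = [sum(ind) for ind in pop]
        total = sum(fits) or 1

        def select():  # roulette-wheel (fitness-proportionate) selection
            r = rng.uniform(0, total)
            acc = 0.0
            for ind, f in zip(pop, fits):
                acc += f
                if acc >= r:
                    return ind
            return pop[-1]

        nxt = []
        while len(nxt) < pop_size:
            a, b = select()[:], select()[:]
            if rng.random() < p_cross:           # one-point crossover
                cut = rng.randint(1, length - 1)
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            for child in (a, b):
                for i in range(length):          # bitwise mutation
                    if rng.random() < p_mut:
                        child[i] ^= 1
                nxt.append(child)
        pop = nxt[:pop_size]
    return max(sum(ind) for ind in pop)
```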
Niching Methods for Genetic Algorithms
1995
Abstract

Cited by 191 (1 self)
Niching methods extend genetic algorithms to domains that require the location and maintenance of multiple solutions. Such domains include classification and machine learning, multimodal function optimization, multiobjective function optimization, and simulation of complex and adaptive systems. This study presents a comprehensive treatment of niching methods and the related topic of population diversity. Its purpose is to analyze existing niching methods and to design improved niching methods. To achieve this purpose, it first develops a general framework for the modelling of niching methods, and then applies this framework to construct models of individual niching methods, specifically crowding and sharing methods. Using a constructed model of crowding, this study determines why crowding methods over the last two decades have not made effective niching methods. A series of tests and design modifications results in the development of a highly effective form of crowding, called determin...
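One replacement rule in the crowding family discussed above, commonly called deterministic crowding, pairs parents at random and lets each offspring compete only with its more similar parent, so distinct niches are not forced to compete. This is an illustrative sketch of that commonly described rule, not code from the study:

```python
import random

def deterministic_crowding_step(pop, fitness, distance, rng):
    """One generation of deterministic crowding: pair parents at random,
    cross them, and let each child replace its nearer parent if fitter."""
    rng.shuffle(pop)
    for i in range(0, len(pop) - 1, 2):
        p1, p2 = pop[i], pop[i + 1]
        cut = rng.randint(1, len(p1) - 1)        # one-point crossover
        c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
        # Match each child with the more similar parent (minimal total distance).
        if distance(p1, c1) + distance(p2, c2) <= distance(p1, c2) + distance(p2, c1):
            pairs = [(p1, c1, i), (p2, c2, i + 1)]
        else:
            pairs = [(p1, c2, i), (p2, c1, i + 1)]
        for parent, child, idx in pairs:
            if fitness(child) >= fitness(parent):
                pop[idx] = child
    return pop
```

Because replacement is restricted to parent-child pairs, both peaks of a bimodal function (e.g. `max(sum(x), l - sum(x))`) can be maintained in one population.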
Finding Multimodal Solutions Using Restricted Tournament Selection
Proceedings of the Sixth International Conference on Genetic Algorithms, 1995
Abstract

Cited by 104 (1 self)
This paper investigates a new technique for solving multimodal problems using genetic algorithms (GAs). The proposed technique, Restricted Tournament Selection, is based on the paradigm of local competition. The paper begins by discussing some of the drawbacks of current multimodal techniques. The paper then presents the new technique along with an analysis of a class of sets of solutions it preserves and locates. This presentation examines the new technique's restriction on competition by calculating probability distributions for its tournaments as well as its various niche takeover times. Empirical observations are then presented as evidence of the technique's abilities in a wide variety of settings. Finally, this paper explores the future trajectory of multimodal GA research.
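The "local competition" idea can be sketched in a few lines: each new offspring holds a tournament only against the most similar member of a small random sample of the population, so it displaces individuals in its own niche rather than arbitrary ones. This is an illustrative rendering of the insertion step as commonly described, with hypothetical names; the window size `window` is the technique's tunable parameter:

```python
import random

def rts_insert(pop, offspring, fitness, distance, window, rng):
    """Restricted-tournament-style insertion: the offspring competes with
    the most similar of `window` randomly sampled population members and
    replaces it only if the offspring is fitter."""
    sample_idx = rng.sample(range(len(pop)), window)
    nearest = min(sample_idx, key=lambda i: distance(pop[i], offspring))
    if fitness(offspring) > fitness(pop[nearest]):
        pop[nearest] = offspring
    return pop
```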
What Makes a Problem Hard for a Genetic Algorithm? Some Anomalous Results and Their Explanation
Machine Learning, 1993
Abstract

Cited by 100 (3 self)
Abstract. What makes a problem easy or hard for a genetic algorithm (GA)? This question has become increasingly important as people have tried to apply the GA to ever more diverse types of problems. Much previous work on this question has studied the relationship between GA performance and the structure of a given fitness function when it is expressed as a Walsh polynomial. The work of Bethke, Goldberg, and others has produced certain theoretical results about this relationship. In this article we review these theoretical results, and then discuss a number of seemingly anomalous experimental results reported by Tanese concerning the performance of the GA on a subclass of Walsh polynomials, some members of which were expected to be easy for the GA to optimize. Tanese found that the GA was poor at optimizing all functions in this subclass, that a partitioning of a single large population into a number of smaller independent populations seemed to improve performance, and that hill-climbing outperformed both the original and partitioned forms of the GA on these functions. These results seemed to contradict several commonly held expectations about GAs. We begin by reviewing schema processing in GAs. We then give an informal description of how Walsh analysis and Bethke's Walsh-schema transform relate to GA performance, and we discuss the relevance of this analysis for GA applications in optimization and machine learning. We then describe Tanese's surprising results, examine them experimentally and theoretically, and propose and evaluate some explanations. These explanations lead to a more fundamental question about GAs: what are the features of problems that determine the likelihood of successful GA performance?
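To make the Walsh-polynomial view concrete: any function on bit strings of length l decomposes into 2^l Walsh coefficients, and the order of the nonzero coefficients is one lens on GA difficulty. A small brute-force sketch (my own illustration, using the basis convention psi_j(x) = (-1)^(parity of x AND j), not code from the article):

```python
def walsh_coefficients(f, l):
    """All 2^l Walsh coefficients of f over {0,1}^l, where each x is an
    integer bit mask and psi_j(x) = (-1)^(parity of x AND j)."""
    n = 1 << l
    values = [f(x) for x in range(n)]
    return [sum(v * (-1 if bin(x & j).count("1") % 2 else 1)
                for x, v in enumerate(values)) / n
            for j in range(n)]
```

For a linear function such as the bit count, only the order-0 and order-1 coefficients are nonzero, which is one reason such functions are expected to be easy for the GA.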
Fundamental Principles of Deception in Genetic Search
Foundations of Genetic Algorithms, 1991
Abstract

Cited by 95 (5 self)
This paper presents several theorems concerning the nature of deception and the central role that deception plays in function optimization using genetic algorithms. A simple proof is offered which shows that the only problems which pose challenging optimization tasks are problems that involve some degree of deception and which result in conflicting k-armed bandit competitions between hyperplanes. The concept of a deceptive attractor is introduced and shown to be more general than the deceptive optimum found in the deceptive functions that have been constructed to date. Also introduced are the concepts of fully deceptive problems as well as less strict consistently deceptive problems. A proof is given showing that deceptive attractors must have a complementary bit pattern to that found in the binary representation of the global optimum if a function is to be either fully deceptive or consistently deceptive. Some empirical results are presented which demonstrate different methods of dealing with deception and poor linkage during genetic search.
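A standard example of such a function (my illustration, not one of the paper's constructions) is the k-bit trap: the global optimum sits at all ones, but average fitness slopes toward the complementary string of all zeros, exactly the complementary-bit-pattern attractor the abstract's theorem requires:

```python
def trap(bits, k=None):
    """k-bit trap function on a list of 0/1 values: the global optimum is
    all ones (fitness k), but fitness otherwise increases as ones are
    removed, pulling search toward the deceptive attractor at all zeros."""
    k = k or len(bits)
    u = sum(bits)
    return k if u == k else k - 1 - u
```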
Learning Linkage
Foundations of Genetic Algorithms 4, 1997
Abstract

Cited by 63 (8 self)
The topic of linkage has, with a few notable exceptions, been largely ignored. Recent studies have shown this approach to be a profound mistake: GAs ignoring linkage do so at their own computational peril. Inversion, the operator usually called upon to solve this problem, has proven too slow vis-à-vis the forces of selection. Inversion is a mutation-like operator that acts on chromosomal structures. Where evolution by mutation is too slow and has failed, it remains possible that evolution by pairwise recombination or crossover can be successful. This paper shows that tight linkage can be evolved within the environment of a new crossover operator.
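Why linkage matters to crossover can be quantified in one line: under one-point crossover, the chance that two loci are separated grows with the distance between them, so tightly linked building blocks survive recombination far more often than loosely linked ones. A small sketch of this standard calculation (my illustration, not the paper's new operator):

```python
def disruption_probability(i, j, length):
    """Probability that a uniformly chosen one-point crossover cut
    separates loci i and j on a chromosome of the given length."""
    cuts = length - 1          # possible cut positions between adjacent loci
    between = abs(i - j)       # cut positions falling between the two loci
    return between / cuts
```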
Optimizing an Arbitrary Function is Hard for the Genetic Algorithm
Proceedings of the Fourth International Conference on Genetic Algorithms, 1991
Abstract

Cited by 25 (3 self)
The Genetic Algorithm (GA) is generally portrayed as a search procedure which can optimize pseudo-boolean functions based on a limited sample of the function's values. There have been many attempts to analyze the computational behavior of the GA. For the most part, these attempts have tacitly assumed that the algorithmic parameters of the GA (e.g. population size, choice of genetic operators, etc.) can be isolated from the characteristics of the class of functions being optimized. In the following, we demonstrate why this assumption is inappropriate. We consider the class, F, of all deterministic pseudo-boolean functions whose values range over the integers. We then consider the Genetic Algorithm as a combinatorial optimization problem over {0, 1}^l and demonstrate that the computational problem it attempts to solve is NP-hard relative to this class of functions. Using standard performance measures, we also give evidence that the Genetic Algorithm will not be able to efficiently appr...
Gene Expression and Fast Construction of Distributed Evolutionary Representation
Evolutionary Computation, 2001
Abstract

Cited by 20 (0 self)
The gene expression process in nature produces different proteins in different cells from different portions of the DNA. Since proteins control almost every important activity in a living organism, at an abstract level, gene expression can be viewed as a process that evaluates the merit or "fitness" of the DNA. This distributed evaluation of the DNA would not be possible without a decomposed representation of the fitness function defined over the DNAs. This paper argues that, unless the living body was provided with such a representation, we have every reason to believe that it must have an efficient mechanism to construct this distributed representation. This paper demonstrates polynomial-time computability of such a representation by proposing a class of efficient algorithms. The main contribution of this paper is twofold. On the algorithmic side, it offers a way to scale up evolutionary search by detecting the underlying structure of the search space. On the biological side, it proves that the distributed representation of the evolutionary fitness function in gene expression can be computed in polynomial time.