Results 1–10 of 28
Niching Methods for Genetic Algorithms
1995
Cited by 191 (1 self)
Niching methods extend genetic algorithms to domains that require the location and maintenance of multiple solutions. Such domains include classification and machine learning, multimodal function optimization, multiobjective function optimization, and simulation of complex and adaptive systems. This study presents a comprehensive treatment of niching methods and the related topic of population diversity. Its purpose is to analyze existing niching methods and to design improved niching methods. To achieve this purpose, it first develops a general framework for the modelling of niching methods, and then applies this framework to construct models of individual niching methods, specifically crowding and sharing methods. Using a constructed model of crowding, this study determines why crowding methods over the last two decades have not made effective niching methods. A series of tests and design modifications results in the development of a highly effective form of crowding, called determin...
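The crowding idea this abstract analyzes can be sketched compactly. Below is a minimal, illustrative Python sketch (my own construction, not code from the study) of crowding-style replacement on bitstrings: each offspring competes only against the more similar (Hamming-closest) parent, which is what lets a population hold several optima at once. The `two_max` fitness and all parameters are hypothetical.

```python
import random

def hamming(a, b):
    """Number of positions at which two bitstrings differ."""
    return sum(x != y for x, y in zip(a, b))

def crossover(p1, p2):
    """One-point crossover producing two offspring."""
    cut = random.randrange(1, len(p1))
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

def crowding_ga(pop, fitness, generations=50):
    """Crowding-style replacement: each child competes only against its
    more similar parent, so distinct niches can coexist."""
    for _ in range(generations):
        random.shuffle(pop)
        for i in range(0, len(pop) - 1, 2):
            p1, p2 = pop[i], pop[i + 1]
            c1, c2 = crossover(p1, p2)
            # Pair children with parents to minimize total Hamming distance.
            if hamming(c1, p1) + hamming(c2, p2) <= hamming(c1, p2) + hamming(c2, p1):
                matches = [(i, p1, c1), (i + 1, p2, c2)]
            else:
                matches = [(i, p1, c2), (i + 1, p2, c1)]
            for idx, parent, child in matches:
                if fitness(child) >= fitness(parent):
                    pop[idx] = child
    return pop

# Toy bimodal fitness: all-zeros and all-ones are both global optima,
# so a niching GA should be able to retain both "species" at once.
def two_max(bits):
    ones = sum(bits)
    return max(ones, len(bits) - ones)

random.seed(1)
pop = [[random.randint(0, 1) for _ in range(20)] for _ in range(40)]
final = crowding_ga(pop, two_max)
```

With ordinary replace-the-worst selection the two peaks would typically collapse to one; the similarity-restricted replacement is what preserves the second niche.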
What Makes a Problem Hard for a Genetic Algorithm? Some Anomalous Results and Their Explanation
Machine Learning, 1993
Cited by 100 (3 self)
Abstract. What makes a problem easy or hard for a genetic algorithm (GA)? This question has become increasingly important as people have tried to apply the GA to ever more diverse types of problems. Much previous work on this question has studied the relationship between GA performance and the structure of a given fitness function when it is expressed as a Walsh polynomial. The work of Bethke, Goldberg, and others has produced certain theoretical results about this relationship. In this article we review these theoretical results, and then discuss a number of seemingly anomalous experimental results reported by Tanese concerning the performance of the GA on a subclass of Walsh polynomials, some members of which were expected to be easy for the GA to optimize. Tanese found that the GA was poor at optimizing all functions in this subclass, that a partitioning of a single large population into a number of smaller independent populations seemed to improve performance, and that hillclimbing outperformed both the original and partitioned forms of the GA on these functions. These results seemed to contradict several commonly held expectations about GAs. We begin by reviewing schema processing in GAs. We then give an informal description of how Walsh analysis and Bethke's Walsh-schema transform relate to GA performance, and we discuss the relevance of this analysis for GA applications in optimization and machine learning. We then describe Tanese's surprising results, examine them experimentally and theoretically, and propose and evaluate some explanations. These explanations lead to a more fundamental question about GAs: what are the features of problems that determine the likelihood of successful GA performance?
Real-Coded Genetic Algorithms, Virtual Alphabets, and Blocking
Complex Systems, 1990
Cited by 78 (7 self)
This paper presents a theory of convergence for real-coded genetic algorithms (GAs) that use floating-point or other high-cardinality codings in their chromosomes. The theory is consistent with the theory of schemata and postulates that selection dominates early GA performance and restricts subsequent search to intervals with above-average function value, dimension by dimension. These intervals may be further subdivided on the basis of their attraction under genetic hillclimbing. Each of these subintervals is called a virtual character, and the collection of characters along a given dimension is called a virtual alphabet. It is the virtual alphabet that is searched during the recombinative phase of the genetic algorithm, and in many problems this is sufficient to ensure that good solutions are found. Although the theory helps suggest why many problems have been solved using real-coded GAs, it also suggests that real-coded GAs can be blocked from further progress in those situations whe...
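As one concrete example of the floating-point recombination such real-coded GAs use, here is a sketch of blend crossover (BLX-alpha). BLX-alpha is a standard real-coded operator, not one this paper itself proposes: with alpha = 0, offspring genes stay inside the interval spanned by the parents, which mirrors the abstract's picture of search narrowing to promising sub-intervals dimension by dimension.

```python
import random

def blx_alpha(parent1, parent2, alpha=0.5):
    """Blend crossover: each offspring gene is drawn uniformly from the
    interval spanned by the parent genes, widened by alpha on each side.
    With alpha = 0 the child cannot leave the parents' interval."""
    child = []
    for g1, g2 in zip(parent1, parent2):
        lo, hi = min(g1, g2), max(g1, g2)
        span = hi - lo
        child.append(random.uniform(lo - alpha * span, hi + alpha * span))
    return child

random.seed(0)
p1, p2 = [0.0, 2.0, -1.0], [1.0, 2.0, 3.0]
child = blx_alpha(p1, p2, alpha=0.0)  # alpha = 0: child stays within parent bounds
```

A positive alpha (0.5 is a common choice) lets offspring explore slightly outside the parents' interval, counteracting the interval-shrinking pressure of selection.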
Deception Considered Harmful
Foundations of Genetic Algorithms 2, 1992
Cited by 73 (0 self)
A central problem in the theory of genetic algorithms is the characterization of problems that are difficult for GAs to optimize. Many attempts to characterize such problems focus on the notion of Deception, defined in terms of the static average fitness of competing schemas. This article examines the Static Building Block Hypothesis (SBBH), the underlying assumption used to define Deception. Exploiting contradictions between the SBBH and the Schema Theorem, we show that Deception is neither necessary nor sufficient for problems to be difficult for GAs. This article argues that the characterization of hard problems must take into account the basic features of genetic algorithms, especially their dynamic, biased sampling strategy.

Keywords: Deception, building block hypothesis

1 INTRODUCTION

Since Holland's early work on the analysis of genetic algorithms (GAs), the usual approach has been to focus on the allocation of search effort to subspaces described by schemas representing hyper...
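The static schema fitnesses that Deception is defined over are easy to compute exhaustively for small problems. The sketch below uses a 3-bit trap function of my own choosing (illustrative numbers, not taken from the article): the global optimum is 111, yet the order-1 schema 0** has a higher static average fitness than 1**, which is exactly the kind of Deception at issue.

```python
from itertools import product

def trap(bits):
    """3-bit trap: fitness falls as ones increase, except the all-ones
    optimum. u=0 -> 3, u=1 -> 2, u=2 -> 1, u=3 -> 3.5."""
    u = sum(bits)
    return 3.5 if u == 3 else 3 - u

def schema_average(schema, f):
    """Static average of f over all strings matching a schema
    given as a tuple like (1, '*', '*')."""
    free = [i for i, s in enumerate(schema) if s == "*"]
    total, count = 0.0, 0
    for fill in product((0, 1), repeat=len(free)):
        bits = list(schema)
        for i, b in zip(free, fill):
            bits[i] = b
        total += f(bits)
        count += 1
    return total / count

avg0 = schema_average((0, "*", "*"), trap)  # -> 2.0
avg1 = schema_average((1, "*", "*"), trap)  # -> 1.875
```

Statically, 0** looks better than 1** even though the best string lies in 1**; the article's point is that this static comparison need not predict what a GA's dynamic, biased sampling actually does.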
Genetic Algorithm Difficulty and the Modality of Fitness Landscapes
Foundations of Genetic Algorithms 3, 1994
Cited by 56 (2 self)
We assume that the modality (i.e., number of local optima) of a fitness landscape is related to the difficulty of finding the best point on that landscape by evolutionary computation (e.g., hillclimbers and genetic algorithms (GAs)). We first examine the limits of modality by constructing a unimodal function and a maximally multimodal function. At such extremes our intuition breaks down. A fitness landscape consisting entirely of a single hill leading to the global optimum proves to be hard for hillclimbers but apparently easy for GAs. A provably maximally multimodal function, in which half the points in the search space are local optima, can be easy for both hillclimbers and GAs. Exploring the more realistic intermediate range between the extremes of modality, we construct local optima with varying degrees of "attraction" to our evolutionary algorithms. Most work on optima and their basins of attraction has focused on hills and hillclimbers, while some research has explored attraction...
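The "half the points in the search space are local optima" claim can be checked mechanically for small bitstring landscapes. The sketch below counts local optima under the Hamming-1 neighborhood, using a simple parity landscape as an illustrative maximally multimodal stand-in (the article's exact construction may differ): every even-parity string beats all of its neighbors, so exactly half the space is locally optimal.

```python
def neighbors(x, l):
    """All Hamming-distance-1 neighbors of point x on l bits."""
    return [x ^ (1 << i) for i in range(l)]

def local_optima(f, l):
    """All points at least as fit as every Hamming-1 neighbor."""
    return [x for x in range(1 << l)
            if all(f(x) >= f(y) for y in neighbors(x, l))]

# Parity landscape: even-parity strings score 1, odd-parity strings 0.
# Every even-parity point dominates all l of its (odd-parity) neighbors,
# so half of the 2**l points are local optima.
parity = lambda x: 1 - bin(x).count("1") % 2

l = 6
opts = local_optima(parity, l)
```

Despite being maximally multimodal in this counting sense, the landscape is trivially easy: every second step of a random walk lands on a global optimum, illustrating the abstract's point that modality alone is a poor difficulty measure.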
Genetic Algorithms and the Variance of Fitness
1991
Cited by 47 (10 self)
In this paper, we consider one important source of stochastic variation, the variance of a schema's fitness or what we call collateral noise. Specifically, a method for calculating fitness variance from a function's Walsh transform is derived and applied to a number of problems in GA analysis. In the remainder, Walsh functions and their application to the calculation of schema average fitness are reviewed; a formula for the calculation of schema fitness variance is derived using Walsh transforms. The variance computation is then applied to two important problems in genetic algorithm theory: population sizing and the calculation of rigorous probabilistic convergence bounds. Extending the technique to the analysis of non-uniform populations is also discussed.

Review of Walsh Schema Analysis
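For the whole search space (the order-0 schema), the variance calculation alluded to here takes a particularly clean form: under uniform sampling the Walsh basis is orthonormal, so Var(f) = sum over j != 0 of w_j squared. The brute-force sketch below (my own illustration, not the paper's derivation) checks this identity against the directly computed variance for a one-max fitness.

```python
# Collateral noise via Walsh coefficients: for f uniformly sampled over all
# l-bit strings, Var(f) = sum_{j != 0} w_j**2, since the Walsh basis functions
# are orthonormal and have zero mean for j != 0.

def psi(j, x):
    """Walsh basis function: parity of the bits shared by mask j and point x."""
    return -1 if bin(j & x).count("1") % 2 else 1

def walsh_coefficients(f, l):
    n = 1 << l
    return [sum(f(x) * psi(j, x) for x in range(n)) / n for j in range(n)]

def variance_from_walsh(w):
    """Sum of squared non-zero-order Walsh coefficients."""
    return sum(wj * wj for wj in w[1:])

def direct_variance(f, l):
    """Population variance of f over the full space, for comparison."""
    n = 1 << l
    mean = sum(f(x) for x in range(n)) / n
    return sum((f(x) - mean) ** 2 for x in range(n)) / n

l = 4
onemax = lambda x: bin(x).count("1")
w = walsh_coefficients(onemax, l)  # one-max on 4 bits has variance 4 * 0.25 = 1.0
```

The same idea restricted to the coefficients relevant to a particular schema is what yields the schema-fitness variance the paper uses for population sizing.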
Breeding hybrid strategies: Optimal behaviour for oligopolists
Journal of Evolutionary Economics, 1992
Cited by 32 (9 self)
Abstract. Oligopolistic pricing decisions in which the choice variable is not dichotomous as in the simple prisoner's dilemma but continuous have been modeled as a generalized prisoner's dilemma (GPD) by Fader and Hauser, who sought, in the two MIT Computer Strategy Tournaments, to obtain an effective generalization of Rapoport's Tit for Tat for the three-person repeated game. Holland's genetic algorithm and Axelrod's representation of contingent strategies provide a means of generating new strategies in the computer, through machine learning, without outside submissions. The paper discusses how findings from two-person tournaments can be extended to the GPD, in particular how the author's winning strategy in the Second MIT Competitive Strategy Tournament could be bettered. The paper provides insight into how oligopolistic pricing competitors can successfully compete, and underlines the importance of "niche" strategies, successful against a particular environment of competitors. Bootstrapping, or breeding strategies against their peers, provides a means of...
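Axelrod's contingent-strategy representation is concrete enough to sketch. Below is a simplified two-player, memory-one prisoner's dilemma version (the paper's setting is the three-person GPD with continuous prices, so this is only an illustration of the encoding idea): a strategy is an opening move plus a lookup table from the last joint move to the next move, and that table is the genome a GA would evolve.

```python
# 0 = defect, 1 = cooperate; standard illustrative payoff matrix.
PAYOFF = {(1, 1): (3, 3), (1, 0): (0, 5), (0, 1): (5, 0), (0, 0): (1, 1)}

def play(strat_a, strat_b, rounds=10):
    """strat = (opening_move, {(my_last, their_last): next_move}).
    Returns the two players' total payoffs."""
    score_a = score_b = 0
    move_a, move_b = strat_a[0], strat_b[0]
    for _ in range(rounds):
        pa, pb = PAYOFF[(move_a, move_b)]
        score_a += pa
        score_b += pb
        # Each player indexes its table by the history from its own viewpoint.
        move_a, move_b = strat_a[1][(move_a, move_b)], strat_b[1][(move_b, move_a)]
    return score_a, score_b

# Tit for Tat: open cooperating, then echo the opponent's last move.
tit_for_tat = (1, {(m, o): o for m in (0, 1) for o in (0, 1)})
# Always defect.
all_defect = (0, {(m, o): 0 for m in (0, 1) for o in (0, 1)})

mutual = play(tit_for_tat, tit_for_tat)    # -> (30, 30): sustained cooperation
exploited = play(tit_for_tat, all_defect)  # -> (9, 14): one sucker payoff, then mutual defection
```

Breeding strategies against their peers, as the abstract describes, amounts to running a GA whose fitness for each lookup table is its tournament score against the rest of the population.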
The nature of niching: genetic algorithms and the evolution of optimal, cooperative populations
1997
Conditions for Implicit Parallelism
Foundations of Genetic Algorithms, 1991
Cited by 16 (3 self)
Many interesting varieties of genetic algorithms have been designed and implemented in the last fifteen years. One way to improve our understanding of genetic algorithms is to identify properties that are invariant across these seemingly different versions. This paper focuses on invariants among genetic algorithms that differ along two dimensions: (1) the way a user-defined objective function is mapped to a fitness measure, and (2) the way the fitness measure is used to assign offspring to parents. A genetic algorithm is called admissible if it meets what seem to be the weakest reasonable requirements along these dimensions. It is shown that any admissible genetic algorithm exhibits a form of implicit parallelism.

Keywords: Implicit parallelism, invariants, k-armed bandits

1 Introduction

Whenever a new variation of genetic algorithm is proposed, it is reasonable to expect that the designer will provide an analysis of how the new algorithm compares with previous algorithms. Often this...
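The schema counting behind implicit parallelism can be made concrete (my own illustration, not the paper's admissibility argument): every string of length l is an instance of 2**l schemata, obtained by replacing any subset of its positions with the wildcard *, so even a small population implicitly carries fitness samples for far more schemata than it has members.

```python
from itertools import product

def schemata_of(bits):
    """All 2**l schemata that a bitstring (e.g. '0101') instantiates:
    at each position keep the bit or replace it with '*'."""
    options = [("*", b) for b in bits]
    return {"".join(s) for s in product(*options)}

def schemata_sampled(pop):
    """Distinct schemata covered by at least one population member."""
    covered = set()
    for ind in pop:
        covered |= schemata_of(ind)
    return covered

pop = ["0000", "1111", "0101"]
covered = schemata_sampled(pop)  # 3 strings jointly sample 40 distinct schemata
```

Three 4-bit strings each instantiate 16 schemata; after removing overlaps the population has sampled 40 distinct schemata, which is the "many schemata evaluated per individual evaluation" effect the paper shows survives across admissible GA variants.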
Using Neural Networks and Genetic Algorithms as Heuristics for NP-Complete Problems
1983
Cited by 15 (8 self)
Paradigms for using neural networks (NNs) and genetic algorithms (GAs) to heuristically solve Boolean satisfiability (SAT) problems are presented. Since SAT is NP-Complete, any other NP-Complete problem can be transformed into an equivalent SAT problem in polynomial time, and solved via either paradigm. This technique is illustrated for Hamiltonian circuit (HC) problems.

INTRODUCTION

NP-Complete problems are problems that are not known to be solvable in polynomial time. However, they are polynomially equivalent in the sense that any NP-Complete problem can be transformed into any other in polynomial time. Thus, if any NP-Complete problem can be solved in polynomial time, they all can [Garey]. The canonical example of an NP-Complete problem is the Boolean satisfiability (SAT) problem: Given an arbitrary Boolean expression of n variables, does there exist an assignment to those variables such that the expression is true? Other familiar examples include job shop scheduling, bin packing, a...
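A GA needs a graded fitness rather than a yes/no answer to climb toward satisfying assignments. One simple and common choice (an illustration, not necessarily the exact payoff function this paper uses) scores an assignment by the fraction of clauses it satisfies, so an expression is satisfiable exactly when some assignment scores 1.0:

```python
# A clause is a list of signed 1-based variable indices (DIMACS-style):
# positive = plain literal, negative = negated literal.

def sat_fitness(assignment, clauses):
    """Fraction of CNF clauses satisfied by a boolean assignment
    (assignment[i] is the value of variable i+1)."""
    satisfied = 0
    for clause in clauses:
        for lit in clause:
            value = assignment[abs(lit) - 1]
            if (lit > 0 and value) or (lit < 0 and not value):
                satisfied += 1  # one true literal satisfies the clause
                break
    return satisfied / len(clauses)

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
clauses = [[1, 2], [-1, 3], [-2, -3]]

full = sat_fitness([True, False, True], clauses)     # -> 1.0 (all clauses satisfied)
partial = sat_fitness([True, True, True], clauses)   # -> 2/3 (last clause violated)
```

Under this fitness, a polynomial-time reduction from, say, Hamiltonian circuit to SAT lets the same GA attack the original problem: evolve assignments, and any individual reaching fitness 1.0 decodes to a solution.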