Results 1 - 10 of 50
Population-Based Incremental Learning: A Method for Integrating Genetic Search Based Function Optimization and Competitive Learning
, 1994
Abstract

Cited by 298 (11 self)
Genetic algorithms (GAs) are biologically motivated adaptive systems which have been used, with varying degrees of success, for function optimization. In this study, an abstraction of the basic genetic algorithm, the Equilibrium Genetic Algorithm (EGA), and the GA in turn, are reconsidered within the framework of competitive learning. This new perspective reveals a number of different possibilities for performance improvements. This paper explores population-based incremental learning (PBIL), a method of combining the mechanisms of a generational genetic algorithm with simple competitive learning. The combination of these two methods yields a tool which is far simpler than a GA, and which outperforms a GA on a large set of optimization problems in terms of both speed and accuracy. This paper presents an empirical analysis of where the proposed technique will outperform genetic algorithms, and describes a class of problems in which a genetic algorithm may be able to perform better. Extensions to this algorithm are discussed and analyzed. PBIL and extensions are compared with a standard GA on twelve problems, including standard numerical optimization functions, traditional GA test suite problems, and NP-Complete problems.
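The abstract describes PBIL's core idea: replace the population with a probability vector, sample candidate solutions from it, and nudge the vector toward the best sample. A minimal sketch, assuming a single-winner update and illustrative parameter values (not the paper's exact settings):

```python
import random

def pbil(fitness, n_bits, pop_size=50, lr=0.1, generations=100, seed=0):
    """Minimal PBIL sketch: evolve a probability vector instead of a population."""
    rng = random.Random(seed)
    p = [0.5] * n_bits                      # probability of a 1 at each position
    best, best_fit = None, float("-inf")
    for _ in range(generations):
        # Sample a population from the probability vector.
        pop = [[1 if rng.random() < p[i] else 0 for i in range(n_bits)]
               for _ in range(pop_size)]
        winner = max(pop, key=fitness)
        f = fitness(winner)
        if f > best_fit:
            best, best_fit = winner, f
        # Shift the vector toward the best sample (the competitive-learning update).
        for i in range(n_bits):
            p[i] = (1 - lr) * p[i] + lr * winner[i]
    return best, best_fit

# One-max: maximize the number of 1s in a 20-bit string.
best, fit = pbil(sum, n_bits=20)
```

There is no crossover or mutation here, which is what makes the method "far simpler than a GA": all inherited structure lives in the single probability vector.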
Genetic Algorithms, Noise, and the Sizing of Populations
 COMPLEX SYSTEMS
, 1991
Abstract

Cited by 239 (85 self)
This paper considers the effect of stochasticity on the quality of convergence of genetic algorithms (GAs). In many problems, the variance of building-block fitness or so-called collateral noise is the major source of variance, and a population-sizing equation is derived to ensure that average signal-to-collateral-noise ratios are favorable to the discrimination of the best building blocks required to solve a problem of bounded deception. The sizing relation is modified to permit the inclusion of other sources of stochasticity, such as the noise of selection, the noise of genetic operators, and the explicit noise or nondeterminism of the objective function. In a test suite of five functions, the sizing relation proves to be a conservative predictor of average correct convergence, as long as all major sources of noise are considered in the sizing calculation. These results suggest how the sizing equation may be viewed as a coarse delineation of a boundary between what a physicist might call two distinct phases of GA behavior. At low population sizes the GA makes many errors of decision, and the quality of convergence is largely left to the vagaries of chance or the serial fixup of flawed results through mutation or other serial injection of diversity. At large population sizes, GAs can reliably discriminate between good and bad building blocks, and parallel processing and recombination of building blocks lead to quick solution of even difficult deceptive problems. Additionally, the paper outlines a number of extensions to this work, including the development of more refined models of the relation between generational average error and ultimate convergence quality, the development of online methods for sizing populations via the estimation of populations...
Fitness Distance Correlation as a Measure of Problem Difficulty for Genetic Algorithms
 Proceedings of the Sixth International Conference on Genetic Algorithms
, 1995
Abstract

Cited by 204 (5 self)
A measure of search difficulty, fitness distance correlation (FDC), is introduced and examined in relation to genetic algorithm (GA) performance. In many cases, this correlation can be used to predict the performance of a GA on problems with known global maxima. It correctly classifies easy deceptive problems as easy and difficult non-deceptive problems as difficult, indicates when Gray coding will prove better than binary coding, and is consistent with the surprises encountered when GAs were used on the Tanese and royal road functions. The FDC measure is a consequence of an investigation into the connection between GAs and heuristic search.

1 INTRODUCTION

A correspondence between evolutionary algorithms and heuristic state space search is developed in (Jones, 1995b). This is based on a model of fitness landscapes as directed, labeled graphs that are closely related to the state spaces employed in heuristic search. We examine one aspect of this correspondence, the relationship between...
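FDC as described is just the Pearson correlation between fitness and distance to a known global optimum, computed over a sample of points. A small sketch using Hamming distance (the distance metric and sampling scheme here are illustrative choices, not necessarily the paper's):

```python
import statistics
from itertools import product

def fdc(points, fitness, optimum):
    """Fitness distance correlation: Pearson correlation between fitness
    and Hamming distance to a known global optimum."""
    fits = [fitness(x) for x in points]
    dists = [sum(a != b for a, b in zip(x, optimum)) for x in points]
    mf, md = statistics.mean(fits), statistics.mean(dists)
    cov = sum((f - mf) * (d - md) for f, d in zip(fits, dists)) / len(points)
    return cov / (statistics.pstdev(fits) * statistics.pstdev(dists))

# On one-max, fitness falls linearly with distance to the all-ones optimum,
# so the correlation is exactly -1 (the "easy for maximization" extreme).
pts = list(product([0, 1], repeat=4))
r = fdc(pts, sum, (1, 1, 1, 1))
```

For maximization, values near -1 suggest an easy landscape (fitness rises as you approach the optimum) and values near +1 suggest a misleading one.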
Niching Methods for Genetic Algorithms
, 1995
Abstract

Cited by 191 (1 self)
Niching methods extend genetic algorithms to domains that require the location and maintenance of multiple solutions. Such domains include classification and machine learning, multimodal function optimization, multiobjective function optimization, and simulation of complex and adaptive systems. This study presents a comprehensive treatment of niching methods and the related topic of population diversity. Its purpose is to analyze existing niching methods and to design improved niching methods. To achieve this purpose, it first develops a general framework for the modelling of niching methods, and then applies this framework to construct models of individual niching methods, specifically crowding and sharing methods. Using a constructed model of crowding, this study determines why crowding methods over the last two decades have not made effective niching methods. A series of tests and design modifications results in the development of a highly effective form of crowding, called determin...
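The abstract's truncated final phrase points at a highly effective form of crowding. One well-known member of that family pairs each offspring with its most similar parent and lets the fitter of the two survive; the sketch below is one plausible reading of that scheme, with generic crossover, mutation, and distance callables supplied by the caller:

```python
import random

def deterministic_crowding(pop, fitness, crossover, mutate, distance,
                           generations, seed=0):
    """Crowding-style niching sketch: each offspring competes only against
    the more similar of its two parents, which preserves multiple niches."""
    rng = random.Random(seed)
    pop = list(pop)
    for _ in range(generations):
        rng.shuffle(pop)
        for i in range(0, len(pop) - 1, 2):
            p1, p2 = pop[i], pop[i + 1]
            c1, c2 = crossover(p1, p2, rng)
            c1, c2 = mutate(c1, rng), mutate(c2, rng)
            # Pair each child with its nearest parent; the fitter survives.
            if (distance(p1, c1) + distance(p2, c2)
                    <= distance(p1, c2) + distance(p2, c1)):
                pairs = [(p1, c1), (p2, c2)]
            else:
                pairs = [(p1, c2), (p2, c1)]
            pop[i], pop[i + 1] = (c if fitness(c) >= fitness(p) else p
                                  for p, c in pairs)
    return pop

def one_point(a, b, rng):
    """One-point crossover on tuples of bits."""
    k = rng.randrange(1, len(a))
    return a[:k] + b[k:], b[:k] + a[k:]

def flip_one(x, rng):
    """Flip a single random bit."""
    i = rng.randrange(len(x))
    return x[:i] + (1 - x[i],) + x[i + 1:]

def hamming(a, b):
    return sum(u != v for u, v in zip(a, b))
```

Because losers are replaced only by similar winners, subpopulations around different optima can coexist instead of one peak taking over the whole population.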
Tackling Real-Coded Genetic Algorithms: Operators and Tools for Behavioural Analysis
 Artificial Intelligence Review
, 1998
Abstract

Cited by 123 (24 self)
Genetic algorithms play a significant role, as search techniques for handling complex spaces, in many fields such as artificial intelligence, engineering, robotics, etc. Genetic algorithms are based on the underlying genetic process in biological organisms and on the natural evolution principles of populations. These algorithms process a population of chromosomes, which represent search space solutions, with three operations: selection, crossover and mutation. Under its initial formulation, the search space solutions are coded using the binary alphabet. However, the good properties of these algorithms do not stem from the use of this alphabet; other coding types have been considered for the representation issue, such as real coding, which would seem particularly natural when tackling optimization problems of parameters with variables in continuous domains. In this paper we review the features of real-coded genetic algorithms. Different models of genetic operators and some me...
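Among the real-coded crossover operators commonly reviewed in this line of work is blend crossover (BLX-alpha), where each child gene is drawn uniformly from the parents' interval widened by a factor alpha on both sides. A minimal sketch (the default alpha = 0.5 is a conventional choice, not a prescription from the paper):

```python
import random

def blx_alpha(x, y, alpha=0.5, rng=random):
    """BLX-alpha crossover sketch for real-coded GAs: each child gene is
    sampled uniformly from [min - alpha*span, max + alpha*span], where
    span is the distance between the two parent genes."""
    child = []
    for a, b in zip(x, y):
        lo, hi = min(a, b), max(a, b)
        span = hi - lo
        child.append(rng.uniform(lo - alpha * span, hi + alpha * span))
    return child

child = blx_alpha([0.0, 0.0], [1.0, 1.0], alpha=0.5, rng=random.Random(1))
```

Widening the interval lets offspring explore slightly outside the parents' hyper-rectangle, which counteracts the premature shrinking of the search range that pure interpolation causes.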
Massive Multimodality, Deception, and Genetic Algorithms
, 1992
Abstract

Cited by 113 (25 self)
This paper considers the use of genetic algorithms (GAs) for the solution of problems that are both average-sense misleading (deceptive) and massively multimodal. An archetypical multimodal-deceptive problem, here called a bipolar deceptive problem, is defined and two generalized constructions of such problems are reviewed, one using reflected trap functions and one using low-order Walsh coefficients; sufficient conditions for bipolar deception are also reviewed. The Walsh construction is then used to form a 30-bit, order-six bipolar-deceptive function by concatenating five, six-bit bipolar functions. This test function, with over five million local optima and 32 global optima, poses a difficult challenge to simple and niched GAs alike. Nonetheless, simulations show that a simple GA can reliably find one of the 32 global optima if appropriate signal-to-noise-ratio population sizing is adopted. Simulations also demonstrate that a niched GA can reliably and simultaneously find all 32 global solutions if the population is roughly sized for the expected niche distribution and if the function is appropriately scaled to emphasize global solutions at the expense of suboptimal ones. These results immediately recommend the application of niched GAs using appropriate population sizing and scaling. They also suggest a number of avenues for generalizing the notion of deception.
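A bipolar deceptive subfunction of the kind described depends only on the unitation u (the number of ones): both complementary extremes are global optima, while low-order statistics pull the search toward the middle. The exact fitness values below are illustrative, a folded-trap shape rather than the paper's specific Walsh construction, but the structure matches the abstract: five concatenated six-bit subfunctions give 2^5 = 32 global optima.

```python
from itertools import product

def bipolar_six(bits):
    """Six-bit bipolar deceptive subfunction (illustrative folded-trap values):
    d = |u - 3| is 3 only for 000000 and 111111, the two global optima;
    strings with u = 3 (d = 0) form the deceptive local attractor."""
    d = abs(sum(bits) - 3)
    return 3 if d == 3 else 2 - d

def bipolar_30(bits):
    """Concatenate five six-bit subfunctions; any string whose five blocks
    are each all-zeros or all-ones reaches the global maximum of 15."""
    return sum(bipolar_six(bits[i:i + 6]) for i in range(0, 30, 6))
```

Each block contributes exactly 2 global configurations, so the 30-bit function has 32 global optima, while the many u = 3 block combinations supply the huge count of deceptive local optima.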
Theoretical and Numerical Constraint-Handling Techniques used with Evolutionary Algorithms: A Survey of the State of the Art
, 2002
Abstract

Cited by 100 (21 self)
This paper provides a comprehensive survey of the most popular constraint-handling techniques currently used with evolutionary algorithms. We review approaches that go from simple variations of a penalty function, to others, more sophisticated, that are biologically inspired by emulations of the immune system, culture or ant colonies. Besides describing briefly each of these approaches (or groups of techniques), we provide some criticism regarding their highlights and drawbacks. A small comparative study is also conducted, in order to assess the performance of several penalty-based approaches with respect to a dominance-based technique proposed by the author, and with respect to some mathematical programming approaches. Finally, we provide some guidelines regarding how to select the most appropriate constraint-handling technique for a certain application, and we conclude with some of the most promising paths of future research in this area.
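The simplest family the survey covers, the static penalty function, folds constraint violations into the objective so an unconstrained evolutionary algorithm can be applied directly. A minimal sketch for minimization (the quadratic violation term and the fixed weight are common conventions, not the survey's specific recommendation):

```python
def static_penalty(objective, constraints, weight=1000.0):
    """Static penalty sketch: add a weighted sum of squared constraint
    violations to the objective. Each g in `constraints` returns g(x),
    with g(x) <= 0 meaning the constraint is satisfied."""
    def penalized(x):
        violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return objective(x) + weight * violation
    return penalized

# Minimize x^2 subject to x >= 1, written as g(x) = 1 - x <= 0.
f = static_penalty(lambda x: x * x, [lambda x: 1.0 - x])
```

Feasible points pay no penalty, infeasible ones are pushed back toward the feasible region; the survey's criticism of this approach centers on how sensitive performance is to the choice of weight.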
What Makes a Problem Hard for a Genetic Algorithm? Some Anomalous Results and Their Explanation
 Machine Learning
, 1993
Abstract

Cited by 100 (3 self)
What makes a problem easy or hard for a genetic algorithm (GA)? This question has become increasingly important as people have tried to apply the GA to ever more diverse types of problems. Much previous work on this question has studied the relationship between GA performance and the structure of a given fitness function when it is expressed as a Walsh polynomial. The work of Bethke, Goldberg, and others has produced certain theoretical results about this relationship. In this article we review these theoretical results, and then discuss a number of seemingly anomalous experimental results reported by Tanese concerning the performance of the GA on a subclass of Walsh polynomials, some members of which were expected to be easy for the GA to optimize. Tanese found that the GA was poor at optimizing all functions in this subclass, that a partitioning of a single large population into a number of smaller independent populations seemed to improve performance, and that hillclimbing outperformed both the original and partitioned forms of the GA on these functions. These results seemed to contradict several commonly held expectations about GAs. We begin by reviewing schema processing in GAs. We then give an informal description of how Walsh analysis and Bethke's Walsh-schema transform relate to GA performance, and we discuss the relevance of this analysis for GA applications in optimization and machine learning. We then describe Tanese's surprising results, examine them experimentally and theoretically, and propose and evaluate some explanations. These explanations lead to a more fundamental question about GAs: what are the features of problems that determine the likelihood of successful GA performance?
Fundamental Principles of Deception in Genetic Search
 Foundations of Genetic Algorithms
, 1991
Abstract

Cited by 95 (5 self)
This paper presents several theorems concerning the nature of deception and the central role that deception plays in function optimization using genetic algorithms. A simple proof is offered which shows that the only problems which pose challenging optimization tasks are problems that involve some degree of deception and which result in conflicting k-armed bandit competitions between hyperplanes. The concept of a deceptive attractor is introduced and shown to be more general than the deceptive optimum found in the deceptive functions that have been constructed to date. Also introduced are the concepts of fully deceptive problems as well as less strict consistently deceptive problems. A proof is given showing that deceptive attractors must have a complementary bit pattern to that found in the binary representation of the global optimum if a function is to be either fully deceptive or consistently deceptive. Some empirical results are presented which demonstrate different methods of dealing with deception and poor linkage during genetic search.
The Troubling Aspects of a Building Block Hypothesis for Genetic Programming
, 1992
Abstract

Cited by 82 (2 self)
In this paper we carefully formulate a Schema Theorem for Genetic Programming (GP) using a schema definition that accounts for the variable length and the non-homologous nature of GP's representation. In a manner similar to early GA research, we use interpretations of our GP Schema Theorem to obtain a GP Building Block definition and to state a "classical" Building Block Hypothesis (BBH): that GP searches by hierarchically combining building blocks. We report that this approach is not convincing for several reasons: it is difficult to find support for the promotion and combination of building blocks solely by rigorous interpretation of a GP Schema Theorem; even if there were such support for a BBH, it is empirically questionable whether building blocks always exist because partial solutions of consistently above average fitness and resilience to disruption are not assured; also, a BBH constitutes a narrow and imprecise account of GP search behavior.