Results 1-10 of 202
Population-Based Incremental Learning: A Method for Integrating Genetic Search Based Function Optimization and Competitive Learning
, 1994
Cited by 298 (11 self)
Abstract:
Genetic algorithms (GAs) are biologically motivated adaptive systems which have been used, with varying degrees of success, for function optimization. In this study, an abstraction of the basic genetic algorithm, the Equilibrium Genetic Algorithm (EGA), and the GA in turn, are reconsidered within the framework of competitive learning. This new perspective reveals a number of different possibilities for performance improvements. This paper explores population-based incremental learning (PBIL), a method of combining the mechanisms of a generational genetic algorithm with simple competitive learning. The combination of these two methods yields a tool which is far simpler than a GA, and which outperforms a GA on a large set of optimization problems in terms of both speed and accuracy. This paper presents an empirical analysis of where the proposed technique will outperform genetic algorithms, and describes a class of problems in which a genetic algorithm may be able to perform better. Extensions to this algorithm are discussed and analyzed. PBIL and extensions are compared with a standard GA on twelve problems, including standard numerical optimization functions, traditional GA test suite problems, and NP-Complete problems.
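The PBIL loop described in this abstract (sample bit-strings from a probability vector, then shift the vector toward the best sample) can be sketched in a few lines. The learning rate, population size, and OneMax objective below are illustrative choices, not the paper's exact settings.

```python
import random

def pbil(fitness, n_bits, pop_size=20, lr=0.1, generations=100):
    """Minimal PBIL sketch: a probability vector replaces the population."""
    p = [0.5] * n_bits                      # maximum-entropy start
    best, best_f = None, float("-inf")
    for _ in range(generations):
        samples = [[1 if random.random() < pi else 0 for pi in p]
                   for _ in range(pop_size)]
        winner = max(samples, key=fitness)  # competitive-learning "winner"
        # Nudge the probability vector toward the winning bit-string.
        p = [pi * (1 - lr) + lr * bi for pi, bi in zip(p, winner)]
        f = fitness(winner)
        if f > best_f:
            best, best_f = winner, f
    return best, best_f

# OneMax (count of ones) as an illustrative objective.
solution, value = pbil(sum, n_bits=16)
```

The extensions the abstract mentions include, among others, also shifting the vector away from poor samples and adding small random perturbations to the vector itself.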
Genetic Algorithms, Noise, and the Sizing of Populations
 COMPLEX SYSTEMS
, 1991
Cited by 239 (85 self)
Abstract:
This paper considers the effect of stochasticity on the quality of convergence of genetic algorithms (GAs). In many problems, the variance of building-block fitness or so-called collateral noise is the major source of variance, and a population-sizing equation is derived to ensure that average signal-to-collateral-noise ratios are favorable to the discrimination of the best building blocks required to solve a problem of bounded deception. The sizing relation is modified to permit the inclusion of other sources of stochasticity, such as the noise of selection, the noise of genetic operators, and the explicit noise or nondeterminism of the objective function. In a test suite of five functions, the sizing relation proves to be a conservative predictor of average correct convergence, as long as all major sources of noise are considered in the sizing calculation. These results suggest how the sizing equation may be viewed as a coarse delineation of a boundary between what a physicist might call two distinct phases of GA behavior. At low population sizes the GA makes many errors of decision, and the quality of convergence is largely left to the vagaries of chance or the serial fixup of flawed results through mutation or other serial injection of diversity. At large population sizes, GAs can reliably discriminate between good and bad building blocks, and parallel processing and recombination of building blocks lead to quick solution of even difficult deceptive problems. Additionally, the paper outlines a number of extensions to this work, including the development of more refined models of the relation between generational average error and ultimate convergence quality, the development of online methods for sizing populations via the estimation of populations...
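The signal-to-noise argument behind the sizing relation can be illustrated numerically: to discriminate building blocks whose mean fitness differs by a signal d against collateral noise of variance sigma^2, the required population grows with sigma^2/d^2. The derivation and constants below are a simplified two-sample sketch of that scaling, not the paper's exact equation.

```python
from statistics import NormalDist

def conservative_pop_size(signal_d, collateral_var, alpha=0.01):
    """Illustrative signal-to-noise sizing (not the paper's exact relation).

    Comparing two competing building blocks from n samples each, the
    estimated mean fitness difference has variance about
    2 * collateral_var / n.  Keeping the decision-error probability
    below alpha requires
        z_alpha * sqrt(2 * collateral_var / n) <= signal_d,
    which, solved for n, gives the bound returned here.
    """
    z = NormalDist().inv_cdf(1 - alpha)  # one-sided normal quantile
    return int(2 * z * z * collateral_var / signal_d ** 2) + 1
```

Doubling the noise standard deviation quadruples the size, and halving the signal does the same, which is the qualitative content of the sizing relation.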
Niching Methods for Genetic Algorithms
, 1995
Cited by 191 (1 self)
Abstract:
Niching methods extend genetic algorithms to domains that require the location and maintenance of multiple solutions. Such domains include classification and machine learning, multimodal function optimization, multiobjective function optimization, and simulation of complex and adaptive systems. This study presents a comprehensive treatment of niching methods and the related topic of population diversity. Its purpose is to analyze existing niching methods and to design improved niching methods. To achieve this purpose, it first develops a general framework for the modelling of niching methods, and then applies this framework to construct models of individual niching methods, specifically crowding and sharing methods. Using a constructed model of crowding, this study determines why crowding methods over the last two decades have not made effective niching methods. A series of tests and design modifications results in the development of a highly effective form of crowding, called determin...
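As an illustration of the crowding family this study analyzes, here is a sketch of one generation of deterministic crowding for binary strings, in which each offspring competes only against its more similar parent; Hamming distance is an illustrative similarity measure, and the pairing rule is the standard one, not necessarily the paper's final design.

```python
import random

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def deterministic_crowding_step(pop, fitness, crossover):
    """One generation of deterministic crowding (sketch).

    Parents are paired at random and produce offspring; each offspring
    then duels the closer parent, so winners in different niches rarely
    displace one another and multiple optima can be maintained.
    """
    random.shuffle(pop)
    next_pop = []
    for i in range(0, len(pop) - 1, 2):
        p1, p2 = pop[i], pop[i + 1]
        c1, c2 = crossover(p1, p2)
        # Match each child with the more similar parent before the duel.
        if hamming(p1, c1) + hamming(p2, c2) <= hamming(p1, c2) + hamming(p2, c1):
            pairs = [(p1, c1), (p2, c2)]
        else:
            pairs = [(p1, c2), (p2, c1)]
        for parent, child in pairs:
            next_pop.append(child if fitness(child) >= fitness(parent) else parent)
    return next_pop
```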
Multi-Objective Genetic Algorithms: Problem Difficulties and Construction of Test Problems
 Evolutionary Computation
, 1999
Cited by 153 (12 self)
Abstract:
In this paper, we study the problem features that may cause a multi-objective genetic algorithm (GA) difficulty in converging to the true Pareto-optimal front. Identification of such features helps us develop difficult test problems for multi-objective optimization. Multi-objective test problems are constructed from single-objective optimization problems, thereby allowing known difficult features of single-objective problems (such as multimodality, isolation, or deception) to be directly transferred to the corresponding multi-objective problem. In addition, test problems having features specific to multi-objective optimization are also constructed. More importantly, these difficult test problems will enable researchers to test their algorithms for specific aspects of multi-objective optimization.
Keywords: genetic algorithms, multi-objective optimization, niching, Pareto-optimality, problem difficulties, test problems.
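The construction described here, building a two-objective problem from single-objective components, is often written as minimizing f1(x1) together with f2 = g(x2,...,xn) * h(f1, g), where g controls convergence difficulty and h shapes the front. A minimal sketch, with an illustrative ZDT1-like choice of g and h rather than one of the paper's exact instances:

```python
import math

def make_two_objective(f1, g, h):
    """Build a two-objective (minimization) test problem from
    single-objective pieces: f1 fixes position along the front,
    g controls convergence, and h shapes the front."""
    def f(x):
        o1 = f1(x[0])
        gv = g(x[1:])
        return o1, gv * h(o1, gv)
    return f

# Illustrative ZDT1-like instance: the Pareto-optimal front lies at g = 1,
# i.e. where all of x[1:] are zero, and it is convex.
problem = make_two_objective(
    f1=lambda x1: x1,
    g=lambda rest: 1 + 9 * sum(rest) / max(len(rest), 1),
    h=lambda o1, gv: 1 - math.sqrt(o1 / gv),
)
```

Swapping in a multimodal or deceptive g transfers that single-objective difficulty directly into the multi-objective problem, which is the point of the construction.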
Massive Multimodality, Deception, and Genetic Algorithms
, 1992
Cited by 113 (25 self)
Abstract:
This paper considers the use of genetic algorithms (GAs) for the solution of problems that are both average-sense misleading (deceptive) and massively multimodal. An archetypical multimodal-deceptive problem, here called a bipolar deceptive problem, is defined and two generalized constructions of such problems are reviewed, one using reflected trap functions and one using low-order Walsh coefficients; sufficient conditions for bipolar deception are also reviewed. The Walsh construction is then used to form a 30-bit, order-six bipolar-deceptive function by concatenating five six-bit bipolar functions. This test function, with over five million local optima and 32 global optima, poses a difficult challenge to simple and niched GAs alike. Nonetheless, simulations show that a simple GA can reliably find one of the 32 global optima if appropriate signal-to-noise-ratio population sizing is adopted. Simulations also demonstrate that a niched GA can reliably and simultaneously find all 32 global solutions if the population is roughly sized for the expected niche distribution and if the function is appropriately scaled to emphasize global solutions at the expense of suboptimal ones. These results immediately recommend the application of niched GAs using appropriate population sizing and scaling. They also suggest a number of avenues for generalizing the notion of deception.
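A concatenated trap construction of the kind reviewed here can be sketched directly on unitation (the number of ones in a block). The bipolar variant below, which folds a trap about u = k/2 so that both the all-zeros and all-ones blocks are globally optimal, is an illustrative instance and not necessarily the paper's exact Walsh-based function.

```python
def trap(u, k):
    """Fully deceptive trap on the unitation u of a k-bit block:
    global optimum at u = k, deceptive slope pointing toward u = 0."""
    return k if u == k else k - 1 - u

def bipolar(u, k):
    """Illustrative bipolar block (k even): folding the trap about
    u = k/2 makes both all-zeros and all-ones globally optimal,
    with the deceptive attractor at u = k/2."""
    return trap(abs(2 * u - k) // 2, k // 2)

def concatenated(bits, k, block):
    """Sum a block function over consecutive k-bit substrings."""
    return sum(block(sum(bits[i:i + k]), k) for i in range(0, len(bits), k))
```

A 30-bit function built by concatenating five such six-bit blocks has 2^5 = 32 global optima, matching the count quoted in the abstract.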
The Science of Breeding and its Application to the Breeder Genetic Algorithm (BGA)
 EVOLUTIONARY COMPUTATION
, 1994
Cited by 100 (23 self)
Abstract:
The Breeder Genetic Algorithm BGA models artificial selection as performed by human breeders. The science of breeding is based on advanced statistical methods. In this paper a connection between genetic algorithm theory and the science of breeding is made. We show how the response to selection equation and the concept of heritability can be applied to predict the behavior of the BGA. Selection, recombination and mutation are analyzed within this framework. It is shown that recombination and mutation are complementary search operators. The theoretical results are obtained under the assumption of additive gene effects. For general fitness landscapes regression techniques for estimating the heritability are used to analyze and control the BGA. The method of decomposing the genetic variance into an additive and a nonadditive part connects the case of additive fitness functions with the general case.
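The response-to-selection equation referred to above is, in its classical textbook form, the breeder's equation (the paper develops GA-specific refinements of it):

```latex
R = h^2 S, \qquad h^2 = \frac{\sigma_A^2}{\sigma_P^2}
```

where \(R\) is the change in mean fitness from one generation to the next, \(S\) is the selection differential (mean fitness of the selected parents minus the population mean), and the heritability \(h^2\) is the fraction of the fitness variance \(\sigma_P^2\) that is additive, \(\sigma_A^2\). The decomposition of variance into additive and non-additive parts mentioned at the end of the abstract is exactly what makes \(h^2\) estimable for general fitness landscapes.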
Evaluating Evolutionary Algorithms
 Artificial Intelligence
, 1996
Cited by 86 (14 self)
Abstract:
Test functions are commonly used to evaluate the effectiveness of different search algorithms. However, the results of evaluation are as dependent on the test problems as they are on the algorithms that are the subject of comparison. Unfortunately, developing a test suite for evaluating competing search algorithms is difficult without clearly defined evaluation goals. In this paper we discuss some basic principles that can be used to develop test suites and we examine the role of test suites as they have been used to evaluate evolutionary search algorithms. Current test suites include functions that are easily solved by simple search methods such as greedy hill-climbers. Some test functions also have undesirable characteristics that are exaggerated as the dimensionality of the search space is increased. New methods are examined for constructing functions with different degrees of nonlinearity, where the interactions and the cost of evaluation scale with respect to the dimensionality of...
An Updated Survey of GA-Based Multiobjective Optimization Techniques
 ACM Computing Surveys
, 1998
Cited by 77 (1 self)
Abstract:
The purpose of this paper is to summarize and organize the information on these current approaches, emphasizing the importance of analyzing the Operations Research techniques on which most of them are based, in an attempt to motivate researchers to look into these mathematical programming approaches for new ways of exploiting the search capabilities of evolutionary algorithms. Furthermore, a summary of the main algorithms behind these approaches is provided, together with a brief criticism that includes their advantages and disadvantages, their degree of applicability, and some of their known applications. Finally, the future trends in this area and some possible paths of further research are also addressed.
Parallel Recombinative Simulated Annealing: A Genetic Algorithm
, 1995
Cited by 74 (3 self)
Abstract:
This paper introduces and analyzes a parallel method of simulated annealing. Borrowing from genetic algorithms, an effective combination of simulated annealing and genetic algorithms, called parallel recombinative simulated annealing, is developed. This new algorithm strives to retain the desirable asymptotic convergence properties of simulated annealing, while adding the population approach and recombinative power of genetic algorithms. The algorithm iterates a population of solutions rather than a single solution, employing a binary recombination operator as well as a unary neighborhood operator. Proofs of global convergence are given for two variations of the algorithm. Convergence behavior is examined, and empirical distributions are compared to Boltzmann distributions. Parallel recombinative simulated annealing is amenable to straightforward implementation on SIMD, MIMD, or shared-memory machines. The algorithm, implemented on the CM-5, is run repeatedly on two deceptive problems...
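One generation of the algorithm can be sketched as recombination and mutation followed by a Metropolis-style acceptance test between each parent pair and its offspring pair. The joint pairwise acceptance rule below is one common variant, not necessarily the exact one whose convergence the paper proves.

```python
import math
import random

def prsa_step(pop, energy, crossover, mutate, T):
    """One generation of parallel recombinative simulated annealing (sketch).

    Each parent pair produces a mutated offspring pair; the offspring
    replace the parents with the Metropolis probability, so at fixed T
    the population is driven toward a Boltzmann distribution.
    """
    random.shuffle(pop)
    out = []
    for i in range(0, len(pop) - 1, 2):
        p1, p2 = pop[i], pop[i + 1]
        c1, c2 = crossover(p1, p2)
        c1, c2 = mutate(c1), mutate(c2)
        # Joint acceptance: compare the pair's total energies.
        dE = energy(c1) + energy(c2) - energy(p1) - energy(p2)
        if dE <= 0 or random.random() < math.exp(-dE / T):
            out += [c1, c2]
        else:
            out += [p1, p2]
    return out
```

Lowering T over successive calls recovers the usual annealing schedule; holding T fixed lets the empirical population distribution be compared against the Boltzmann distribution, as in the paper's experiments.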
Fitness Landscape Analysis and Memetic Algorithms for the Quadratic Assignment Problem
, 1999
Cited by 62 (9 self)
Abstract:
In this paper, a fitness landscape analysis for several instances of the quadratic assignment problem (QAP) is performed and the results are used to classify problem instances according to their hardness for local search heuristics and metaheuristics based on local search. The local properties of the fitness landscape are studied by performing an autocorrelation analysis, while the global structure is investigated by employing a fitness distance correlation analysis. It is shown that epistasis, as expressed by the dominance of the flow and distance matrices of a QAP instance, the landscape ruggedness in terms of the correlation length of a landscape, and the correlation between fitness and distance of local optima in the landscape together are useful for predicting the performance of memetic algorithms (evolutionary algorithms incorporating local search) to a certain extent. Thus, based on these properties a favorable choice of recombination and/or mutation operators can be found.
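The fitness distance correlation used in the global analysis is the Pearson correlation between sampled fitnesses (typically of local optima) and their distances to the nearest known global optimum. A self-contained sketch:

```python
def fitness_distance_correlation(fitnesses, distances):
    """Pearson correlation between sampled fitnesses and each sample's
    distance to the nearest global optimum.  For minimization, values
    near +1 indicate a landscape whose fitness gradient guides search
    toward the optimum; values near 0 or below suggest misleading
    global structure."""
    n = len(fitnesses)
    mf = sum(fitnesses) / n
    md = sum(distances) / n
    cov = sum((f - mf) * (d - md) for f, d in zip(fitnesses, distances)) / n
    sf = (sum((f - mf) ** 2 for f in fitnesses) / n) ** 0.5
    sd = (sum((d - md) ** 2 for d in distances) / n) ** 0.5
    return cov / (sf * sd)
```

For the QAP, the distance between two permutations would typically be measured as the number of positions in which they differ; that choice, like the sampling of local optima, is left to the analysis setup.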