Results 1–10 of 26
A Parameter-Less Genetic Algorithm
 IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION
, 1999
"... From the user's point of view, setting the parameters of a genetic algorithm (GA) is far from a trivial task. Moreover, the user is typically not interested in population sizes, crossover probabilities, selection rates, and other GA technicalities. He is just interested in solving a probl ..."
Abstract

Cited by 279 (34 self)
From the user's point of view, setting the parameters of a genetic algorithm (GA) is far from a trivial task. Moreover, the user is typically not interested in population sizes, crossover probabilities, selection rates, and other GA technicalities. He is just interested in solving a problem, and what he would really like to do is to hand the problem to a black-box algorithm and simply press a start button. This paper explores the development of a GA that fulfills this requirement: it has no parameters whatsoever. The development of the algorithm takes into account several aspects of GA theory, including previous research on population sizing, the schema theorem, building-block mixing, and genetic drift.
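The abstract does not spell out the mechanism, but the spirit of removing the population-size parameter, racing several populations of doubling size while giving smaller ones proportionally more time, can be illustrated with a counter-style schedule. The function `next_population` and the base-2 interleaving below are our illustrative assumptions, not the paper's exact algorithm:

```python
def next_population(step, base=2):
    """Hypothetical scheduler: given populations indexed 0, 1, 2, ...
    (each assumed twice the size of the previous), return which one
    advances one generation at this step, so that population i runs
    `base` times as often as population i + 1."""
    i = 0
    while step % base == base - 1:
        step //= base
        i += 1
    return i

# The first eight steps interleave the populations as 0,1,0,2,0,1,0,3:
# the smallest population gets the most generations.
order = [next_population(t) for t in range(8)]
```

A driver built on such a schedule could then discard a small population once a larger one overtakes its average fitness, which is one plausible way to realize the "press a start button" behavior the abstract describes.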
Niching Methods for Genetic Algorithms
, 1995
"... Niching methods extend genetic algorithms to domains that require the location and maintenance of multiple solutions. Such domains include classification and machine learning, multimodal function optimization, multiobjective function optimization, and simulation of complex and adaptive systems. This ..."
Abstract

Cited by 232 (1 self)
Niching methods extend genetic algorithms to domains that require the location and maintenance of multiple solutions. Such domains include classification and machine learning, multimodal function optimization, multiobjective function optimization, and simulation of complex and adaptive systems. This study presents a comprehensive treatment of niching methods and the related topic of population diversity. Its purpose is to analyze existing niching methods and to design improved ones. To this end, it first develops a general framework for modelling niching methods, and then applies this framework to construct models of individual niching methods, specifically crowding and sharing. Using a constructed model of crowding, the study determines why crowding methods of the past two decades have not been effective niching methods. A series of tests and design modifications results in a highly effective form of crowding, called determin...
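Of the two method families named above, sharing is easy to sketch: each individual's raw fitness is divided by its niche count, the sum of a sharing kernel over its distances to the rest of the population. The triangular kernel and the `sigma_share` threshold below follow the common textbook formulation, not necessarily this study's exact model:

```python
def sharing(distance, sigma_share, alpha=1.0):
    """Sharing kernel: 1 at distance 0, falling to 0 at sigma_share."""
    if distance < sigma_share:
        return 1.0 - (distance / sigma_share) ** alpha
    return 0.0

def shared_fitness(population, raw_fitness, distance, sigma_share):
    """Divide each raw fitness by its niche count (sum of sharing
    values), penalizing individuals in crowded regions."""
    result = []
    for xi in population:
        niche_count = sum(sharing(distance(xi, xj), sigma_share)
                          for xj in population)
        result.append(raw_fitness(xi) / niche_count)
    return result
```

With equal raw fitness, an isolated individual retains more shared fitness than members of a crowded niche, which is what drives the population to spread across multiple optima.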
Bayesian Optimization Algorithm: From Single Level to Hierarchy
, 2002
"... There are four primary goals of this dissertation. First, design a competent optimization algorithm capable of learning and exploiting appropriate problem decomposition by sampling and evaluating candidate solutions. Second, extend the proposed algorithm to enable the use of hierarchical decompositi ..."
Abstract

Cited by 101 (19 self)
There are four primary goals of this dissertation. First, design a competent optimization algorithm capable of learning and exploiting an appropriate problem decomposition by sampling and evaluating candidate solutions. Second, extend the proposed algorithm to enable the use of hierarchical decomposition as opposed to decomposition on only a single level. Third, design a class of difficult hierarchical problems that can be used to test algorithms that attempt to exploit hierarchical decomposition. Fourth, test the developed algorithms on the designed class of problems and several real-world applications. The dissertation proposes the Bayesian optimization algorithm (BOA), which uses Bayesian networks to model the promising solutions found so far and to sample new candidate solutions. BOA is shown, theoretically and empirically, to be capable of both learning a proper decomposition of the problem and exploiting the learned decomposition to ensure robust and scalable search for the optimum across a wide range of problems. The dissertation then identifies important features that must be incorporated into the basic BOA to solve problems that are not decomposable on a single level, but that can still be solved by decomposition over multiple levels of difficulty. Hierarchical ...
Finite Markov Chain Analysis of Genetic Algorithms with Niching
 Proceedings of the Fifth International Conference on Genetic Algorithms
, 1993
"... Finite, discretetime Markov chain models of genetic algorithms have been used successfully in the past to understand the complex dynamics of a simple GA. Markov chains can exactly model the GA by accounting for all of the stochasticity introduced by various GA operators, such as initialization, sel ..."
Abstract

Cited by 39 (7 self)
Finite, discrete-time Markov chain models of genetic algorithms have been used successfully in the past to understand the complex dynamics of a simple GA. Markov chains can exactly model the GA by accounting for all of the stochasticity introduced by various GA operators, such as initialization, selection, crossover, and mutation. Although such models quickly become unwieldy with increasing population size or genome length, they provide initial insights that guide our development of approximate, scalable models. In this study, we use Markov chains to analyze the stochastic effects of the "niching operator" of a niched GA. Specifically, we model the effect of fitness sharing on a single-locus genome. Without niching, our model is an absorbing Markov chain. With niching, we are dealing with a "quasi-ergodic" Markov chain. Rather than calculating expected times to absorption, we are interested in steady-state probabilities for positive recurrent states. Established techniques for analyzin...
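The steady-state probabilities mentioned above can be computed for any small chain of this kind as the left eigenvector of the transition matrix for eigenvalue 1. The 3×3 matrix below is a made-up stand-in, not the paper's niched-GA model, which is defined over population compositions:

```python
import numpy as np

# Illustrative 3-state transition matrix (each row sums to 1).
# Every state remains reachable, so a steady-state distribution exists.
P = np.array([
    [0.80, 0.15, 0.05],
    [0.10, 0.80, 0.10],
    [0.05, 0.15, 0.80],
])

# The steady state pi satisfies pi P = pi with sum(pi) = 1: it is the
# left eigenvector of P (an eigenvector of P.T) for eigenvalue 1, which
# is the largest eigenvalue of a stochastic matrix.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()
```

For an absorbing chain (the no-niching case in the abstract), this computation degenerates: all mass ends up on the absorbing states, which is why the authors switch to expected times to absorption there.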
Simple Analytical Models of Genetic Algorithms for Multimodal Function Optimization
 Urbana: Department of General Engineering, University of Illinois at Urbana-Champaign
, 1993
"... This paper presents simple analytical models of genetic algorithms which are commonly used in multimodal function optimization. The methodology for constructing the models is similar throughout the study. The predictive value of each model is verified by running the corresponding genetic algorithm o ..."
Abstract

Cited by 27 (0 self)
This paper presents simple analytical models of genetic algorithms which are commonly used in multimodal function optimization. The methodology for constructing the models is similar throughout the study. The predictive value of each model is verified by running the corresponding genetic algorithm on various multimodal functions of varying complexity.
Adaptive Niching via Coevolutionary Sharing
 In Genetic Algorithms and Evolution Strategy in Engineering and Computer Science (Chapter 2)
, 1997
"... An adaptive niching scheme called coevolutionary shared niching (CSN) is proposed, implemented, analyzed and tested. The scheme overcomes the limitations of fixed sharing schemes by permitting the locations and radii of niches to adapt to complex landscapes, thereby permitting a better distribution ..."
Abstract

Cited by 22 (4 self)
An adaptive niching scheme called coevolutionary shared niching (CSN) is proposed, implemented, analyzed and tested. The scheme overcomes the limitations of fixed sharing schemes by permitting the locations and radii of niches to adapt to complex landscapes, thereby permitting a better distribution of solutions in problems with many badly spaced optima. The scheme takes its inspiration from the model of monopolistic competition in economics and utilizes two populations, a population of businessmen and a population of customers, where the locations of the businessmen correspond to niche locations and the locations of customers correspond to solutions. Initial results on straightforward test functions validate the distributional effectiveness of the basic scheme, although tests on a massively multimodal function do not find the best niches in the allotted time. This result spurs the design of an imprint mechanism that turns the best customers into businessmen, thereby making better use o...
The Parameter-Less Genetic Algorithm: Rational And Automated Parameter Selection For Simplified Genetic Algorithm Operation
, 2000
"... Genetic algorithms (GAs) have been used to solve difficult optimization problems in a number of fields. One of the advantages of these algorithms is that they operate well even in domains where little is known, thus giving the GA the flavor of a general purpose problem solver. However, in order ..."
Abstract

Cited by 19 (2 self)
Genetic algorithms (GAs) have been used to solve difficult optimization problems in a number of fields. One of the advantages of these algorithms is that they operate well even in domains where little is known, thus giving the GA the flavor of a general purpose problem solver. However, in order to solve a problem with the GA, the user usually has to specify a number of parameters that have little to do with the user's problem, and have more to do with the way the GA operates. This dissertation presents a technique that greatly simplifies the GA operation by relieving the user from having to set these parameters. Instead, the parameters are set automatically by the algorithm itself. The validity of the approach is illustrated with artificial problems often used to test GA techniques, and also with a simplified version of a network expansion problem.
Genetic Algorithm in Search and Optimization: The Technique and Applications
 Proc. of Int. Workshop on Soft Computing and Intelligent Systems
, 1997
"... A genetic algorithm (GA) is a search and optimization method developed by mimicking the evolutionary principles and chromosomal processing in natural genetics. A GA begins its search with a random set of solutions usually coded in binary string structures. Every solution is assigned a fitness which ..."
Abstract

Cited by 11 (0 self)
A genetic algorithm (GA) is a search and optimization method developed by mimicking the evolutionary principles and chromosomal processing of natural genetics. A GA begins its search with a random set of solutions, usually coded as binary strings. Every solution is assigned a fitness which is directly related to the objective function of the search and optimization problem. Thereafter, the population of solutions is modified to a new population by applying three operators similar to natural genetic operators: reproduction, crossover, and mutation. A GA works iteratively, applying these three operators in each generation until a termination criterion is satisfied. Over the past decade, GAs have been successfully applied to a wide variety of problems because of their simplicity, global perspective, and inherent parallel processing. In this paper, we outline the working principle of a GA by describing these three operators and by outlining an intuitive sketch ...
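The generational loop just outlined (select, cross over, mutate, repeat until termination) can be sketched minimally. The OneMax objective, binary tournament selection, and the specific parameter values are our illustrative choices, not this paper's:

```python
import random

def run_ga(fitness, n_bits, pop_size=20, p_cross=0.9,
           generations=60, seed=0):
    """Minimal generational GA on binary strings: binary-tournament
    selection, one-point crossover, bit-flip mutation at rate 1/n_bits."""
    rng = random.Random(seed)
    p_mut = 1.0 / n_bits
    pop = [[rng.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]

    def select():
        a, b = rng.choice(pop), rng.choice(pop)
        return list(a if fitness(a) >= fitness(b) else b)

    for _ in range(generations):
        new_pop = []
        while len(new_pop) < pop_size:
            p1, p2 = select(), select()
            if rng.random() < p_cross:          # one-point crossover
                cut = rng.randrange(1, n_bits)
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (p1, p2):              # bit-flip mutation
                for i in range(n_bits):
                    if rng.random() < p_mut:
                        child[i] ^= 1
                new_pop.append(child)
        pop = new_pop[:pop_size]
    return max(pop, key=fitness)

# OneMax: the fitness of a string is simply its number of ones,
# so the all-ones string is the optimum.
best = run_ga(sum, n_bits=20)
```

Even this bare-bones loop reliably drives OneMax populations toward the all-ones string, which is why OneMax is a standard smoke test for GA implementations.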
Robust and Scalable Black-Box Optimization, Hierarchy and Ising Spin Glasses
, 2003
"... One of the most important challenges in computational optimization is the design of advanced blackbox optimization techniques that would enable automated, robust, and scalable solution to challenging optimization problems. This paper describes an advanced blackbox optimizer—the hierarchical Bayesi ..."
Abstract

Cited by 8 (1 self)
One of the most important challenges in computational optimization is the design of advanced black-box optimization techniques that would enable automated, robust, and scalable solution of challenging optimization problems. This paper describes an advanced black-box optimizer, the hierarchical Bayesian optimization algorithm (hBOA), which combines techniques of genetic and evolutionary computation, machine learning, and statistics to create a widely applicable tool for solving real-world optimization problems. The paper motivates hBOA, describes its basic procedure, and provides an in-depth empirical analysis of hBOA on the class of random 2D and 3D Ising spin glass problems. The results on Ising spin glasses indicate that even without much problem-specific knowledge, hBOA can provide competitive or better results than techniques specialized in solving the particular problem or class of problems. Furthermore, hBOA can solve a large class of nearly decomposable and hierarchical problems for which there exists no other scalable solution.