Results 11–20 of 276
Rapid, Accurate Optimization of Difficult Problems Using Fast Messy Genetic Algorithms
 PROCEEDINGS OF THE FIFTH INTERNATIONAL CONFERENCE ON GENETIC ALGORITHMS
, 1993
Abstract

Cited by 120 (23 self)
Researchers have long sought genetic algorithms (GAs) that can solve difficult search, optimization, and machine learning problems quickly. Despite years of work on simple GAs and their variants, it is still unknown how difficult a problem simple GAs can solve, how quickly they can solve it, and with what reliability. More radical design departures than these have been taken, however, and the messy GA (mGA) approach has attempted to solve problems of bounded difficulty quickly and reliably by taking the notion of building-block linkage quite seriously. Early efforts were apparently successful in achieving polynomial convergence on some difficult problems, but the initialization bottleneck that required a large initial population was thought to be the primary obstacle to faster mGA performance. This paper replaces the partially enumerative initialization and selective primordial phase of the original messy GA with probabilistically complete initialization and a primordial phase that per...
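The mGA's variable-length representation that this work builds on can be sketched in a few lines. The sketch below (illustrative Python; the function and variable names are mine, not the paper's) expresses a messy chromosome of (locus, allele) genes against a competitive template, using the mGA's first-come-first-served precedence rule for over-specified loci:

```python
def express(messy_genes, template):
    """Express a messy chromosome against a competitive template.

    messy_genes: list of (locus, allele) pairs; may under- or over-specify.
    template: fully specified reference solution supplying missing alleles.
    The first gene naming a locus wins (mGA precedence rule)."""
    out = list(template)
    seen = set()
    for locus, allele in messy_genes:
        if locus not in seen:       # later duplicates of a locus are ignored
            seen.add(locus)
            out[locus] = allele
    return out
```

Under-specified loci simply retain the template's alleles, which is how partial solutions (candidate building blocks) can be evaluated at all.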
A Framework for Evolutionary Optimization with Approximate Fitness Functions
 IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION
, 2002
Abstract

Cited by 117 (18 self)
It is a common engineering practice to use approximate models instead of the original computationally expensive model in optimization. When an approximate model is used for evolutionary optimization, the convergence properties of the evolutionary algorithm are unclear due to the approximation error. In this paper, extensive empirical studies on the convergence of an evolution strategy are carried out on two benchmark problems. It is found that incorrect convergence will occur if the approximate model has false optima. To address this problem, individual- and generation-based evolution control is introduced and the resulting effects on the convergence properties are presented. A framework for managing approximate models in generation-based evolution control is proposed. This framework is well suited for parallel evolutionary optimization and is able to guarantee the correct convergence of the evolutionary algorithm while reducing the computation cost as much as possible. Control o...
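A minimal sketch of generation-based evolution control, assuming a hypothetical 1-D problem and a surrogate with deliberately wrong local optima (all names and both test functions are illustrative, not from the paper):

```python
import math
import random

def expensive_fitness(x):
    # Stand-in for the costly original model (assumed 1-D sphere, optimum at 3).
    return -(x - 3.0) ** 2

def surrogate_fitness(x):
    # Cheap approximate model with deliberate error, including false local
    # optima away from x = 3 (hypothetical surrogate).
    return -(x - 3.0) ** 2 + 1.5 * math.sin(5.0 * x)

def evolve(generations=30, pop_size=20, control_period=5, seed=1):
    """Generation-based evolution control: every `control_period`-th
    generation is evaluated with the expensive model, the rest with the
    surrogate, so the search cannot lock onto the surrogate's false optima."""
    rng = random.Random(seed)
    pop = [rng.uniform(-10.0, 10.0) for _ in range(pop_size)]
    for gen in range(generations):
        fit = expensive_fitness if gen % control_period == 0 else surrogate_fitness
        parents = sorted(pop, key=fit, reverse=True)[: pop_size // 4]  # truncation selection
        pop = [rng.choice(parents) + rng.gauss(0.0, 0.5) for _ in range(pop_size)]
    return max(pop, key=expensive_fitness)
```

With `control_period=1` every generation pays the full cost; without control (`control_period` larger than `generations`) the search can converge on a false optimum of the surrogate, which is exactly the failure mode the paper reports.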
Genetic Algorithms, Selection Schemes, and the Varying Effects of Noise
 EVOLUTIONARY COMPUTATION
, 1996
Abstract

Cited by 116 (8 self)
This paper analyzes the effect of noise on different selection mechanisms for genetic algorithms. Models for several selection schemes are developed that successfully predict the convergence characteristics of genetic algorithms within noisy environments. The selection schemes modeled in this paper include proportionate selection, tournament selection, (μ, λ) selection, and linear ranking selection. These models are shown to accurately predict the convergence rate of genetic algorithms under a wide range of noise levels.
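The core effect modeled here, noise corrupting selection decisions, can be checked numerically. The sketch below (illustrative; not the paper's models) compares a Monte Carlo estimate of binary-tournament accuracy under Gaussian fitness noise against the closed-form probability:

```python
import math
import random

def noisy_tournament_accuracy(delta, sigma, trials=20000, seed=0):
    """Monte Carlo estimate of the probability that a binary tournament
    picks the truly fitter of two individuals whose true fitness gap is
    `delta`, when each evaluation carries additive N(0, sigma^2) noise."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        observed_better = delta + rng.gauss(0.0, sigma)
        observed_worse = rng.gauss(0.0, sigma)
        wins += observed_better > observed_worse
    return wins / trials

def predicted_accuracy(delta, sigma):
    # The observed gap is N(delta, 2*sigma^2), so
    # P(correct decision) = Phi(delta / (sigma*sqrt(2))) = (1 + erf(delta / (2*sigma))) / 2.
    return 0.5 * (1.0 + math.erf(delta / (2.0 * sigma)))
```

As sigma grows relative to delta the accuracy falls toward 1/2, which is why noise slows convergence for every selection scheme the paper models.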
The Gene Expression Messy Genetic Algorithm
 In Proceedings of the IEEE International Conference on Evolutionary Computation
, 1996
Abstract

Cited by 102 (9 self)
This paper introduces the gene expression messy genetic algorithm (GEMGA), a new generation of messy GAs that directly search for relations among the members of the search space. The GEMGA is an O(Λ^k(ℓ^2 + k)) sample complexity algorithm for the class of order-k delineable problems [6] (problems that can be solved by considering no higher than order-k relations). The GEMGA is designed based on an alternate perspective of natural evolution proposed by the SEARCH framework [6] that emphasizes the role of gene expression. The GEMGA uses the transcription operator to search for relations. This paper also presents the test results of the GEMGA for large multimodal order-k delineable problems.
Bayesian Optimization Algorithm: From Single Level to Hierarchy
, 2002
Abstract

Cited by 101 (19 self)
There are four primary goals of this dissertation. First, design a competent optimization algorithm capable of learning and exploiting appropriate problem decomposition by sampling and evaluating candidate solutions. Second, extend the proposed algorithm to enable the use of hierarchical decomposition as opposed to decomposition on only a single level. Third, design a class of difficult hierarchical problems that can be used to test the algorithms that attempt to exploit hierarchical decomposition. Fourth, test the developed algorithms on the designed class of problems and several real-world applications. The dissertation proposes the Bayesian optimization algorithm (BOA), which uses Bayesian networks to model the promising solutions found so far and sample new candidate solutions. BOA is theoretically and empirically shown to be capable of both learning a proper decomposition of the problem and exploiting the learned decomposition to ensure robust and scalable search for the optimum across a wide range of problems. The dissertation then identifies important features that must be incorporated into the basic BOA to solve problems that are not decomposable on a single level, but that can still be solved by decomposition over multiple levels of difficulty. Hierarchical
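BOA's model-build/sample loop can be illustrated with a univariate stand-in. The sketch below replaces BOA's Bayesian network with independent per-bit marginals (a UMDA-style model) on OneMax; the loop structure (select, fit a model to the selected solutions, sample a new population) is the same, but the real BOA learns dependencies between variables:

```python
import random

def umda_onemax(n_bits=20, pop_size=60, generations=40, seed=3):
    """Univariate stand-in for BOA's model-build/sample loop on OneMax.
    BOA fits a Bayesian network to the selected solutions; here we fit
    only the per-bit marginal frequencies (an assumed simplification)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=sum, reverse=True)            # OneMax fitness = number of ones
        selected = pop[: pop_size // 2]            # truncation selection
        # Model building: marginal frequency of a 1 at each position.
        p = [sum(ind[i] for ind in selected) / len(selected) for i in range(n_bits)]
        # Model sampling: draw a fresh population from the learned model.
        pop = [[1 if rng.random() < p[i] else 0 for i in range(n_bits)]
               for _ in range(pop_size)]
    return max(sum(ind) for ind in pop)
```

On problems with strong interactions between bits this univariate model fails, which is precisely why BOA's network-structured model and its hierarchical extension matter.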
A Genetic Algorithm for Shortest Path Routing Problem and the Sizing of Populations
 IEEE Transactions on Evolutionary Computation
Abstract

Cited by 91 (2 self)
Abstract—This paper presents a genetic algorithmic approach to the shortest path (SP) routing problem. Variable-length chromosomes (strings) and their genes (parameters) have been used for encoding the problem. The crossover operation exchanges partial chromosomes (partial routes) at positionally independent crossing sites and the mutation operation maintains the genetic diversity of the population. The proposed algorithm can cure all the infeasible chromosomes with a simple repair function. Crossover and mutation together provide a search capability that results in improved quality of solution and an enhanced rate of convergence. This paper also develops a population-sizing equation that facilitates a solution with desired quality. It is based on the gambler's ruin model; the equation has been further enhanced and generalized, however. The equation relates the size of the population, the quality of solution, the cardinality of the alphabet, and other parameters of the proposed algorithm. Computer simulations show that the proposed algorithm exhibits a much better quality of solution (route optimality) and a much higher rate of convergence than other algorithms. The results are relatively independent of problem types (network sizes and topologies) for almost all source–destination pairs. Furthermore, the simulation studies emphasize the usefulness of the population-sizing equation, which scales to larger networks and can be used for determining an adequate population size (for a desired quality of solution) in the SP routing problem. Index Terms—Gambler's ruin model, genetic algorithms, population size, shortest path routing problem.
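The crossover-and-repair idea can be sketched directly, assuming routes are node lists from a fixed source to a fixed destination (the helper names are mine, not the paper's):

```python
import random

def remove_loops(path):
    """Repair: when a node reappears, cut the cycle back to its first
    occurrence. Every surviving consecutive pair was adjacent in the
    input, so no invalid edges are introduced."""
    out = []
    for n in path:
        if n in out:
            out = out[: out.index(n) + 1]
        else:
            out.append(n)
    return out

def path_crossover(p1, p2, rng):
    """Exchange partial routes at a crossing site both parents visit
    (positionally independent, as in the paper); copy p1 when the
    parents share no interior node."""
    common = [n for n in p1[1:-1] if n in p2[1:-1]]
    if not common:
        return list(p1)
    site = rng.choice(common)
    child = p1[: p1.index(site)] + p2[p2.index(site):]
    return remove_loops(child)
```

For example, crossing [0, 1, 2, 3, 9] with [0, 4, 2, 5, 9] at the shared node 2 yields [0, 1, 2, 5, 9], a new feasible route mixing a prefix of one parent with a suffix of the other.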
A Summary of Research on Parallel Genetic Algorithms
, 1995
Abstract

Cited by 76 (2 self)
The main goal of this paper is to summarize the previous research on parallel genetic algorithms. We present an extension to previous categorizations of the parallelization techniques used in this field. We will use this categorization to guide us through a review of many of the most important publications. We will build on this survey to try to identify some of the problems that have not been studied systematically yet.
1 Introduction
Genetic Algorithms (GAs) are efficient search methods based on principles of natural selection and population genetics. They are being successfully applied to problems in business, engineering and science (Goldberg, 1994). GAs use randomized operators operating over a population of candidate solutions to generate new points in the search space. In the past few years, parallel genetic algorithms (PGAs) have been used to solve difficult problems. Hard problems need a bigger population and this translates directly into higher computational costs. The basic...
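The most common coarse-grained parallelization such surveys cover, the island model, can be sketched serially (illustrative only; parameter choices are arbitrary, and a real PGA would place each island on its own processor):

```python
import random

def island_ga(n_islands=4, island_size=10, n_bits=16, generations=30,
              migration_interval=5, seed=7):
    """Island-model GA on OneMax: subpopulations evolve independently and
    exchange their best individual around a ring every few generations."""
    rng = random.Random(seed)
    islands = [[[rng.randint(0, 1) for _ in range(n_bits)]
                for _ in range(island_size)] for _ in range(n_islands)]
    for gen in range(generations):
        for isl in islands:
            isl.sort(key=sum, reverse=True)
            parents = isl[: island_size // 2]      # truncation selection
            children = []
            for _ in range(island_size):
                a, b = rng.sample(parents, 2)
                cut = rng.randrange(1, n_bits)
                child = a[:cut] + b[cut:]          # one-point crossover
                i = rng.randrange(n_bits)
                child[i] ^= rng.random() < 1 / n_bits   # light mutation
                children.append(child)
            isl[:] = children
        if gen % migration_interval == 0:
            # Ring migration: each island receives its neighbor's best.
            bests = [max(isl, key=sum) for isl in islands]
            for i, isl in enumerate(islands):
                isl[0] = list(bests[i - 1])
    return max(sum(ind) for isl in islands for ind in isl)
```

Migration rate and topology control how much the islands behave like one large panmictic population versus fully isolated runs, the main design axis in the categorizations the paper reviews.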
Learning Gene Linkage to Efficiently Solve Problems of Bounded Difficulty Using Genetic Algorithms
 UNIVERSITY OF MICHIGAN, ANN ARBOR
, 1997
Abstract

Cited by 75 (4 self)
The complicated nature of modern scientific endeavors oftentimes requires the employment of black-box optimization. For the past twenty years, the simple genetic algorithm (sGA) has proven to be a fertile inspiration for such techniques. Yet, many attempts to improve or adapt the sGA remain disconnected from its prevailing theory. This theory suggests that the sGA works by propagating building blocks (highly fit similarities in the structure of its solutions) and that it can fail by not recombining these building blocks into one optimal solution. The most successful of previous attempts to facilitate building-block recombination have strayed far from the operation of the sGA, resulting in techniques that are difficult to use and implement. This dissertation presents an approach to solving the recombination problem witho...
Evolutionary Algorithms in Noisy Environments: Theoretical Issues and Guidelines for Practice
 Computer Methods in Applied Mechanics and Engineering
, 1998
Abstract

Cited by 71 (6 self)
This paper is devoted to the effects of fitness noise in EAs (evolutionary algorithms). After a short introduction to the history of this research field, the performance of GAs (genetic algorithms) and ESs (evolution strategies) on the hypersphere test function is evaluated. It will be shown that the main effects of noise, namely the decrease of convergence velocity and the residual location error R∞, are observed in both GAs and ESs.
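The central observation, that noise leaves a residual location error rather than merely slowing convergence, can be reproduced with a toy experiment: a (1+1)-ES with fixed step size on the noisy hypersphere (the setup below is my own simplification, not the paper's analysis):

```python
import math
import random

def es_on_noisy_sphere(sigma_noise, steps=3000, step_size=0.1, dim=5, seed=2):
    """(1+1)-ES on f(x) = ||x||^2 with additive N(0, sigma_noise^2)
    fitness noise. With noise, the noisy comparison sometimes accepts
    worse points, so progress stalls at a residual distance from the
    optimum. Returns the true final distance to the optimum."""
    rng = random.Random(seed)
    x = [1.0] * dim

    def noisy_f(v):
        return sum(c * c for c in v) + rng.gauss(0.0, sigma_noise)

    for _ in range(steps):
        y = [c + rng.gauss(0.0, step_size) for c in x]
        if noisy_f(y) < noisy_f(x):    # comparison itself is noisy
            x = y
    return math.sqrt(sum(c * c for c in x))
```

Running with `sigma_noise=0.0` converges close to the optimum, while a large `sigma_noise` leaves the search wandering at a clearly larger residual distance, mirroring the R∞ effect described above.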
Evaluationrelaxation schemes for genetic and evolutionary algorithms
, 2002
Abstract

Cited by 68 (27 self)
Genetic and evolutionary algorithms have been increasingly applied to solve complex, large-scale search problems with mixed success. Competent genetic algorithms have been proposed to solve hard problems quickly, reliably, and accurately. They have rendered problems that were difficult for earlier GAs solvable, requiring only a subquadratic number of function evaluations. To facilitate solving large-scale complex problems, and to further enhance the performance of competent GAs, various efficiency-enhancement techniques have been developed. This study investigates one such class of efficiency-enhancement techniques called evaluation relaxation. Evaluation-relaxation schemes replace a high-cost, low-error fitness function with a low-cost, high-error fitness function. The error in fitness functions comes in two flavors: bias and variance. The presence of bias and variance in fitness functions is considered in isolation, and strategies for increasing efficiency in both cases are developed. Specifically, approaches for choosing between two fitness functions with either differing variance or differing bias values have been developed. This thesis also investigates fitness inheritance as an evaluation
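Fitness inheritance, the evaluation-relaxation scheme named above, can be sketched on OneMax (a hypothetical setup for illustration; the thesis's actual treatment is analytical):

```python
import random

def ga_with_inheritance(p_inherit=0.7, n_bits=20, pop_size=50,
                        generations=40, seed=5):
    """OneMax GA where a fraction `p_inherit` of offspring receive the
    mean of their parents' fitnesses (cheap, but biased and noisy)
    instead of a true evaluation. Returns the best true fitness found
    and the number of true evaluations spent."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    fits = [float(sum(ind)) for ind in pop]
    evals = pop_size
    for _ in range(generations):
        order = sorted(range(pop_size), key=lambda i: fits[i], reverse=True)
        parents = [pop[i] for i in order[: pop_size // 2]]
        pfits = [fits[i] for i in order[: pop_size // 2]]
        new_pop, new_fits = [], []
        for _ in range(pop_size):
            i, j = rng.sample(range(len(parents)), 2)
            cut = rng.randrange(1, n_bits)
            child = parents[i][:cut] + parents[j][cut:]   # one-point crossover
            if rng.random() < p_inherit:
                new_fits.append((pfits[i] + pfits[j]) / 2)  # inherited fitness
            else:
                new_fits.append(float(sum(child)))          # true evaluation
                evals += 1
            new_pop.append(child)
        pop, fits = new_pop, new_fits
    return max(sum(ind) for ind in pop), evals
```

The trade-off studied in the thesis is visible here: a high `p_inherit` cuts the number of true evaluations sharply, at the cost of injecting bias and variance into the selection signal.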