Results 1–10 of 32
Let’s get ready to rumble: Crossover versus mutation head to head
In GECCO '04: Proceedings of the Genetic and Evolutionary Computation Conference, 2004
"... This paper analyzes the relative advantages between crossover and mutation on a class of deterministic and stochastic additively separable problems. This study assumes that the recombination and mutation operators have the knowledge of the building blocks (BBs) and effectively exchange or search amo ..."
Abstract

Cited by 28 (20 self)
This paper analyzes the relative advantages of crossover and mutation on a class of deterministic and stochastic additively separable problems. The study assumes that the recombination and mutation operators have knowledge of the building blocks (BBs) and effectively exchange or search among competing BBs. Facetwise models of convergence time and population sizing have been used to determine the scalability of each algorithm. The analysis shows that for additively separable deterministic problems, BB-wise mutation is more efficient than crossover, while crossover outperforms mutation on additively separable problems perturbed with additive Gaussian noise. The results show that the speedup of using BB-wise mutation on deterministic problems is O(√k log m), where k is the BB size and m is the number of BBs. Likewise, the speedup of using crossover on stochastic problems with fixed noise variance is O(m√k / log m).
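As a rough illustration of how these predicted speedups scale, the two expressions can be evaluated numerically. The specific values of k and m below are hypothetical, chosen only to show the trend; they are not taken from the paper.

```python
import math

def mutation_speedup_deterministic(k, m):
    """Predicted speedup of BB-wise mutation over crossover,
    O(sqrt(k) * log(m)), on deterministic additively separable problems."""
    return math.sqrt(k) * math.log(m)

def crossover_speedup_noisy(k, m):
    """Predicted speedup of crossover over BB-wise mutation,
    O(m * sqrt(k) / log(m)), under fixed-variance additive Gaussian noise."""
    return m * math.sqrt(k) / math.log(m)

# k = building-block size, m = number of building blocks (illustrative values)
for k, m in [(4, 10), (4, 100), (8, 100)]:
    print(k, m,
          round(mutation_speedup_deterministic(k, m), 2),
          round(crossover_speedup_noisy(k, m), 2))
```

Note that the mutation-side advantage grows only logarithmically with the number of BBs, while the crossover-side advantage under noise grows nearly linearly in m, which matches the paper's qualitative conclusion.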
Analyzing probabilistic models in hierarchical BOA on traps and spin glasses
In Genetic and Evolutionary Computation Conference (GECCO-2007), 2007
"... The hierarchical Bayesian optimization algorithm (hBOA) can solve nearly decomposable and hierarchical problems of bounded difficulty in a robust and scalable manner by building and sampling probabilistic models of promising solutions. This paper analyzes probabilistic models in hBOA on two common t ..."
Abstract

Cited by 25 (17 self)
The hierarchical Bayesian optimization algorithm (hBOA) can solve nearly decomposable and hierarchical problems of bounded difficulty in a robust and scalable manner by building and sampling probabilistic models of promising solutions. This paper analyzes probabilistic models in hBOA on two common test problems: concatenated traps and 2D Ising spin glasses with periodic boundary conditions. We argue that although Bayesian networks with local structures can encode complex probability distributions, analyzing these models in hBOA is relatively straightforward and the results of such analyses may provide practitioners with useful information about their problems. The results show that the probabilistic models in hBOA closely correspond to the structure of the underlying problem, the models do not change significantly in subsequent iterations of BOA, and creating adequate probabilistic models by hand is not straightforward even with complete knowledge of the optimization problem.
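The concatenated trap test problem mentioned in the abstract can be sketched concretely. The trap definition below is the standard fully deceptive trap from the GA literature; the trap size k = 5 is an illustrative choice, not a value from this particular paper.

```python
def trap(u, k):
    """Fully deceptive trap on the number of ones u in a k-bit block:
    global optimum at u == k, deceptive slope pulling toward u == 0."""
    return k if u == k else k - 1 - u

def concatenated_traps(x, k=5):
    """Sum of independent traps over consecutive k-bit blocks of x."""
    assert len(x) % k == 0
    return sum(trap(sum(x[i:i + k]), k) for i in range(0, len(x), k))

print(concatenated_traps([1] * 10))   # two blocks at the optimum -> 10
print(concatenated_traps([0] * 10))   # deceptive attractor -> 8
```

The deceptive slope is what makes the problem hard for operators that ignore linkage: flipping single bits away from all-zeros lowers fitness, so the correct block structure must be identified and handled as a unit.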
Genetic Algorithms
2005
"... Genetic algorithms (GAs) are search methods based on principles of natural selection and genetics (Fraser, 1957; Bremermann, 1958; Holland, 1975). We start with a brief introduction to simple genetic algorithms and associated terminology. GAs encode ..."
Abstract

Cited by 21 (3 self)
Genetic algorithms (GAs) are search methods based on principles of natural selection and genetics (Fraser, 1957; Bremermann, 1958; Holland, 1975). We start with a brief introduction to simple genetic algorithms and associated terminology. GAs encode
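The simple GA described in this introduction can be sketched in a few lines. The problem (OneMax), population size, tournament size, and rates below are illustrative choices, not values from the chapter.

```python
import random

random.seed(0)

def fitness(ind):
    """OneMax: maximize the number of 1s in the bit string."""
    return sum(ind)

def tournament(pop, size=2):
    """Tournament selection: best of a random sample."""
    return max(random.sample(pop, size), key=fitness)

def crossover(a, b):
    """One-point crossover producing a single child."""
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(ind, rate=0.01):
    """Bit-flip mutation applied independently to each position."""
    return [bit ^ (random.random() < rate) for bit in ind]

n_bits, pop_size = 40, 50
pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
for gen in range(60):
    pop = [mutate(crossover(tournament(pop), tournament(pop)))
           for _ in range(pop_size)]
best = max(pop, key=fitness)
print(fitness(best))
```

This is the minimal selection-crossover-mutation loop; everything that follows in the literature above (linkage learning, model building, competent operators) replaces or augments pieces of this loop.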
Substructural Neighborhoods for Local Search in the Bayesian Optimization Algorithm
2006
"... This paper studies the utility of using substructural neighborhoods for local search in the Bayesian optimization algorithm (BOA). The probabilistic model of BOA, which automatically identifies important problem substructures, is used to define the structure of the neighborhoods used in local search ..."
Abstract

Cited by 20 (14 self)
This paper studies the utility of using substructural neighborhoods for local search in the Bayesian optimization algorithm (BOA). The probabilistic model of BOA, which automatically identifies important problem substructures, is used to define the structure of the neighborhoods used in local search. Additionally, a surrogate fitness model is considered to evaluate the improvement of the local search steps. The results show that performing substructural local search in BOA significantly reduces the number of generations necessary to converge to optimal solutions and thus provides substantial speedups.
An introduction and survey of estimation of distribution algorithms
Swarm and Evolutionary Computation, 2011
"... ..."
Combining competent crossover and mutation operators: A probabilistic model building approach
2005
"... This paper presents an approach to combine competent crossover and mutation operators via probabilistic model building. Both operators are based on the probabilistic model building procedure of the extended compact genetic algorithm (eCGA). The model sampling procedure of eCGA, which mimics the beha ..."
Abstract

Cited by 16 (8 self)
This paper presents an approach to combining competent crossover and mutation operators via probabilistic model building. Both operators are based on the probabilistic model building procedure of the extended compact genetic algorithm (eCGA). The model sampling procedure of eCGA, which mimics the behavior of an idealized recombination in which the building blocks (BBs) are exchanged without disruption, is used as the competent crossover operator. A recently proposed BB-wise mutation operator, which uses the BB partition information to perform local search in the BB space, is used as the competent mutation operator. The resulting algorithm, called the hybrid extended compact genetic algorithm (heCGA), makes use of the problem decomposition information for (1) effective recombination of BBs and (2) effective local search in the BB neighborhood. The proposed approach is tested on problems that combine three well-known dimensions of problem difficulty: deception, scaling, and noise. The results show that, in the absence of domain knowledge, the hybrid approach is more robust than either single-operator-based approach.
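The BB-wise mutation operator described above can be sketched as greedy local search in building-block space. The sketch assumes the BB partition is already known (in eCGA it comes from the learned marginal product model); the toy fitness table and block layout are illustrative, not from the paper.

```python
from itertools import product

def bbwise_mutation(x, blocks, fitness):
    """Greedily set each building block to its best configuration,
    one block at a time (local search in BB space)."""
    x = list(x)
    for b in blocks:
        best_conf, best_fit = None, None
        for conf in product([0, 1], repeat=len(b)):
            for i, v in zip(b, conf):
                x[i] = v
            f = fitness(x)
            if best_fit is None or f > best_fit:
                best_conf, best_fit = conf, f
        for i, v in zip(b, best_conf):   # keep the best configuration found
            x[i] = v
    return x

# Toy additively separable fitness: three 2-bit blocks, best at (1, 1).
blocks = [(0, 1), (2, 3), (4, 5)]
table = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 3}

def fit(x):
    return sum(table[(x[i], x[j])] for i, j in blocks)

print(bbwise_mutation([0, 0, 1, 0, 0, 1], blocks, fit))  # -> [1, 1, 1, 1, 1, 1]
```

Because the blocks are non-overlapping, one greedy pass per block suffices to reach the optimum of each subfunction, which is what gives the operator its efficiency on separable problems.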
Evaluation relaxation using substructural information and linear estimation
In Keijzer, M., et al. (Eds.), Proceedings of the ACM SIGEVO Genetic and Evolutionary Computation Conference (GECCO-2006), 2006
"... The paper presents an evaluationrelaxation scheme where a fitness surrogate automatically adapts to the problem structure and the partial contributions of subsolutions to the fitness of an individual are estimated efficiently and accurately. In particular, the probabilistic model built by extended ..."
Abstract

Cited by 11 (6 self)
The paper presents an evaluation-relaxation scheme in which a fitness surrogate automatically adapts to the problem structure and the partial contributions of subsolutions to the fitness of an individual are estimated efficiently and accurately. In particular, the probabilistic model built by the extended compact genetic algorithm is used to infer the structural form of the surrogate, and a least-squares method is used to estimate its coefficients. Using the surrogate avoids expensive fitness evaluations for some of the solutions and thereby yields significant efficiency enhancement. Results show that a surrogate which automatically adapts to problem knowledge mined from probabilistic models yields substantial speedups (1.75–3.1) on a class of boundedly difficult additively decomposable problems with and without additive Gaussian noise. The speedup provided by the surrogate increases with the number of substructures, substructure complexity, and noise-to-signal ratio.
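The core idea, a least-squares surrogate whose structure follows the model's variable partition, can be sketched as follows. The partition, population size, and subfunction table below are illustrative assumptions; in the paper the partition would come from the eCGA model rather than being given.

```python
import numpy as np

rng = np.random.default_rng(1)

blocks = [(0, 1), (2, 3), (4, 5)]   # assumed partition (from the model)
# True subfunction, shared by all blocks (illustrative, exactly additive).
g = {(0, 0): 0.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 3.0}

def true_fitness(x):
    return sum(g[tuple(x[i] for i in b)] for b in blocks)

def features(x):
    """Indicator features: one column per (block, subsolution schema)."""
    cols = []
    for b in blocks:
        key = tuple(x[i] for i in b)
        cols.extend(1.0 if key == s else 0.0 for s in sorted(g))
    return cols

# Fit one coefficient per schema by least squares over a random population.
pop = rng.integers(0, 2, size=(60, 6))
X = np.array([features(x) for x in pop])
y = np.array([true_fitness(x) for x in pop])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Predict fitness of unseen individuals without calling true_fitness.
test = rng.integers(0, 2, size=(10, 6))
pred = np.array([features(x) for x in test]) @ coef
err = np.max(np.abs(pred - [true_fitness(x) for x in test]))
print(float(err))
```

Since the toy problem is exactly additive over the assumed partition, the surrogate reproduces the true fitness; on real problems the fit is approximate and the savings come from replacing some expensive evaluations with this cheap prediction.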
Influence of selection and replacement strategies on linkage learning in BOA
In Proceedings of the 2007 IEEE Congress on Evolutionary Computation (CEC 2007), 2007
"... ..."
(Show Context)
Efficiency enhancement of probabilistic model building algorithms
In Proceedings of the Optimization by Building and Using Probabilistic Models Workshop at the Genetic and Evolutionary Computation Conference, 2004
"... Abstract. This paper presents two different efficiencyenhancement techniques for probabilistic model building genetic algorithms. The first technique proposes the use of a mutation operator which performs local search in the subsolution neighborhood identified through the probabilistic model. The ..."
Abstract

Cited by 9 (6 self)
This paper presents two different efficiency-enhancement techniques for probabilistic model building genetic algorithms. The first technique proposes a mutation operator that performs local search in the subsolution neighborhood identified through the probabilistic model. The second technique proposes building and using an internal probabilistic model of the fitness along with the probabilistic model of variable interactions. The fitness values of some offspring are estimated using the probabilistic model, thereby avoiding computationally expensive function evaluations. The scalability of the aforementioned techniques is analyzed using facetwise models for convergence time and population sizing. The speedup obtained by each of the methods is predicted and verified with empirical results. The results show that for additively separable problems the competent mutation operator requires O(√k log m) fewer function evaluations than its selectorecombinative counterpart, where k is the building-block size and m is the number of building blocks. The results also show that the use of an internal probabilistic fitness model reduces the required number of function evaluations to as low as 1–10% and yields a speedup of 2–50.
A Matrix Approach for Finding Extrema: Problems with Modularity, Hierarchy, and Overlap
2006
"... Unlike most simple textbook examples, the real world is full with complex systems, and researchers in many different fields are often confronted by problems arising from such systems. Simple heuristics or even enumeration works quite well on small and easy problems; however, to efficiently solve lar ..."
Abstract

Cited by 9 (0 self)
Unlike most simple textbook examples, the real world is full of complex systems, and researchers in many different fields are often confronted by problems arising from such systems. Simple heuristics or even enumeration work quite well on small, easy problems; to efficiently solve large and difficult problems, however, proper decomposition of the complex system is key. In this research project, investigating and analyzing interactions between components of complex systems sheds some light on problem decomposition. By recognizing three bare-bones types of interaction (modularity, hierarchy, and overlap), theories and models are developed to dissect and inspect problem decomposition in the context of genetic algorithms. This dissertation presents a research project to develop a competent optimization method to solve boundedly difficult problems with modularity, hierarchy, and overlap by explicit problem decomposition. The proposed genetic algorithm design utilizes a matrix representation of an interaction graph to analyze and decompose the problem. The results from this thesis should benefit research both technically and scientifically. Technically, this thesis develops an automated dependency structure matrix clustering technique and utilizes it to design a competent black-box problem solver. Scientifically, the explicit interaction