Results 1–10 of 28
Designing competent mutation operators via probabilistic model building of neighborhoods
 In Deb, K., et al. (Eds.), Proceedings of the Genetic and Evolutionary Computation Conference (GECCO-2004), Part II, LNCS 3103
, 2004
Abstract

Cited by 32 (20 self)
This paper presents a competent selectomutative genetic algorithm (GA) that adapts linkage and solves hard problems quickly, reliably, and accurately. A probabilistic model building process is used to automatically identify key building blocks (BBs) of the search problem. The mutation operator uses the probabilistic model of linkage groups to find the best among competing building blocks. The competent selectomutative GA successfully solves additively separable problems of bounded difficulty, requiring only a subquadratic number of function evaluations. The results show that for additively separable problems the probabilistic model building BB-wise mutation scales as O(2^k m^1.5), and requires O(√k log m) fewer function evaluations than its selectorecombinative counterpart, confirming theoretical results reported elsewhere (Sastry & Goldberg, 2004).
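The BB-wise mutation described in this abstract can be sketched in a few lines: given an already-identified linkage partition, enumerate all 2^k settings of each building block and keep the best, which matches the O(2^k m) evaluations per pass mentioned above. This is a minimal illustration, not the paper's implementation; the trap function and linkage groups below are assumptions for the example.

```python
import itertools

def bbwise_mutation(x, fitness, linkage_groups):
    """Greedy BB-wise mutation: for each linkage group (building block),
    enumerate all 2^k settings of its bits and keep the best one.
    `x` is a list of 0/1 genes, `linkage_groups` a list of index tuples."""
    x = list(x)
    for group in linkage_groups:
        best_fit, best_setting = None, None
        for setting in itertools.product((0, 1), repeat=len(group)):
            for idx, bit in zip(group, setting):
                x[idx] = bit
            f = fitness(x)
            if best_fit is None or f > best_fit:
                best_fit, best_setting = f, setting
        for idx, bit in zip(group, best_setting):
            x[idx] = bit
    return x

def trap3(x):
    """Additively separable deceptive 3-bit trap: 3 for all-ones, else 2 - u."""
    total = 0
    for i in range(0, len(x), 3):
        u = sum(x[i:i + 3])
        total += 3 if u == 3 else 2 - u
    return total

groups = [(0, 1, 2), (3, 4, 5)]
print(bbwise_mutation([0, 1, 0, 1, 1, 0], trap3, groups))  # → [1, 1, 1, 1, 1, 1]
```

Because the operator searches over whole blocks rather than single bits, it escapes the deceptive basin that traps bit-flip hill-climbers on this function.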
Genetic Algorithms
, 2005
Abstract

Cited by 21 (3 self)
Genetic algorithms (GAs) are search methods based on principles of natural selection and genetics (Fraser, 1957; Bremermann, 1958; Holland, 1975). We start with a brief introduction to simple genetic algorithms and associated terminology. GAs encode ...
Using previous models to bias structural learning in the hierarchical BOA
, 2008
Abstract

Cited by 20 (11 self)
Estimation of distribution algorithms (EDAs) are stochastic optimization techniques that explore the space of potential solutions by building and sampling explicit probabilistic models of promising candidate solutions. While the primary goal of applying EDAs is to discover the global optimum or at least its accurate approximation, any EDA also provides us with a sequence of probabilistic models, which in most cases hold a great deal of information about the problem. Although using problem-specific knowledge has been shown to significantly improve the performance of EDAs and other evolutionary algorithms, this readily available source of problem-specific information has been practically ignored by the EDA community. This paper takes the first step towards the use of probabilistic models obtained by EDAs to speed up the solution of similar problems in the future. More specifically, we propose two approaches to biasing model building in the hierarchical Bayesian optimization algorithm (hBOA) based on knowledge automatically learned from previous hBOA runs on similar problems. We show that the proposed methods lead to substantial speedups and argue that the methods should work well in other applications that require solving a large number of problems with similar structure.
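One simple way to bias structure learning toward models seen in previous runs, in the spirit of this abstract, is to turn edge frequencies from earlier learned networks into a log-odds prior added to the scoring gain of each candidate edge. This is a hypothetical sketch, not the paper's actual biasing scheme; `biased_gain` and its `kappa` strength parameter are assumptions for illustration.

```python
import math
from collections import Counter

def edge_prior(previous_models):
    """Estimate, with add-one smoothing, how often each directed edge (i, j)
    appeared in network structures learned in previous runs. Returns a
    probability dict plus a default probability for unseen edges."""
    counts = Counter(e for model in previous_models for e in model)
    n = len(previous_models)
    return {e: (c + 1) / (n + 2) for e, c in counts.items()}, 1 / (n + 2)

def biased_gain(base_gain, edge, prior, default_p, kappa=1.0):
    """Add a log-odds prior bonus to a structure-learning score gain, so
    edges frequent in earlier runs are favored (hypothetical scoring)."""
    p = prior.get(edge, default_p)
    return base_gain + kappa * math.log(p / (1 - p))

# Two previous runs both learned the edge (0, 1); edge (2, 3) was never seen.
prior, default_p = edge_prior([{(0, 1)}, {(0, 1), (1, 2)}])
print(biased_gain(0.0, (0, 1), prior, default_p))  # bonus of log(3)
print(biased_gain(0.0, (2, 3), prior, default_p))  # penalty of log(1/3)
```

The smoothing keeps the bias soft: an edge absent from all previous runs is penalized but never forbidden, so new problem structure can still be learned.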
Substructural Neighborhoods for Local Search in the Bayesian Optimization Algorithm
, 2006
Abstract

Cited by 20 (14 self)
This paper studies the utility of using substructural neighborhoods for local search in the Bayesian optimization algorithm (BOA). The probabilistic model of BOA, which automatically identifies important problem substructures, is used to define the structure of the neighborhoods used in local search. Additionally, a surrogate fitness model is considered to evaluate the improvement of the local search steps. The results show that performing substructural local search in BOA significantly reduces the number of generations necessary to converge to optimal solutions and thus provides substantial speedups.
Towards billion-bit optimization via a parallel estimation of distribution algorithm
 Genetic and Evolutionary Computation Conference (GECCO-2007)
, 2007
Abstract

Cited by 18 (9 self)
This paper presents a highly efficient, fully parallelized implementation of the compact genetic algorithm (cGA) to solve very large-scale problems with millions to billions of variables. The paper presents principled results demonstrating the scalable solution of a difficult test function on instances of over a billion variables using a parallel implementation of the cGA. The problem addressed is a noisy, blind problem over a vector of binary decision variables. Noise is added equaling up to a tenth of the deterministic objective function variance of the problem, thereby making it difficult for simple hill-climbers to find the optimal solution. The compact GA, on the other hand, is able to find the optimum in the presence of noise quickly, reliably, and accurately, and the solution scalability follows known convergence theories. These results on a noisy problem, together with other results on problems involving varying modularity, hierarchy, and overlap, foreshadow the routine solution of billion-variable problems across the landscape of search problems.
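The compact GA at the core of this paper replaces the population with a single probability vector: two individuals are sampled, compete, and the winner pulls the vector toward itself by 1/N per differing bit. A minimal serial sketch on OneMax (the paper's parallel, billion-variable implementation is far more involved):

```python
import random

def compact_ga(fitness, n_bits, pop_size=50, max_iters=20000, rng=random):
    """Minimal compact GA: a probability vector p models the population.
    Each step samples two individuals; the winner shifts p by 1/pop_size
    on every bit where the two samples disagree."""
    p = [0.5] * n_bits
    for _ in range(max_iters):
        a = [1 if rng.random() < pi else 0 for pi in p]
        b = [1 if rng.random() < pi else 0 for pi in p]
        winner, loser = (a, b) if fitness(a) >= fitness(b) else (b, a)
        for i in range(n_bits):
            if winner[i] != loser[i]:
                step = 1.0 / pop_size
                p[i] += step if winner[i] == 1 else -step
                p[i] = min(1.0, max(0.0, p[i]))
        if all(pi in (0.0, 1.0) for pi in p):  # fully converged
            break
    return [round(pi) for pi in p]

random.seed(1)
result = compact_ga(sum, n_bits=20)  # OneMax: fitness = number of ones
print(result)
```

Memory is O(n) regardless of the virtual population size, which is exactly why the approach scales to billions of variables when the probability vector is distributed across processors.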
An introduction and survey of estimation of distribution algorithms
 SWARM AND EVOLUTIONARY COMPUTATION
, 2011
Combining competent crossover and mutation operators: A probabilistic model building approach
, 2005
Abstract

Cited by 16 (8 self)
This paper presents an approach to combine competent crossover and mutation operators via probabilistic model building. Both operators are based on the probabilistic model building procedure of the extended compact genetic algorithm (eCGA). The model sampling procedure of eCGA, which mimics the behavior of an idealized recombination where the building blocks (BBs) are exchanged without disruption, is used as the competent crossover operator. On the other hand, a recently proposed BB-wise mutation operator, which uses the BB partition information to perform local search in the BB space, is used as the competent mutation operator. The resulting algorithm, called the hybrid extended compact genetic algorithm (heCGA), makes use of the problem decomposition information for (1) effective recombination of BBs and (2) effective local search in the BB neighborhood. The proposed approach is tested on different problems that combine the core of three well-known problem difficulty dimensions: deception, scaling, and noise. The results show that, in the absence of domain knowledge, the hybrid approach is more robust than either single-operator-based approach.
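The "idealized recombination" this abstract refers to can be sketched directly: once the linkage partition is known, an offspring takes every building block intact from a randomly chosen parent, so blocks are mixed across the population but never cut. A toy sketch, assuming the partition has already been learned:

```python
import random

def sample_bbwise(population, linkage_groups, rng=random):
    """eCGA-style model sampling viewed as idealized crossover: the child
    inherits each building block whole from a randomly chosen parent, so
    BBs are exchanged without disruption."""
    child = [0] * len(population[0])
    for group in linkage_groups:
        donor = rng.choice(population)
        for idx in group:
            child[idx] = donor[idx]
    return child

pop = [[0, 0, 0, 1, 1, 1], [1, 1, 1, 0, 0, 0]]
child = sample_bbwise(pop, [(0, 1, 2), (3, 4, 5)])
# Each 3-bit block comes whole from one parent, so it is all-0s or all-1s.
print(child)
```

Pairing this sampler with a BB-wise mutation that searches within each block gives the global-search/local-search combination the heCGA exploits.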
Efficiency enhancement of probabilistic model building algorithms
 In Proceedings of the Optimization by Building and Using Probabilistic Models Workshop at the Genetic and Evolutionary Computation Conference
, 2004
Abstract

Cited by 9 (6 self)
This paper presents two different efficiency-enhancement techniques for probabilistic model building genetic algorithms. The first technique proposes the use of a mutation operator which performs local search in the sub-solution neighborhood identified through the probabilistic model. The second technique proposes building and using an internal probabilistic model of the fitness along with the probabilistic model of variable interactions. The fitness values of some offspring are estimated using the probabilistic model, thereby avoiding computationally expensive function evaluations. The scalability of the aforementioned techniques is analyzed using facetwise models for convergence time and population sizing. The speedup obtained by each of the methods is predicted and verified with empirical results. The results show that for additively separable problems the competent mutation operator requires O(√k log m) fewer function evaluations, where k is the building-block size and m is the number of building blocks, than its selectorecombinative counterpart. The results also show that the use of an internal probabilistic fitness model reduces the required number of function evaluations to as low as 1–10% and yields a speedup of 2–50.
Overcoming hierarchical difficulty by hill-climbing the building block structure
 Genetic and Evolutionary Computation Conference (GECCO-2007), 2007, 1256–1263
Abstract

Cited by 6 (0 self)
The Building Block Hypothesis suggests that Genetic Algorithms (GAs) are well-suited for hierarchical problems, where efficient solving requires proper problem decomposition and the assembly of solutions from sub-solutions with strong nonlinear interdependencies. The paper proposes a hill-climber operating over the building block (BB) space that can efficiently address hierarchical problems. The new Building Block Hill-Climber (BBHC) uses its hill-climbing search experience to learn the problem structure. The neighborhood structure is adapted whenever new knowledge about the underlying BB structure is incorporated into the search. This allows the method to climb the hierarchical structure by revealing and solving the hierarchical levels consecutively. It is expected that for fully non-deceptive hierarchical BB structures the BBHC can solve hierarchical problems in linearithmic time. Empirical results confirm that the proposed method scales almost linearly with the problem size and thus clearly outperforms population-based recombinative methods.
Fluctuating crosstalk as a source of deterministic noise and its effects on GA scalability
 APPLICATIONS OF EVOLUTIONARY COMPUTING, EVOWORKSHOPS 2006: EVOBIO, EVOCOMNET, EVOHOT, EVOIASP, EVOINTERACTION, EVOMUSART, EVOSTOC
, 2006
Abstract

Cited by 6 (5 self)
This paper explores how fluctuating crosstalk in a deterministic fitness function introduces noise into genetic algorithms. We model fluctuating crosstalk, or nonlinear interactions among building blocks, via higher-order Walsh coefficients. The fluctuating crosstalk behaves like exogenous noise and can be handled by increasing the population size and run duration. This behavior holds until the strength of the crosstalk far exceeds the underlying fitness variance by an empirically observed factor. Our results also have implications for the relative performance of building-block-wise mutation over crossover.
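The Walsh-coefficient modeling in this abstract is easy to make concrete: a Walsh basis function psi_j(x) = (-1)^|j AND x| over bitstrings, with one higher-order term added to a separable base fitness so that interactions among the selected bits perturb otherwise equal solutions. A toy sketch; the paper's test functions and coefficient magnitudes are not reproduced here:

```python
def walsh(j, x):
    """Walsh basis function psi_j(x) = (-1)^(number of ones in j & x),
    with bitstrings represented as integers."""
    return -1 if bin(j & x).count("1") % 2 else 1

def crosstalk_fitness(x, w_cross=0.5, j_cross=0b111):
    """Deterministic fitness: separable OneMax base plus one higher-order
    Walsh term over the bits in j_cross, modeling fluctuating crosstalk."""
    return bin(x).count("1") + w_cross * walsh(j_cross, x)

print(crosstalk_fitness(0b111))  # 3 + 0.5 * (-1)^3 = 2.5
print(crosstalk_fitness(0b110))  # 2 + 0.5 * (+1)   = 2.5
```

Here the crosstalk term makes a two-ones string tie with the three-ones string, illustrating how a deterministic higher-order interaction reorders solutions the way exogenous noise would.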