Results 1–10 of 16
On the convergence of a class of estimation of distribution algorithms
 IEEE Trans. Evol. Comput.
Abstract

Cited by 23 (6 self)
Abstract—We investigate the global convergence of estimation of distribution algorithms (EDAs). In EDAs, the distribution is estimated from a set of selected elements, i.e., the parent set, and then the estimated distribution model is used to generate new elements. In this paper, we prove that: 1) if the distribution of the new elements matches that of the parent set exactly, the algorithms will converge to the global optimum under three widely used selection schemes and 2) a factorized distribution algorithm converges globally under proportional selection. Index Terms—Convergence, estimation of distribution algorithms (EDAs), factorized distribution algorithms (FDA). I.
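The estimate-then-sample loop this abstract analyzes can be made concrete with a minimal univariate EDA (UMDA-style) on the OneMax problem. This is an illustrative sketch, not the paper's algorithm: the clamping of the marginals, truncation selection, and all parameter values are our assumptions.

```python
import random

def umda_onemax(n_bits=20, pop_size=100, n_parents=50, n_gens=60, seed=0):
    """Minimal univariate EDA (UMDA) on OneMax: estimate per-bit
    marginals from the selected parent set, then use the estimated
    distribution to generate new elements."""
    rng = random.Random(seed)
    probs = [0.5] * n_bits                        # start from the uniform distribution
    best = None
    for _ in range(n_gens):
        pop = [[1 if rng.random() < p else 0 for p in probs]
               for _ in range(pop_size)]
        pop.sort(key=sum, reverse=True)           # fitness = number of ones
        parents = pop[:n_parents]                 # truncation selection (parent set)
        # Estimate the distribution of the parent set (per-bit frequencies),
        # lightly clamped away from 0/1 so sampling never collapses entirely.
        probs = [min(0.95, max(0.05, sum(ind[i] for ind in parents) / n_parents))
                 for i in range(n_bits)]
        if best is None or sum(pop[0]) > sum(best):
            best = pop[0]
    return best
```

Under these settings the per-bit marginals are driven toward their upper clamp and the all-ones optimum is sampled within a few dozen generations, which is the behavior whose guarantees the paper studies formally.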
The Correlation-Triggered Adaptive Variance Scaling IDEA
 IN PROCEEDINGS OF THE 8TH CONFERENCE ON GENETIC AND EVOLUTIONARY COMPUTATION
, 2006
Abstract

Cited by 11 (1 self)
It has previously been shown analytically and experimentally that continuous Estimation of Distribution Algorithms (EDAs) based on the normal pdf can easily suffer from premature convergence. This paper takes a principled first step towards solving this problem. First, prerequisites for the successful use of search distributions in EDAs are presented. Then, an adaptive variance scaling scheme is introduced that aims at reducing the risk of premature convergence. Integrating the scheme into the iterated density-estimation evolutionary algorithm (IDEA) yields the correlation-triggered adaptive variance scaling IDEA (CT-AVS-IDEA). The CT-AVS-IDEA is compared to the original IDEA and the Evolution Strategy with Covariance Matrix Adaptation (CMA-ES) on a wide range of unimodal test problems by means of a scalability analysis. It is found that the average number of fitness evaluations grows subquadratically with the dimensionality, competitively with the CMA-ES. In addition, CT-AVS-IDEA is indeed found to enlarge the class of problems that continuous EDAs can solve reliably.
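The core idea of adaptive variance scaling can be illustrated with a toy 1-D Gaussian EDA. This is a simplified sketch, not the paper's CT-AVS-IDEA: the correlation trigger is omitted, and the scaling rule (enlarge the maximum-likelihood variance on improvement, shrink it on stagnation) and all constants are our assumptions.

```python
import random
import statistics

def avs_eda_sphere(n_gens=80, pop_size=60, n_parents=15, seed=1):
    """Toy 1-D Gaussian EDA with adaptive variance scaling: the
    maximum-likelihood variance of the parent set is multiplied by a
    factor c that grows while the best fitness improves and shrinks
    otherwise, counteracting premature variance collapse."""
    rng = random.Random(seed)
    mu, sigma, c = 5.0, 2.0, 1.0
    best = float("inf")
    for _ in range(n_gens):
        # Sample from the scaled model N(mu, c * sigma^2).
        pop = [rng.gauss(mu, sigma * c ** 0.5) for _ in range(pop_size)]
        pop.sort(key=lambda x: x * x)             # minimize f(x) = x^2
        parents = pop[:n_parents]                 # truncation selection
        mu = statistics.mean(parents)             # ML estimate of the mean
        sigma = max(statistics.pstdev(parents), 1e-12)
        if parents[0] ** 2 < best:                # improvement: enlarge variance
            best = parents[0] ** 2
            c = min(c * 1.1, 10.0)
        else:                                     # stagnation: shrink variance
            c = max(c * 0.9, 1.0)
    return best
```

Because c never drops below 1, the sampled variance cannot shrink faster than the ML estimate, which is the effect the paper exploits to keep the search moving along a slope instead of stalling.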
An introduction and survey of estimation of distribution algorithms
 SWARM AND EVOLUTIONARY COMPUTATION
, 2011
A Survey of Linkage Learning Techniques in Genetic and Evolutionary Algorithms
, 2007
Abstract

Cited by 5 (0 self)
This paper reviews and summarizes existing linkage learning techniques for genetic and evolutionary algorithms in the literature. It first introduces the definition of linkage in both biological systems and genetic algorithms. Then, it discusses why it is important for genetic and evolutionary algorithms to be capable of learning linkage, i.e., the relationships between decision variables. Existing linkage learning methods proposed in the literature are reviewed according to different facets of genetic and evolutionary algorithms, including the means to distinguish between good linkage and bad linkage, the methods to express or represent linkage, and the ways to store linkage information. Studies related to these linkage learning methods and techniques are also investigated in this survey.
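One classic family of techniques in this area is perturbation-based linkage detection: a pair of variables is treated as linked when their joint effect on fitness is non-additive. A minimal sketch for binary strings follows; the function name, signature, and tolerance are ours, not the survey's.

```python
def detect_linkage(f, x, i, j, eps=1e-9):
    """Perturbation-based linkage check: flip bits i and j of x
    separately and together; if the joint fitness change differs from
    the sum of the individual changes, the pair is epistatically
    linked (non-additive)."""
    def flip(x, *idx):
        y = list(x)
        for k in idx:
            y[k] = 1 - y[k]
        return y
    di = f(flip(x, i)) - f(x)        # effect of flipping i alone
    dj = f(flip(x, j)) - f(x)        # effect of flipping j alone
    dij = f(flip(x, i, j)) - f(x)    # effect of flipping both together
    return abs(dij - (di + dj)) > eps
```

For a purely additive fitness function such as OneMax the check reports no linkage at any point, while a multiplicative interaction between two bits is flagged immediately.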
Permutation Optimization by Iterated Estimation of Random Keys Marginal Product Factorizations
 Parallel Problem Solving from Nature - PPSN VII
, 2002
Abstract

Cited by 3 (1 self)
In IDEAs, the probability distribution of a selection of solutions is estimated each generation. From this probability distribution, new solutions are drawn. Through the probability distribution, various relations between problem variables can be exploited to achieve efficient optimization. For permutation optimization, only real-valued probability distributions have been applied to a real-valued encoding of permutations.
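The real-valued encoding of permutations referred to in the title is the random keys encoding: any vector of reals decodes to a valid permutation by sorting. A minimal sketch of the decoder:

```python
def random_keys_to_permutation(keys):
    """Decode a real-valued 'random keys' vector into a permutation:
    the output lists the variable indices in increasing order of their
    key values, so every real vector maps to a valid permutation."""
    return [i for i, _ in sorted(enumerate(keys), key=lambda t: t[1])]
```

For example, the key vector [0.7, 0.1, 0.4] decodes to the permutation [1, 2, 0], since index 1 holds the smallest key and index 0 the largest. This is what lets a continuous probability distribution over key vectors induce a distribution over permutations.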
Evolutionary Continuous Optimization by Distribution Estimation with Variational Bayesian Independent Component Analyzers Mixture Model
 In Proceedings of Parallel Problem Solving from Nature VIII
, 2004
Abstract

Cited by 3 (0 self)
In evolutionary continuous optimization by building and using probabilistic models, the multivariate Gaussian distribution and its variants or extensions, such as the mixture of Gaussians, have been widely used. However, this Gaussian assumption is often violated in many real problems. In this paper, we propose a new continuous estimation of distribution algorithm (EDA) with the variational Bayesian independent component analyzers mixture model (vbICA-MM), allowing any distribution to be modeled. We examine how this sophisticated density estimation technique influences the performance of the optimization by employing the same selection and population alternation schemes used in previous EDAs. Our experimental results support that the presented EDAs achieve better performance than previous EDAs with ICA and Gaussian mixture- or kernel-based approaches.
Exploiting Gradient Information in Continuous Iterated Density Estimation Evolutionary Algorithms
 Proceedings of the 13th Belgium-Netherlands Artificial Intelligence Conference (BNAIC'01)
, 2001
Abstract

Cited by 3 (2 self)
For continuous optimization problems, evolutionary algorithms (EAs) that build and use probabilistic models have obtained promising results.
Learning structure illuminates black boxes - an introduction into Estimation of Distribution Algorithms
, 2006
Abstract

Cited by 1 (0 self)
This chapter serves as an introduction to estimation of distribution algorithms. Estimation of distribution algorithms are a new paradigm in evolutionary computation. State-of-the-art EDAs consistently outperform classical genetic algorithms on a broad range of problems. We review the fundamental principles and algorithms that are necessary to understand EDA research. We focus on EDAs for the discrete and the continuous problem domains and discuss the differences between the two.
NichingEDA: Utilizing the Diversity Inside A Population of EDAs For Continuous Optimization
, 2008
Abstract

Cited by 1 (1 self)
Since Estimation of Distribution Algorithms (EDAs) were introduced, several single-model-based EDAs and mixture-model-based EDAs have been developed. Taking Gaussian models as an example, EDAs based on a single Gaussian distribution perform well on simple unimodal functions and on multimodal functions whose landscape has an obvious trend towards the global optimum, but they have difficulties with multimodal functions with irregular landscapes, such as wide basins, flat plateaus and deep valleys. Gaussian mixture model based EDAs have been developed to remedy this disadvantage of single-Gaussian-based EDAs. A general framework, NichingEDA, is presented in this paper from a new perspective to boost the performance of single-model-based EDAs. By adopting a niching method and recombination operators in a population of EDAs, NichingEDA significantly boosts traditional single-model-based EDAs' performance on hard problems by making use of the diversity inside the EDA population, without estimating a precise distribution. Our experimental studies have shown that NichingEDA is very effective for some hard global optimization problems, although its scalability to high-dimensional functions needs improving. Analyses and discussions are presented to explain why NichingEDA performed well or poorly on certain benchmark functions.
Scaling Up Estimation of Distribution Algorithms for Continuous Optimization
Abstract

Cited by 1 (0 self)
Since Estimation of Distribution Algorithms (EDAs) were proposed, many attempts have been made to improve EDAs' performance in the context of global optimization. So far, studies and applications of multivariate probabilistic model based EDAs in the continuous domain are still mostly restricted to low-dimensional problems. Traditional EDAs have difficulties in solving higher-dimensional problems because of the curse of dimensionality and their rapidly increasing computational costs. However, scaling up continuous EDAs for large-scale optimization is still necessary, which is supported by a distinctive feature of EDAs: because a probabilistic model is explicitly estimated, one can discover useful properties of the problem from the learnt model. Besides obtaining a good solution, understanding the problem structure can be of great benefit, especially for black-box optimization. We propose a novel EDA framework with Model Complexity Control (EDA-MCC) to scale up continuous EDAs. By employing Weakly dependent variable Identification (WI) and Subspace Modeling (SM), EDA-MCC shows significantly better performance than traditional EDAs on high-dimensional problems. Moreover, the computational cost and the requirement of large population sizes can be reduced in EDA-MCC. In addition to being able to find a good solution, EDA-MCC can also provide useful characterizations of the problem structure. EDA-MCC is the first successful instance of multivariate model based EDAs that can be effectively applied to a general class of problems of up to 500 dimensions. It also outperforms some newly developed algorithms designed specifically for large-scale optimization. In order to understand the strengths and weaknesses of EDA-MCC, we have carried out extensive computational studies. Our results reveal on what kinds of benchmark functions EDA-MCC is likely to outperform others. Index Terms—Estimation of distribution algorithm, large scale optimization, model complexity control.
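The Weakly dependent variable Identification (WI) step can be illustrated with a correlation-threshold sketch: variables whose maximum absolute correlation with every other variable stays below a threshold are modeled univariately, while the remainder are passed on to multivariate subspace modeling. This is our illustrative reading, not the paper's exact procedure; the function name and threshold value are assumptions.

```python
import statistics

def identify_weak_variables(samples, theta=0.3):
    """Split variable indices into 'weak' (max absolute Pearson
    correlation with every other variable below theta; modeled
    univariately) and 'strong' (left for multivariate subspace
    modeling). `samples` is a list of equal-length sample rows."""
    n = len(samples[0])
    cols = [[row[j] for row in samples] for j in range(n)]

    def corr(a, b):
        # Pearson correlation; treat constant columns as uncorrelated.
        ma, mb = statistics.mean(a), statistics.mean(b)
        sa, sb = statistics.pstdev(a), statistics.pstdev(b)
        if sa == 0 or sb == 0:
            return 0.0
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b)) / len(a)
        return cov / (sa * sb)

    weak, strong = [], []
    for j in range(n):
        m = max(abs(corr(cols[j], cols[k])) for k in range(n) if k != j)
        (weak if m < theta else strong).append(j)
    return weak, strong
```

On data where the first two variables move together and a third varies independently, the third is classified as weak and handled by a cheap univariate model, which is how this kind of split keeps the multivariate model, and hence the cost, small in high dimensions.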