Results 1–10 of 25
The CMA Evolution Strategy: A Comparing Review
 STUDFUZZ
, 2006
Abstract

Cited by 45 (16 self)
Derived from the concept of self-adaptation in evolution strategies, the CMA (Covariance Matrix Adaptation) adapts the covariance matrix of a multivariate normal search distribution. The CMA was originally designed to perform well with small populations. In this review, the argument starts out with large population sizes, reflecting recent extensions of the CMA algorithm. Commonalities with and differences from continuous Estimation of Distribution Algorithms are analyzed. The aspects of reliability of the estimation, overall step size control, and independence from the coordinate system (invariance) become particularly important with small population sizes. Consequently, performing the adaptation task with small populations is more intricate.
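The core mechanic this abstract describes, sampling from a multivariate normal and adapting its covariance matrix from the selected steps, can be sketched roughly as follows. This is an illustrative rank-mu-style sketch only: the names and constants are assumptions, the step size is held fixed, and the actual CMA additionally adapts sigma and uses evolution paths and a rank-one update.

```python
import numpy as np

def cma_sketch(f, m, sigma=0.5, lam=20, iters=80, seed=0):
    """Minimal rank-mu CMA-flavoured loop -- an illustrative sketch only.
    Hansen's full CMA-ES additionally adapts the step size sigma and uses
    evolution paths plus a rank-one update, all omitted here."""
    rng = np.random.default_rng(seed)
    n, mu = len(m), lam // 2
    w = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    w /= w.sum()                                  # positive recombination weights
    C, c_mu = np.eye(n), 0.3                      # covariance and its learning rate
    for _ in range(iters):
        A = np.linalg.cholesky(C)                 # factor so that y ~ N(0, C)
        y = rng.standard_normal((lam, n)) @ A.T
        x = m + sigma * y                         # offspring ~ N(m, sigma^2 C)
        idx = np.argsort([f(xi) for xi in x])[:mu]
        m = m + sigma * (w @ y[idx])              # recombine the mu best
        C = (1 - c_mu) * C + c_mu * sum(          # rank-mu covariance update
            wi * np.outer(yi, yi) for wi, yi in zip(w, y[idx]))
    return m

# usage on a simple sphere function
sphere = lambda x: float(x @ x)
m_final = cma_sketch(sphere, np.array([3.0, -2.0]))
```

Even this stripped-down loop illustrates the review's point: the covariance estimate is built from only mu selected samples per generation, which is why reliability of the estimation becomes delicate with small populations.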
Parallel estimation of distribution algorithms
, 2002
Abstract

Cited by 22 (3 self)
The thesis deals with the new evolutionary paradigm based on the concept of Estimation of Distribution Algorithms (EDAs), which use a probabilistic model of the promising solutions found so far to obtain new candidate solutions of the optimized problem. There are six primary goals of this thesis: 1. Suggestion of a new formal description of the EDA algorithm. This high-level concept can be used to compare the generality of various probabilistic models by comparing the properties of the underlying mappings. Also, some convergence issues are discussed and theoretical ways for further improvements are proposed. 2. Development of a new probabilistic model and methods capable of dealing with continuous parameters. The resulting Mixed Bayesian Optimization Algorithm (MBOA) uses a set of decision trees to express the probability model. Its main advantage over the widely used IDEA and EGNA approaches is its backward compatibility with discrete domains, so it is uniquely capable of learning linkage between mixed continuous-discrete genes. MBOA handles the discretization of continuous parameters as an integral part of the learning process, which outperforms the histogram-based …
Advancing Continuous IDEAs with Mixture Distributions and Factorization Selection Metrics
 Proceedings of the Optimization by Building and Using Probabilistic Models (OBUPM) Workshop at the Genetic and Evolutionary Computation Conference (GECCO-2001)
, 2001
Abstract

Cited by 18 (5 self)
Evolutionary optimization based on probabilistic models has so far been limited to the use of factorizations in the case of continuous representations. Furthermore, a maximum complexity parameter n was previously required in constructing factorizations to prevent unnecessary complexity from being introduced into the factorization. In this paper, we advance these techniques by using clustering and the EM algorithm to allow for mixture distributions.
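The clustering-plus-EM mixture construction the abstract refers to can be approximated in a few lines. The sketch below substitutes hard k-means labels for full EM responsibilities and fits one diagonal Gaussian per cluster; all names and constants are illustrative assumptions, not the paper's method.

```python
import numpy as np

def fit_mixture_by_clustering(points, k=2, iters=10):
    """Hedged sketch: cluster the selected points with k-means
    (farthest-point init), then fit one diagonal Gaussian per cluster.
    A full EM pass would refine soft responsibilities instead of the
    hard labels used here."""
    centers = [points[0]]
    for _ in range(k - 1):                      # farthest-point initialization
        d = np.min([np.linalg.norm(points - c, axis=1) for c in centers], axis=0)
        centers.append(points[d.argmax()])
    centers = np.array(centers)
    for _ in range(iters):                      # standard k-means iterations
        d = np.linalg.norm(points[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        centers = np.array([points[labels == j].mean(axis=0) for j in range(k)])
    weights = np.array([(labels == j).mean() for j in range(k)])
    stds = np.array([points[labels == j].std(axis=0) + 1e-6 for j in range(k)])
    return weights, centers, stds

def sample_mixture(weights, centers, stds, n, seed=1):
    """Draw n new candidates from the fitted mixture of diagonal Gaussians."""
    rng = np.random.default_rng(seed)
    comp = rng.choice(len(weights), size=n, p=weights)
    return centers[comp] + stds[comp] * rng.standard_normal((n, centers.shape[1]))

# toy demo: two separated blobs of "selected" points
rng = np.random.default_rng(2)
pts = np.vstack([rng.normal(0.0, 0.3, (50, 2)), rng.normal(5.0, 0.3, (50, 2))])
w, c, s = fit_mixture_by_clustering(pts, k=2)
new_candidates = sample_mixture(w, c, s, 100)
```

The design point the abstract makes is visible here: a single factorized Gaussian would place mass between the two blobs, whereas the mixture keeps its components on the clusters.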
An estimation of distribution algorithm with intelligent local search for rule-based nurse rostering
, 2007
Multi-Objective Mixture-based Iterated Density Estimation Evolutionary Algorithms
 in Proceedings of the Genetic and Evolutionary Computation Conference, San Francisco, California
, 2001
Abstract

Cited by 11 (0 self)
We propose an algorithm for multi-objective optimization using a mixture-based iterated density estimation evolutionary algorithm (MIDEA). The MIDEA algorithm is a probabilistic model-building evolutionary algorithm that constructs at each generation a mixture of factorized probability distributions.
The Correlation-Triggered Adaptive Variance Scaling IDEA
 In Proceedings of the 8th Conference on Genetic and Evolutionary Computation
, 2006
Abstract

Cited by 11 (1 self)
It has previously been shown analytically and experimentally that continuous Estimation of Distribution Algorithms (EDAs) based on the normal pdf can easily suffer from premature convergence. This paper takes a principled first step towards solving this problem. First, prerequisites for the successful use of search distributions in EDAs are presented. Then, an adaptive variance scaling scheme is introduced that aims at reducing the risk of premature convergence. Integrating the scheme into the iterated density-estimation evolutionary algorithm (IDEA) yields the correlation-triggered adaptive variance scaling IDEA (CT-AVS-IDEA). The CT-AVS-IDEA is compared to the original IDEA and the Evolution Strategy with Covariance Matrix Adaptation (CMA-ES) on a wide range of unimodal test problems by means of a scalability analysis. It is found that the average number of fitness evaluations grows subquadratically with the dimensionality, competitively with the CMA-ES. In addition, the CT-AVS-IDEA is indeed found to enlarge the class of problems that continuous EDAs can solve reliably.
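One plausible reading of the correlation trigger can be sketched as follows: measure the correlation between the model density and the fitness of the selected points, and enlarge the sampling variance only when that correlation is strong (suggesting the optimum lies outside the densely sampled region). The threshold, growth factor, and bounds below are illustrative assumptions, not the paper's constants.

```python
import numpy as np

def ctavs_scale(c_avs, xs, fs, mean, cov, theta=0.55, eta=1.1):
    """Hedged sketch of a correlation-triggered variance multiplier.
    theta (trigger threshold) and eta (growth/decay factor) are
    illustrative values, not the constants from the paper."""
    inv = np.linalg.inv(cov)
    # Gaussian log-density up to a constant: -(1/2)(x-m)^T C^{-1} (x-m)
    logp = np.array([-0.5 * (x - mean) @ inv @ (x - mean) for x in xs])
    rho = np.corrcoef(logp, np.asarray(fs, dtype=float))[0, 1]
    if abs(rho) > theta:
        return min(10.0, c_avs * eta)   # trigger: enlarge sampling variance
    return max(1.0, c_avs / eta)        # no trigger: relax back toward 1

# demo: points marching away from the mean
mean, cov = np.zeros(2), np.eye(2)
xs = np.array([[i, 0.0] for i in range(10)])
grown = ctavs_scale(1.0, xs, np.arange(10.0), mean, cov)        # density-fitness correlated
relaxed = ctavs_scale(2.0, xs, np.array([1.0, 0.0] * 5), mean, cov)  # uncorrelated
```

When the trigger fires, the sampling covariance would be used as `c_avs * cov`, which counteracts the premature shrinking of the maximum-likelihood estimate that the abstract identifies.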
Population-based continuous optimization, probabilistic modelling and mean shift
 Evolutionary Computation
, 2005
Abstract

Cited by 9 (2 self)
Evolutionary algorithms perform optimization using a population of sample solution points. An interesting development has been to view population-based optimization as the process of evolving an explicit, probabilistic model of the search space. This paper investigates a formal basis for continuous, population-based optimization in terms of a stochastic gradient descent on the Kullback-Leibler divergence between the model probability density and the objective function, represented as an unknown density of assumed form. This leads to an update rule that is related to and compared with previous theoretical work, a continuous version of the population-based incremental learning algorithm, and the generalized mean shift clustering framework. Experimental results are presented that demonstrate the dynamics of the new algorithm on a set of simple test problems.
Keywords: probabilistic modelling, estimation of distribution algorithms, population-based incremental …
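A mean-shift-flavoured update in the spirit of the KL-gradient view can be sketched as follows: sample from the current Gaussian model, weight samples by a monotone transform of the objective, and move the model mean toward the weighted sample mean. The Boltzmann-style weighting, step size, and sample count are assumptions for illustration, not the paper's derived rule.

```python
import numpy as np

def kl_mean_step(m, sigma, f, alpha=0.5, n=400, seed=0):
    """Hedged sketch of one mean update: fitness-weighted samples pull the
    model mean toward high-quality regions, a mean-shift-like move.
    alpha, n, and the exp(-f) weighting are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    x = m + sigma * rng.standard_normal((n, len(m)))
    w = np.exp(-np.array([f(xi) for xi in x]))   # favour low objective values
    w /= w.sum()
    return m + alpha * (w @ x - m)               # pull mean toward weighted mean

# demo: minimize a sphere function by iterating the mean update
sphere = lambda x: float(x @ x)
m = np.array([2.0, 2.0])
for step in range(15):
    m = kl_mean_step(m, 1.0, sphere, seed=step)
```

The fixed sigma keeps the sketch short; the paper's framework also covers how the model's other parameters follow the same stochastic gradient.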
A Bayesian Algorithm for In Vitro Molecular Evolution of Pattern Classifiers
 DNA Computing 10, LNCS 3384
, 2005
Evolutionary Continuous Optimization by Distribution Estimation with Variational Bayesian Independent Component Analyzers Mixture Model
 In Proceedings of Parallel Problem Solving from Nature VIII
, 2004
Abstract

Cited by 3 (0 self)
In evolutionary continuous optimization by building and using probabilistic models, the multivariate Gaussian distribution and its variants or extensions, such as the mixture of Gaussians, have been widely used. However, this Gaussian assumption is often violated in many real problems. In this paper, we propose a new continuous estimation of distribution algorithm (EDA) with the variational Bayesian independent component analyzers mixture model (vbICA-MM), which allows any distribution to be modeled. We examine how this sophisticated density estimation technique influences the performance of the optimization by employing the same selection and population alternation schemes used in the previous EDAs. Our experimental results show that the presented EDAs achieve better performance than previous EDAs with ICA and Gaussian mixture- or kernel-based approaches.