Results 1–10 of 21
A Survey of Optimization by Building and Using Probabilistic Models
 COMPUTATIONAL OPTIMIZATION AND APPLICATIONS
, 1999
Abstract
Cited by 276 (82 self)
This paper summarizes the research on population-based probabilistic search algorithms that model promising solutions by estimating their probability distribution and use the constructed model to guide further exploration of the search space. It situates the algorithms in the field of genetic and evolutionary computation, where they originated. All methods are classified into a few classes according to the complexity of the models they use. Algorithms from each of these classes are briefly described, and their strengths and weaknesses are discussed.
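The model-build-and-sample loop shared by the algorithms this survey classifies can be sketched in a few lines. The example below is a hypothetical minimal PBIL-style instance on OneMax (maximize the number of ones in a bit string); all parameter values are illustrative, not taken from any surveyed algorithm.

```python
import random

def pbil_onemax(n=20, pop=50, lr=0.1, gens=60, seed=0):
    # Hypothetical minimal PBIL on OneMax: sample bit strings from a
    # probability vector, then shift the vector toward the best sample.
    rng = random.Random(seed)
    p = [0.5] * n                       # one Bernoulli parameter per bit
    for _ in range(gens):
        samples = [[1 if rng.random() < pi else 0 for pi in p]
                   for _ in range(pop)]
        best = max(samples, key=sum)    # fitness = number of ones
        p = [(1 - lr) * pi + lr * bi for pi, bi in zip(p, best)]
    return p

probs = pbil_onemax()   # probabilities drift toward 1.0 on OneMax
```

The probability vector is the simplest model class in the survey's taxonomy: each variable is modeled independently.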
Extending Population-Based Incremental Learning to Continuous Search Spaces
, 1998
Abstract
Cited by 59 (3 self)
An alternative to Darwinian-like artificial evolution is offered by Population-Based Incremental Learning (PBIL): this algorithm memorizes the best past individuals and uses this memory as a distribution to generate the next population from scratch. This paper extends PBIL from boolean to continuous search spaces. A Gaussian model is used for the distribution of the population. The center of this model is constructed as in boolean PBIL. Several ways of defining and adjusting the variance of the model are investigated. The approach is validated on several large-sized problems.
1 Introduction. Evolutionary algorithms (EAs) [13, 6, 5] are mostly used to find the optima of some fitness function F defined on a search space Ω, F : Ω → ℝ. From a machine learning (ML) perspective [9], evolution is similar to learning by query: learning by query starts with a void hypothesis and gradually refines the current hypothesis by asking questions of some oracle. In ML, ...
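The extension the abstract describes can be illustrated with a minimal sketch: a per-coordinate Gaussian whose center is updated toward the best sample as in boolean PBIL, with the variance here following a simple exponential-decay schedule (one plausible stand-in for the several adjustment rules the paper investigates; all names and constants below are illustrative).

```python
import random

def continuous_pbil(fitness, dim, gens=80, pop=40, lr=0.2,
                    sigma0=2.0, decay=0.97, seed=1):
    # Hypothetical sketch of continuous PBIL (minimization).
    rng = random.Random(seed)
    mu = [0.0] * dim          # center, updated as in boolean PBIL
    sigma = sigma0            # simple variance rule: exponential decay
    for _ in range(gens):
        xs = [[rng.gauss(m, sigma) for m in mu] for _ in range(pop)]
        best = min(xs, key=fitness)
        mu = [(1 - lr) * m + lr * b for m, b in zip(mu, best)]
        sigma *= decay
    return mu

# Usage: minimize a sphere function with optimum at (3, -2).
sphere = lambda x: (x[0] - 3) ** 2 + (x[1] + 2) ** 2
center = continuous_pbil(sphere, dim=2)
```

How the variance is adjusted is the crux of the paper: decaying it too fast risks premature convergence, which is also the topic of the IDEA entry below.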
Real-valued Evolutionary Optimization using a Flexible Probability Density Estimator
 Proceedings of the GECCO 1999 Genetic and Evolutionary Computation Conference
, 1999
Abstract
Cited by 27 (1 self)
Population-Based Incremental Learning (PBIL) is an abstraction of a genetic algorithm which solves optimization problems by explicitly constructing a probabilistic model of the promising regions of the search space. At each iteration the model is used to generate a population of candidate solutions, and is itself modified in response to these solutions. Through the extension of PBIL to real-valued search spaces, a more powerful and general algorithmic framework arises which enables the use of arbitrary probability density estimation techniques in evolutionary optimization. To illustrate the usefulness of the framework, we propose and implement an evolutionary algorithm which uses a finite adaptive Gaussian mixture model density estimator. This method offers considerable power and flexibility in the forms of the density which can be effectively modeled. We discuss the general applicability of the framework, and suggest that future work should lead to the development...
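The "arbitrary density estimator" slot the abstract describes can be illustrated without the paper's adaptive Gaussian mixture machinery. The sketch below deliberately swaps in a simpler mixture, one fixed-bandwidth Gaussian kernel per selected parent (a kernel density estimate), to show the build-density-then-sample framework; everything here is an illustrative stand-in, not the authors' estimator.

```python
import random

def mixture_eda(fitness, dim, gens=60, pop=60, keep=15, bw=0.5, seed=2):
    # Plug-in density estimator: a mixture with one Gaussian kernel per
    # selected parent.  Sampling = pick a parent, perturb each coordinate.
    rng = random.Random(seed)
    xs = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    for _ in range(gens):
        xs.sort(key=fitness)
        parents = xs[:keep]                       # truncation selection
        xs = [[rng.gauss(c, bw) for c in rng.choice(parents)]
              for _ in range(pop)]
    return min(xs, key=fitness)

# Usage: minimize the 3-D sphere function (optimum at the origin).
best = mixture_eda(lambda x: sum(v * v for v in x), dim=3)
```

Any estimator with a fit step and a sample step can fill the same slot, which is the framework's point.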
Evolutionary Algorithm using Marginal Histogram Models in Continuous Domain
 Proc. of the 2001 Genetic and Evolutionary Computation Conference Workshop Program
, 2001
Abstract
Cited by 23 (8 self)
In this paper, we propose an evolutionary algorithm using marginal histograms to model the parent population in a continuous domain. We propose two types of marginal histogram models: the fixed-width histogram (FWH) and the fixed-height histogram (FHH). The results showed that both models worked fairly well on test functions with no or weak interactions among variables. In particular, FHH could find the global optimum with very high accuracy and showed good scale-up with the problem size.
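A marginal histogram model is easy to sketch for a single variable. The function below is a hypothetical fixed-width-histogram (FWH) sampler: equal-width bins over a fixed range, bin probabilities proportional to parent counts, and uniform sampling inside the chosen bin. (An FHH would instead vary bin widths so each bin holds an equal share of parents.) All ranges and counts are illustrative.

```python
import random

def fwh_sample(parents, bins=10, n=100, lo=-5.0, hi=5.0, seed=3):
    # Build the marginal fixed-width histogram from the parents...
    width = (hi - lo) / bins
    counts = [0] * bins
    for x in parents:
        i = min(int((x - lo) / width), bins - 1)
        counts[i] += 1
    # ...then sample: pick a bin by its count, draw uniformly inside it.
    rng = random.Random(seed)
    return [lo + (rng.choices(range(bins), weights=counts)[0]
                  + rng.random()) * width
            for _ in range(n)]

# Usage: parents clustered near 1.0 yield children clustered near 1.0.
prng = random.Random(0)
parents = [prng.gauss(1.0, 0.5) for _ in range(200)]
children = fwh_sample(parents)
```

A full algorithm would apply one such marginal model per variable, which is why it handles only weak inter-variable interactions.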
Parallel estimation of distribution algorithms
, 2002
Abstract
Cited by 22 (3 self)
The thesis deals with the new evolutionary paradigm based on the concept of Estimation of Distribution Algorithms (EDAs), which use a probabilistic model of promising solutions found so far to obtain new candidate solutions of the optimized problem. There are six primary goals of this thesis: 1. A new formal description of the EDA algorithm. This high-level concept can be used to compare the generality of various probabilistic models by comparing the properties of the underlying mappings. Some convergence issues are also discussed, and theoretical ways for further improvements are proposed. 2. Development of a new probabilistic model and methods capable of dealing with continuous parameters. The resulting Mixed Bayesian Optimization Algorithm (MBOA) uses a set of decision trees to express the probability model. Its main advantage over the widely used IDEA and EGNA approaches is its backward compatibility with discrete domains, so it is uniquely capable of learning linkage between mixed continuous-discrete genes. MBOA handles the discretization of continuous parameters as an integral part of the learning process, which outperforms the histogram-based ...
Probabilistic Model-Building Genetic Algorithms in Permutation Representation Domain Using Edge Histogram
 Proc. of the 7th Int. Conf. on Parallel Problem Solving from Nature (PPSN VII)
, 2002
Abstract
Cited by 14 (8 self)
Recently, there has been growing interest in developing evolutionary algorithms based on probabilistic modeling. In this scheme, the offspring population is generated according to the estimated probability density model of the parents instead of by recombination and mutation operators. In this paper, we propose probabilistic model-building genetic algorithms (PMBGAs) in the permutation representation domain using edge histogram based sampling algorithms (EHBSAs). Two types of sampling algorithms, without template (EHBSA/WO) and with template (EHBSA/WT), are presented. The algorithms were tested on the TSP; EHBSA/WT worked fairly well with a small population size on the test problems used, and performed better than well-known traditional two-parent recombination operators.
The Correlation-Triggered Adaptive Variance Scaling IDEA
 IN PROCEEDINGS OF THE 8TH CONFERENCE ON GENETIC AND EVOLUTIONARY COMPUTATION
, 2006
Abstract
Cited by 11 (1 self)
It has previously been shown analytically and experimentally that continuous Estimation of Distribution Algorithms (EDAs) based on the normal pdf can easily suffer from premature convergence. This paper takes a principled first step towards solving this problem. First, prerequisites for the successful use of search distributions in EDAs are presented. Then, an adaptive variance scaling scheme is introduced that aims at reducing the risk of premature convergence. Integrating the scheme into the iterated density-estimation evolutionary algorithm (IDEA) yields the correlation-triggered adaptive variance scaling IDEA (CT-AVS-IDEA). The CT-AVS-IDEA is compared to the original IDEA and to the Evolution Strategy with Covariance Matrix Adaptation (CMA-ES) on a wide range of unimodal test problems by means of a scalability analysis. The average number of fitness evaluations is found to grow subquadratically with the dimensionality, competitive with CMA-ES. In addition, CT-AVS-IDEA is indeed found to enlarge the class of problems that continuous EDAs can solve reliably.
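The core of adaptive variance scaling can be sketched as follows: maintain a multiplier c on the maximum-likelihood standard deviation, growing it while the search is still making progress and shrinking it otherwise. This is a hypothetical simplification; the paper's scheme triggers scaling via a rank correlation test, which a success-based trigger stands in for here, and all constants are illustrative.

```python
import random
import statistics

def avs_eda(fitness, dim, gens=100, pop=50, keep=12, seed=5):
    # Gaussian EDA with truncation selection plus a variance multiplier c:
    # grow c on improvement (keep exploring), shrink it on stagnation.
    rng = random.Random(seed)
    mu, sigma, c = [0.0] * dim, [3.0] * dim, 1.0
    best_f = float("inf")
    for _ in range(gens):
        xs = [[rng.gauss(m, s * c) for m, s in zip(mu, sigma)]
              for _ in range(pop)]
        xs.sort(key=fitness)
        sel = xs[:keep]
        if fitness(sel[0]) < best_f:
            best_f, c = fitness(sel[0]), min(c * 1.1, 10.0)
        else:
            c = max(c * 0.9, 0.1)
        mu = [statistics.mean(col) for col in zip(*sel)]
        sigma = [statistics.pstdev(col) + 1e-12 for col in zip(*sel)]
    return best_f

# Usage: minimize a sphere function with optimum at (2, 2).
f = lambda x: sum((v - 2.0) ** 2 for v in x)
best = avs_eda(f, dim=2)
```

Without the multiplier, the maximum-likelihood variance of the selected set shrinks faster than the distance to the optimum, which is the premature-convergence failure mode the paper analyzes.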
Adaptive discretization for probabilistic model building genetic algorithms
 In Proceedings of ACM SIGEVO Genetic and Evolutionary Computation Conference (GECCO-2006)
, 2006
Abstract
Cited by 4 (2 self)
This paper proposes an adaptive discretization method, called Split-on-Demand (SoD), to enable the probabilistic model building genetic algorithm (PMBGA) to solve optimization problems in the continuous domain. The procedure, effect, and usage of SoD are described in detail. As an example, the integration of SoD and the extended compact genetic algorithm (ECGA), named real-coded ECGA (rECGA), is presented and numerically examined. The experimental results indicate that rECGA works well and that SoD is effective. The behavior of SoD is analyzed and discussed, followed by potential future work for SoD.
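The general flavor of adaptive discretization, refining the grid where the population concentrates so a discrete-domain PMBGA can model continuous genes, can be sketched as below. The split rule here is an illustrative stand-in, not the paper's actual SoD procedure, and all thresholds are invented for the example.

```python
import random

def adaptive_split(values, lo, hi, max_frac=0.3, min_width=1e-3):
    # Illustrative adaptive discretization (NOT the paper's exact SoD
    # rule): recursively split any interval holding more than max_frac
    # of the population, so crowded regions receive finer discrete codes
    # that a discrete-domain PMBGA such as ECGA can then model.
    inside = sum(1 for v in values if lo <= v < hi)
    if inside <= max_frac * len(values) or hi - lo <= min_width:
        return [(lo, hi)]
    mid = (lo + hi) / 2
    return (adaptive_split(values, lo, mid, max_frac, min_width)
            + adaptive_split(values, mid, hi, max_frac, min_width))

# Usage: a population crowded near 0.7 gets narrow cells there and
# coarse cells elsewhere.
rng = random.Random(6)
pop = [rng.gauss(0.7, 0.05) for _ in range(100)]
cells = adaptive_split(pop, 0.0, 1.0)
```

Each resulting cell becomes one discrete code, so the discretization adapts each generation to where the selected individuals lie.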
A Bayesian Algorithm for In Vitro Molecular Evolution of Pattern Classifiers
 DNA Computing 10, LNCS 3384
, 2005