Results 11–20 of 44
Parallel estimation of distribution algorithms
, 2002
"... The thesis deals with the new evolutionary paradigm based on the concept of Estimation of Distribution Algorithms (EDAs) that use probabilistic model of promising solutions found so far to obtain new candidate solutions of optimized problem. There are six primary goals of this thesis: 1. Suggestion ..."
Abstract

Cited by 26 (4 self)
The thesis deals with the new evolutionary paradigm based on the concept of Estimation of Distribution Algorithms (EDAs), which use a probabilistic model of the promising solutions found so far to obtain new candidate solutions of the optimized problem. There are six primary goals of this thesis: 1. Suggestion of a new formal description of the EDA algorithm. This high-level concept can be used to compare the generality of various probabilistic models by comparing the properties of the underlying mappings. Also, some convergence issues are discussed and theoretical ways for further improvements are proposed. 2. Development of a new probabilistic model and methods capable of dealing with continuous parameters. The resulting Mixed Bayesian Optimization Algorithm (MBOA) uses a set of decision trees to express the probability model. Its main advantage over the widely used IDEA and EGNA approaches is its backward compatibility with discrete domains, so it is uniquely capable of learning linkage between mixed continuous-discrete genes. MBOA handles the discretization of continuous parameters as an integral part of the learning process, which outperforms the histogram-based …
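The key idea in the abstract above — a model in which a discrete gene and a continuous gene are sampled jointly, so their linkage is captured — can be illustrated with a toy sketch. This is not MBOA's actual decision-tree model; the probabilities and leaf parameters below are made-up assumptions.

```python
import random

# Illustrative sketch (not MBOA's actual model) of sampling from a mixed
# continuous-discrete model: the discrete gene selects a "leaf", and each leaf
# stores a Gaussian for the continuous gene, so the two genes are linked.
def sample_mixed(rng, p_discrete, leaf_params):
    """Sample a (discrete, continuous) gene pair from the linked model."""
    d = 1 if rng.random() < p_discrete else 0
    mu, sigma = leaf_params[d]        # leaf chosen by the discrete value
    return d, rng.gauss(mu, sigma)

rng = random.Random(1)
# hypothetical model: P(d=1)=0.7; continuous gene ~ N(0,1) if d=0, N(5,0.5) if d=1
pairs = [sample_mixed(rng, 0.7, {0: (0.0, 1.0), 1: (5.0, 0.5)})
         for _ in range(1000)]
```

Because the continuous distribution depends on the discrete value, samples exhibit exactly the kind of mixed-domain linkage the thesis attributes to MBOA.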
An Algorithmic Framework For Density Estimation Based Evolutionary Algorithms
, 1999
"... The direct application of statistics to stochastic optimization in evolutionary computation has become more important and present over the last few years. With the introduction of the notion of the Estimation of Distribution Algorithm (EDA), a new line of research has been named. The application are ..."
Abstract

Cited by 24 (5 self)
The direct application of statistics to stochastic optimization in evolutionary computation has become more important and more prominent over the last few years. With the introduction of the notion of the Estimation of Distribution Algorithm (EDA), a new line of research has been named. The application area so far has mostly been the same as for classic genetic algorithms, namely binary vector encoded problems. The most important aspect of the new algorithms is the part where probability densities are estimated. In probability theory, a distinction is made between discrete and continuous distributions and methods. Using the rationale for density estimation based evolutionary algorithms, we present an algorithmic framework for them, named IDEA. This allows us to define such algorithms for vectors of both continuous and discrete random variables, combining techniques from existing EDAs as well as density estimation theory. The emphasis is on techniques for vectors of continuous random variables, for which we present new algorithms in the field of density estimation based evolutionary algorithms, using two different density estimation models.
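The framework idea described above — a fixed evolutionary loop with a pluggable density-estimation step — can be sketched as follows. The `estimate`/`sample` interface is an illustrative assumption, not IDEA's actual API; here it is instantiated with a univariate-normal model per variable on a continuous sphere problem.

```python
import random
import statistics

# IDEA-style framework sketch: the loop is fixed, density estimation and
# sampling are pluggable components (an assumed interface, for illustration).
def idea_loop(fitness, estimate, sample, pop, n_select, generations, rng):
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        model = estimate(pop[:n_select])        # density estimation step
        pop = [sample(model, rng) for _ in range(len(pop))]
    return max(pop, key=fitness)

# Plug-in for continuous vectors: an independent normal per dimension.
def estimate_normal(selected):
    dims = list(zip(*selected))
    return [(statistics.fmean(d), statistics.stdev(d)) for d in dims]

def sample_normal(model, rng):
    return [rng.gauss(mu, sigma) for mu, sigma in model]

rng = random.Random(0)
sphere = lambda x: -sum(v * v for v in x)       # maximize = minimize sum of squares
pop = [[rng.uniform(-5, 5) for _ in range(3)] for _ in range(80)]
best = idea_loop(sphere, estimate_normal, sample_normal, pop,
                 n_select=25, generations=40, rng=rng)
```

Swapping in a discrete frequency model instead of `estimate_normal`/`sample_normal` yields a binary EDA from the same loop, which is the point of the framework.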
Genetic Algorithms
, 2005
"... Genetic algorithms (GAs) are search methods based on principles of natural selection and genetics (Fraser, 1957; Bremermann, 1958; Holland, 1975). We start with a brief introduction to simple genetic algorithms and associated terminology. GAs encode ..."
Abstract

Cited by 21 (3 self)
Genetic algorithms (GAs) are search methods based on principles of natural selection and genetics (Fraser, 1957; Bremermann, 1958; Holland, 1975). We start with a brief introduction to simple genetic algorithms and associated terminology. GAs encode …
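The simple GA the survey introduces follows the textbook loop of selection, crossover, and mutation; a minimal sketch is below. Operator choices (binary tournament, one-point crossover, bit-flip mutation) and all parameter values are illustrative assumptions.

```python
import random

# Minimal simple-GA sketch: tournament selection, one-point crossover,
# bit-flip mutation, with the best-so-far individual remembered.
def simple_ga(fitness, n_bits=20, pop_size=40, generations=60,
              p_crossover=0.9, p_mutation=0.02, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            # binary tournament selection of two parents
            p1 = max(rng.sample(pop, 2), key=fitness)
            p2 = max(rng.sample(pop, 2), key=fitness)
            c1, c2 = p1[:], p2[:]
            if rng.random() < p_crossover:      # one-point crossover
                cut = rng.randrange(1, n_bits)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (c1, c2):              # bit-flip mutation
                nxt.append([b ^ 1 if rng.random() < p_mutation else b
                            for b in child])
        pop = nxt[:pop_size]
        best = max(pop + [best], key=fitness)
    return best

best = simple_ga(sum)   # OneMax: fitness is the number of ones
```

OneMax is used only because it makes convergence easy to see; any fitness function over bit strings plugs in the same way.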
On Stability of Fixed Points of Limit Models of Univariate Marginal Distribution Algorithm and Factorized Distribution Algorithm
 IEEE Trans. on Evolutionary Computation, Accepted
, 2003
"... Abstract—This paper aims to study the advantages of using higher order statistics in estimation distribution of algorithms (EDAs). We study two EDAs with twotournament selection for discrete optimization problems. One is the univariate marginal distribution algorithm (UMDA) using only firstorder s ..."
Abstract

Cited by 19 (7 self)
Abstract—This paper aims to study the advantages of using higher-order statistics in estimation of distribution algorithms (EDAs). We study two EDAs with two-tournament selection for discrete optimization problems. One is the univariate marginal distribution algorithm (UMDA), using only first-order statistics, and the other is the factorized distribution algorithm (FDA), using higher-order statistics. We introduce the heuristic functions and the limit models of these two algorithms and analyze the stability of these limit models. It is shown that the limit model of UMDA can be trapped at any local optimal solution for some initial probability models. However, degenerate probability density functions (pdfs) at some local optimal solutions are unstable in the limit model of FDA. In particular, the degenerate pdf at the global optimal solution is the unique asymptotically stable point in the limit model of FDA for the optimization of an additively decomposable function. Our results suggest that using higher-order statistics could improve the chance of finding the global optimal solution. Index Terms—Estimation of distribution algorithms (EDAs), factorized distribution algorithm (FDA), heuristic function, stability, univariate marginal distribution algorithm (UMDA).
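The "only first-order statistics" description of UMDA is concrete enough to sketch: the entire model is the vector of per-bit frequencies of the selected parents. For simplicity this sketch uses truncation selection rather than the two-tournament selection the paper analyzes, and the parameters are illustrative assumptions.

```python
import random

# UMDA sketch: the probabilistic model is nothing but the first-order
# (univariate marginal) bit frequencies of the selected individuals.
def umda(fitness, n_bits, pop_size=60, n_select=20, generations=40, seed=3):
    rng = random.Random(seed)
    probs = [0.5] * n_bits            # initial uniform model
    best = None
    for _ in range(generations):
        pop = [[1 if rng.random() < p else 0 for p in probs]
               for _ in range(pop_size)]
        pop.sort(key=fitness, reverse=True)
        if best is None or fitness(pop[0]) > fitness(best):
            best = pop[0]
        # re-estimate marginals from the selected set (truncation selection)
        probs = [sum(ind[i] for ind in pop[:n_select]) / n_select
                 for i in range(n_bits)]
    return best

best = umda(sum, n_bits=20)   # OneMax, where first-order statistics suffice
```

OneMax has no inter-bit dependencies, so UMDA solves it easily; the paper's point is that on functions with strong variable interactions, this first-order limit model can be trapped at local optima where FDA's factorized model is not.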
An introduction and survey of estimation of distribution algorithms
 SWARM AND EVOLUTIONARY COMPUTATION
, 2011
"... ..."
(Show Context)
Multi-Objective Bayesian Optimization Algorithm
 in Proceedings of the Genetic and Evolutionary Computation Conference
, 2002
"... This paper proposes a competent multiobjective genetic algorithm called the multiobjective Bayesian optimization algorithm (mBOA). mBOA incorporates the selection method of the nondominated sorting genetic algorithmII (NSGAII) into the Bayesian optimization algorithm (BOA). The proposed algorith ..."
Abstract

Cited by 18 (4 self)
This paper proposes a competent multiobjective genetic algorithm called the multiobjective Bayesian optimization algorithm (mBOA). mBOA incorporates the selection method of the nondominated sorting genetic algorithm II (NSGA-II) into the Bayesian optimization algorithm (BOA). The proposed algorithm has been tested on an array of test functions which incorporate deception and loose linkage, and the results are compared to those of NSGA-II. Results indicate that mBOA outperforms NSGA-II on large, loosely linked deceptive problems.
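The selection step mBOA borrows from NSGA-II is nondominated sorting: ranking solutions into successive Pareto fronts. A naive O(n²)-per-front sketch is below (NSGA-II itself uses a faster bookkeeping variant); maximization of all objectives is assumed.

```python
# a dominates b iff a is at least as good in every objective and strictly
# better in at least one (maximization assumed).
def dominates(a, b):
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def nondominated_fronts(points):
    """Return indices grouped into successive Pareto fronts."""
    remaining = list(range(len(points)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i]) for j in remaining)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

pts = [(1, 5), (2, 4), (3, 3), (2, 2), (1, 1)]
fronts = nondominated_fronts(pts)   # → [[0, 1, 2], [3], [4]]
```

The first three points are mutually nondominated (each trades one objective against the other), so they form front 0; (2, 2) and (1, 1) fall into later fronts.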
Sporadic model building for efficiency enhancement of hierarchical BOA
, 2007
"... Efficiency enhancement techniques—such as parallelization and hybridization—are among the most important ingredients of practical applications of genetic and evolutionary algorithms and that is why this research area represents an important niche of evolutionary computation. This paper describes and ..."
Abstract

Cited by 17 (9 self)
Efficiency enhancement techniques—such as parallelization and hybridization—are among the most important ingredients of practical applications of genetic and evolutionary algorithms and that is why this research area represents an important niche of evolutionary computation. This paper describes and analyzes sporadic model building, which can be used to enhance the efficiency of the hierarchical Bayesian optimization algorithm (hBOA) and other estimation of distribution algorithms (EDAs) that use complex multivariate probabilistic models. With sporadic model building, the structure of the probabilistic model is updated once in every few iterations (generations), whereas in the remaining iterations, only model parameters (conditional and marginal probabilities) are updated. Since the time complexity of updating model parameters is much lower than the time complexity of learning the model structure, sporadic model building decreases the overall time complexity of model building. The paper shows that for boundedly difficult nearly decomposable and hierarchical optimization problems, sporadic model building leads to a significant model-building speedup, which decreases the asymptotic time complexity of model building in hBOA by a factor of Θ(n^0.26) to Θ(n^0.5), where n is the problem size. On the other hand, sporadic model building also increases the number of evaluations until convergence; nonetheless, if model building is the bottleneck, the evaluation slowdown is insignificant compared to the gains in the asymptotic complexity of model building. The paper also presents a dimensional model to provide a heuristic for scaling the structure-building period, which is the only parameter of the proposed sporadic model-building approach. The paper then tests the proposed method and the rule for setting the structure-building period on the problem of finding ground states of 2D and 3D Ising spin glasses.
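The scheduling idea is simple to sketch: run the expensive structure-learning step only every k-th generation, while re-estimating cheap parameters every generation. The model below is a toy univariate one standing in for hBOA's Bayesian network, and k, population sizes, and the placeholder structure step are illustrative assumptions; counters make the cost split visible.

```python
import random

# Sporadic model building sketch: structure learning (expensive in hBOA) is
# run once per k generations; parameter estimation runs every generation.
def sporadic_eda(fitness, n_bits, k=4, pop_size=60, n_select=20,
                 generations=20, seed=1):
    rng = random.Random(seed)
    structure_builds = parameter_updates = 0
    probs = [0.5] * n_bits
    structure = list(range(n_bits))     # placeholder "structure"
    for gen in range(generations):
        pop = [[1 if rng.random() < p else 0 for p in probs]
               for _ in range(pop_size)]
        pop.sort(key=fitness, reverse=True)
        selected = pop[:n_select]
        if gen % k == 0:
            # sporadic step: in hBOA this would be the Bayesian-network
            # structure search; here a stand-in so the schedule is visible
            structure = sorted(structure)
            structure_builds += 1
        # parameters (marginal probabilities) are re-estimated every generation
        probs = [sum(ind[i] for ind in selected) / n_select
                 for i in range(n_bits)]
        parameter_updates += 1
    return structure_builds, parameter_updates

builds, updates = sporadic_eda(sum, n_bits=16)   # → 5 builds, 20 updates
```

With k = 4 over 20 generations, only 5 structure builds occur for 20 parameter updates, which is where the asymptotic model-building savings come from when structure learning dominates the cost.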
Matching inductive search bias and problem structure in continuous estimation of distribution algorithms
 European Journal of Operational Research
"... Research into the dynamics of Genetic Algorithms (GAs) has led to the ¯eld of Estimation{of{Distribution Algorithms (EDAs). For discrete search spaces, EDAs have been developed that have obtained very promising results on a wide variety of problems. In this paper we investigate the conditions under ..."
Abstract

Cited by 16 (3 self)
Research into the dynamics of Genetic Algorithms (GAs) has led to the field of Estimation-of-Distribution Algorithms (EDAs). For discrete search spaces, EDAs have been developed that have obtained very promising results on a wide variety of problems. In this paper we investigate the conditions under which the adaptation of this technique to continuous search spaces fails to perform optimization efficiently. We show that without careful interpretation and adaptation of lessons learned from discrete EDAs, continuous EDAs will fail to perform efficient optimization on even some of the simplest problems. We reconsider the most important lessons to be learned in the design of EDAs and subsequently show how we can use this knowledge to extend continuous EDAs that were obtained by straightforward adaptation from the discrete domain so as to obtain an improvement in performance. Experimental results are presented to illustrate this improvement and to additionally confirm experimentally that a proper adaptation of discrete EDAs to the continuous case indeed requires careful consideration. Key words: Estimation-of-distribution algorithms; Numerical optimization;
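A well-known instance of the failure mode discussed above can be demonstrated in a few lines: a maximum-likelihood Gaussian EDA under truncation selection shrinks its variance geometrically every generation, so on an unbounded linear slope f(x) = x the mean stalls after a short distance instead of climbing indefinitely. The parameters are illustrative assumptions.

```python
import random
import statistics

# Demo: ML Gaussian EDA on f(x) = x. Truncation selection reduces the variance
# of the selected set by a constant factor each generation, so sigma decays
# geometrically and progress stalls even though f is unbounded.
def gaussian_eda_on_slope(pop_size=100, n_select=30, generations=30, seed=2):
    rng = random.Random(2 if seed is None else seed)
    mu, sigma = 0.0, 1.0
    sigmas = [sigma]
    for _ in range(generations):
        pop = sorted((rng.gauss(mu, sigma) for _ in range(pop_size)),
                     reverse=True)
        selected = pop[:n_select]           # truncation selection on f(x) = x
        mu = statistics.fmean(selected)     # maximum-likelihood mean
        sigma = statistics.pstdev(selected) # maximum-likelihood std deviation
        sigmas.append(sigma)
    return mu, sigmas

mu, sigmas = gaussian_eda_on_slope()
```

After 30 generations sigma has collapsed by many orders of magnitude while mu has advanced only a few units; remedies proposed in this line of work adapt or rescale the variance rather than fitting it by pure maximum likelihood.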
Multiobjective hBOA, clustering, and scalability
 In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2005)
, 2005
"... This paper describes a scalable algorithm for solving multiobjective decomposable problems by combining the hierarchical Bayesian optimization algorithm (hBOA) with the nondominated sorting genetic algorithm (NSGAII) and clustering in the objective space. It is first argued that for good scalabilit ..."
Abstract

Cited by 11 (4 self)
This paper describes a scalable algorithm for solving multiobjective decomposable problems by combining the hierarchical Bayesian optimization algorithm (hBOA) with the nondominated sorting genetic algorithm II (NSGA-II) and clustering in the objective space. It is first argued that for good scalability, clustering or some other form of niching in the objective space is necessary, and the size of each niche should be approximately equal. The multiobjective hBOA (mohBOA), which combines hBOA, NSGA-II, and clustering in the objective space, is then described. The algorithm mohBOA differs from the multiobjective variants of BOA and hBOA proposed in the past by including clustering in the objective space and allocating an approximately equal-sized portion of the population to each cluster. The algorithm mohBOA is shown to scale up well on a number of problems on which standard multiobjective evolutionary algorithms perform poorly.
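The niching idea above — cluster the population in objective space, then give each cluster an approximately equal share of the selected set — can be sketched with a toy k-means. The cluster count, share size, and farthest-point initialization are illustrative assumptions, not the paper's settings.

```python
import random

def dist2(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q))

# Toy k-means in objective space with deterministic farthest-point init.
def kmeans_labels(points, k, iters=20):
    centers = [points[0]]
    while len(centers) < k:
        centers.append(max(points,
                           key=lambda p: min(dist2(p, c) for c in centers)))
    for _ in range(iters):
        labels = [min(range(k), key=lambda c: dist2(pt, centers[c]))
                  for pt in points]
        for c in range(k):
            members = [pt for pt, l in zip(points, labels) if l == c]
            if members:
                centers[c] = tuple(sum(d) / len(members)
                                   for d in zip(*members))
    return labels

# Allocate an equal-sized portion of the selected set to each cluster.
def equal_allocation(points, k, share, rng):
    labels = kmeans_labels(points, k)
    chosen = []
    for c in range(k):
        members = [i for i, l in enumerate(labels) if l == c]
        rng.shuffle(members)
        chosen.extend(members[:share])   # equal share per niche
    return chosen

rng = random.Random(0)
# two well-separated blobs standing in for two regions of a 2-D Pareto front
pts = ([(rng.gauss(0.0, 0.1), rng.gauss(5.0, 0.1)) for _ in range(30)]
       + [(rng.gauss(5.0, 0.1), rng.gauss(0.0, 0.1)) for _ in range(30)])
chosen = equal_allocation(pts, k=2, share=10, rng=rng)
```

Without the equal shares, selection pressure tends to concentrate the population in one region of the Pareto front, which is the scalability problem the paper argues against.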