Results 1–10 of 49
Escaping Hierarchical Traps with Competent Genetic Algorithms
Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2001), 2001
Abstract

Cited by 101 (49 self)
To solve hierarchical problems, one must be able to learn the linkage, represent partial solutions efficiently, and assure effective niching. We propose the hierarchical ...
Bayesian Optimization Algorithm: From Single Level to Hierarchy
2002
Abstract

Cited by 101 (19 self)
There are four primary goals of this dissertation. First, design a competent optimization algorithm capable of learning and exploiting appropriate problem decomposition by sampling and evaluating candidate solutions. Second, extend the proposed algorithm to enable the use of hierarchical decomposition as opposed to decomposition on only a single level. Third, design a class of difficult hierarchical problems that can be used to test the algorithms that attempt to exploit hierarchical decomposition. Fourth, test the developed algorithms on the designed class of problems and several real-world applications. The dissertation proposes the Bayesian optimization algorithm (BOA), which uses Bayesian networks to model the promising solutions found so far and sample new candidate solutions. BOA is theoretically and empirically shown to be capable of both learning a proper decomposition of the problem and exploiting the learned decomposition to ensure robust and scalable search for the optimum across a wide range of problems. The dissertation then identifies important features that must be incorporated into the basic BOA to solve problems that are not decomposable on a single level, but that can still be solved by decomposition over multiple levels of difficulty. Hierarchical
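The select–model–sample loop that BOA builds on can be illustrated with a minimal sketch. Here a univariate marginal model (UMDA-style per-bit probabilities) stands in for BOA's Bayesian network, so this shows the overall estimation-of-distribution loop rather than BOA's actual model building; all names and parameters are illustrative.

```python
import random

def eda_optimize(fitness, n_bits=20, pop_size=100, generations=40, trunc=0.5):
    """Select -> model -> sample loop; the per-bit probabilities stand in
    for the Bayesian network that BOA would learn."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        selected = pop[: int(trunc * pop_size)]
        # Model building: estimate P(x_i = 1) from the selected solutions.
        probs = [sum(ind[i] for ind in selected) / len(selected)
                 for i in range(n_bits)]
        # Model sampling: draw the next population from the model.
        pop = [[1 if random.random() < p else 0 for p in probs]
               for _ in range(pop_size)]
        best = max(pop + [best], key=fitness)
    return best

onemax = sum  # count of ones; the optimum is n_bits
best = eda_optimize(onemax)
```

With a linkage-aware model (a Bayesian network instead of independent per-bit probabilities), the same loop becomes BOA and can cope with the deceptive, hierarchical problems the abstracts above discuss.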
Ant colony optimization for continuous domains
2008
Abstract

Cited by 72 (5 self)
In this paper we present an extension of ant colony optimization (ACO) to continuous domains. We show how ACO, which was initially developed to be a metaheuristic for combinatorial optimization, can be adapted to continuous optimization without any major conceptual change to its structure. We present the general idea, implementation, and results obtained. We compare the results with those reported in the literature for other continuous optimization methods: other ant-related approaches and other metaheuristics initially developed for combinatorial optimization and later adapted to handle the continuous case. We discuss how our extended ACO compares to those algorithms, and we present some analysis of its efficiency and robustness.
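The archive-based sampling idea behind continuous ACO can be sketched as follows. This is a simplified reading of the ACO_R scheme (a solution archive replaces the pheromone table, and new points are drawn from Gaussian kernels centred on archived solutions); the parameter names `q` and `xi` and all default values are illustrative, not the paper's exact settings.

```python
import math
import random

def aco_continuous(f, dim, bounds, archive_size=10, ants=5, iters=200, q=0.1, xi=0.85):
    """Solution archive plays the role of the pheromone table; new candidate
    points are sampled from Gaussians centred on archived solutions."""
    lo, hi = bounds
    k = archive_size
    archive = sorted(
        ([random.uniform(lo, hi) for _ in range(dim)] for _ in range(k)),
        key=f,
    )
    # Rank-based kernel weights: better-ranked solutions guide more ants.
    w = [math.exp(-(r ** 2) / (2.0 * (q * k) ** 2)) / (q * k * math.sqrt(2 * math.pi))
         for r in range(k)]
    total = sum(w)
    for _ in range(iters):
        new = []
        for _ in range(ants):
            # Pick one archived solution as the centre of the Gaussian kernel.
            r, j, acc = random.random() * total, 0, w[0]
            while acc < r:
                j += 1
                acc += w[j]
            guide = archive[j]
            x = []
            for d in range(dim):
                # Spread proportional to the average distance to other members.
                sigma = xi * sum(abs(s[d] - guide[d]) for s in archive) / (k - 1)
                x.append(min(hi, max(lo, random.gauss(guide[d], sigma))))
            new.append(x)
        # Keep the best archive_size solutions, as in rank-based archives.
        archive = sorted(archive + new, key=f)[:k]
    return archive[0]

sphere = lambda x: sum(v * v for v in x)
best = aco_continuous(sphere, dim=3, bounds=(-5.0, 5.0))
```

Because the kernel spread shrinks as the archive converges, the search automatically moves from exploration to fine-tuning, which is the "no major conceptual change" point the abstract makes.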
Hierarchical BOA Solves Ising Spin Glasses and MAXSAT
In Proc. of the Genetic and Evolutionary Computation Conference (GECCO 2003), number 2724 in LNCS, 2003
Abstract

Cited by 56 (19 self)
Theoretical and empirical evidence exists that the hierarchical Bayesian optimization algorithm (hBOA) can solve challenging hierarchical problems and anything easier. This paper applies hBOA to two important classes of real-world problems: Ising spin-glass systems and maximum satisfiability (MAXSAT). The paper shows how easy it is to apply hBOA to real-world optimization problems. The results indicate that hBOA is capable of solving enormously difficult problems that cannot be solved by other optimizers, while still providing competitive or better performance than problem-specific approaches on other problems. The results thus confirm that hBOA is a practical, robust, and scalable technique for solving challenging real-world problems.
Bayesian Optimization Algorithm, Decision Graphs, and Occam's Razor
Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2001), 519–526. Also IlliGAL, 2001
Abstract

Cited by 42 (23 self)
 Add to MetaCart
This paper discusses the use of various scoring metrics in the Bayesian optimization algorithm (BOA), which uses Bayesian networks to model promising solutions and generate new ones. The use of decision graphs in Bayesian networks to improve the performance of the BOA is proposed. To favor simple models, a complexity measure is incorporated into the Bayesian-Dirichlet metric for Bayesian networks with decision graphs. The presented modifications are compared on a number of interesting problems.
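The effect of incorporating a complexity measure into a scoring metric can be illustrated with a BIC-style penalty. Note this is only a stand-in for the paper's modified Bayesian-Dirichlet metric (the actual metric differs), and the log-likelihood values below are hypothetical numbers chosen for illustration.

```python
import math

def penalized_score(log_likelihood, n_params, n_samples):
    """Occam's-razor scoring: model fit minus a complexity penalty that
    grows with the number of model parameters (BIC-style stand-in)."""
    return log_likelihood - 0.5 * n_params * math.log(n_samples)

# Hypothetical fits: the complex model fits the data slightly better but
# pays a larger complexity penalty, so the simpler model scores higher.
simple = penalized_score(-70.0, n_params=1, n_samples=100)
complex_ = penalized_score(-68.5, n_params=5, n_samples=100)
```

A network-structure search that maximizes such a penalized score will only add edges or decision-graph splits whose gain in likelihood outweighs their complexity cost, which is the Occam's-razor behaviour the title refers to.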
Expanding From Discrete To Continuous Estimation Of Distribution Algorithms: The IDEA
In Parallel Problem Solving From Nature – PPSN VI, 2000
Abstract

Cited by 41 (9 self)
The direct application of statistics to stochastic optimization based on iterated density estimation has become more important and present in evolutionary computation over the last few years. The estimation of densities over selected samples and the sampling from the resulting distributions is a combination of the recombination and mutation steps used in evolutionary algorithms. We introduce the framework named IDEA to formalize this notion. By combining continuous probability theory with techniques from existing algorithms, this framework allows us to define new continuous evolutionary optimization algorithms. 1 Introduction Algorithms in evolutionary optimization guide their search through statistics based on a vector of samples, often called a population. By using this stochastic information, non-deterministic induction is performed in order to attempt to use the structure of the search space and thereby aid the search for the optimal solution. In order to perform induct...
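A factorized normal model is the simplest instance of this iterated density-estimation idea. The sketch below (all names and parameter values are illustrative) fits an independent Gaussian per variable to the selected samples and resamples the population from the fitted density, replacing recombination and mutation exactly as the abstract describes.

```python
import random
import statistics

def continuous_eda(f, dim, bounds, pop_size=200, generations=60, trunc=0.3):
    """Iterated density estimation with a factorized normal model:
    fit mean/std per variable on the selected samples, then resample."""
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=f)  # minimisation: best samples first
        selected = pop[: int(trunc * pop_size)]
        # Density estimation: one independent Gaussian per variable.
        mu = [statistics.fmean(s[d] for s in selected) for d in range(dim)]
        sd = [statistics.pstdev([s[d] for s in selected]) + 1e-12
              for d in range(dim)]
        # Sampling from the estimated density replaces recombination/mutation.
        pop = [[min(hi, max(lo, random.gauss(mu[d], sd[d]))) for d in range(dim)]
               for _ in range(pop_size)]
    return min(pop, key=f)

sphere = lambda x: sum(v * v for v in x)
best = continuous_eda(sphere, dim=2, bounds=(-5.0, 5.0))
```

Richer instances of the framework replace the factorized Gaussian with joint or conditional densities, which is where the factorization-selection machinery of the IDEA papers comes in.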
Parallel estimation of distribution algorithms
2002
Abstract

Cited by 26 (4 self)
The thesis deals with the new evolutionary paradigm based on the concept of Estimation of Distribution Algorithms (EDAs), which use a probabilistic model of the promising solutions found so far to obtain new candidate solutions to the optimized problem. There are six primary goals of this thesis: 1. Suggestion of a new formal description of the EDA algorithm. This high-level concept can be used to compare the generality of various probabilistic models by comparing the properties of the underlying mappings. Also, some convergence issues are discussed and theoretical ways for further improvements are proposed. 2. Development of a new probabilistic model and methods capable of dealing with continuous parameters. The resulting Mixed Bayesian Optimization Algorithm (MBOA) uses a set of decision trees to express the probability model. Its main advantage over the widely used IDEA and EGNA approaches is its backward compatibility with discrete domains, so it is uniquely capable of learning linkage between mixed continuous-discrete genes. MBOA handles the discretization of continuous parameters as an integral part of the learning process, which outperforms the histogram-based
Analyzing Probabilistic Models in Hierarchical BOA on Traps and Spin Glasses
Genetic and Evolutionary Computation Conference (GECCO 2007), I, 2007
Abstract

Cited by 25 (17 self)
The hierarchical Bayesian optimization algorithm (hBOA) can solve nearly decomposable and hierarchical problems of bounded difficulty in a robust and scalable manner by building and sampling probabilistic models of promising solutions. This paper analyzes probabilistic models in hBOA on two common test problems: concatenated traps and 2D Ising spin glasses with periodic boundary conditions. We argue that although Bayesian networks with local structures can encode complex probability distributions, analyzing these models in hBOA is relatively straightforward, and the results of such analyses may provide practitioners with useful information about their problems. The results show that the probabilistic models in hBOA closely correspond to the structure of the underlying problem, that the models do not change significantly in subsequent iterations of BOA, and that creating adequate probabilistic models by hand is not straightforward even with complete knowledge of the optimization problem.
Advancing Continuous IDEAs with Mixture Distributions and Factorization Selection Metrics
Proceedings of the Optimization by Building and Using Probabilistic Models (OBUPM) Workshop at the Genetic and Evolutionary Computation Conference GECCO–2001, 2001
Abstract

Cited by 23 (8 self)
Evolutionary optimization based on probabilistic models has so far been limited to the use of factorizations in the case of continuous representations. Furthermore, a maximum complexity parameter n was previously required to construct factorizations, to prevent unnecessary complexity from being introduced in the factorization. In this paper, we advance these techniques by using clustering and the EM algorithm to allow for mixture distributions.
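The clustering step can be sketched as follows. Plain k-means stands in here for the EM algorithm, and each cluster contributes one axis-aligned Gaussian component to the mixture; all names, seeds, and parameter values are illustrative.

```python
import random
import statistics

def fit_mixture(samples, k=2, iters=10):
    """Cluster the selected samples (k-means standing in for EM), then fit
    one axis-aligned Gaussian per cluster to form a mixture distribution."""
    dim = len(samples[0])
    centers = random.sample(samples, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for s in samples:
            nearest = min(range(k),
                          key=lambda c: sum((s[d] - centers[c][d]) ** 2
                                            for d in range(dim)))
            clusters[nearest].append(s)
        centers = [[statistics.fmean(s[d] for s in cl) for d in range(dim)]
                   if cl else centers[j]
                   for j, cl in enumerate(clusters)]
    components = []
    for cl in clusters:
        if len(cl) < 2:
            continue  # too few points to estimate a spread
        mu = [statistics.fmean(s[d] for s in cl) for d in range(dim)]
        sd = [statistics.pstdev([s[d] for s in cl]) + 1e-9 for d in range(dim)]
        components.append((len(cl) / len(samples), mu, sd))
    return components

def sample_mixture(components):
    """Draw one point: pick a component by weight, then sample its Gaussian."""
    r, acc = random.random(), 0.0
    for weight, mu, sd in components:
        acc += weight
        if r <= acc or (weight, mu, sd) == components[-1]:
            return [random.gauss(mu[d], sd[d]) for d in range(len(mu))]

random.seed(7)  # reproducible sketch
data = ([[random.gauss(0.0, 0.3), random.gauss(0.0, 0.3)] for _ in range(50)]
        + [[random.gauss(5.0, 0.3), random.gauss(5.0, 0.3)] for _ in range(50)])
model = fit_mixture(data, k=2)
point = sample_mixture(model)
```

A mixture like this can keep several promising regions alive at once, which a single factorized Gaussian cannot, and that is the motivation the abstract gives for moving beyond plain factorizations.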
Using previous models to bias structural learning in the hierarchical BOA
2008
Abstract

Cited by 20 (11 self)
Estimation of distribution algorithms (EDAs) are stochastic optimization techniques that explore the space of potential solutions by building and sampling explicit probabilistic models of promising candidate solutions. While the primary goal of applying EDAs is to discover the global optimum or at least its accurate approximation, besides this, any EDA provides us with a sequence of probabilistic models, which in most cases hold a great deal of information about the problem. Although using problem-specific knowledge has been shown to significantly improve the performance of EDAs and other evolutionary algorithms, this readily available source of problem-specific information has been practically ignored by the EDA community. This paper takes the first step towards using the probabilistic models obtained by EDAs to speed up the solution of similar problems in the future. More specifically, we propose two approaches to biasing model building in the hierarchical Bayesian optimization algorithm (hBOA) based on knowledge automatically learned from previous hBOA runs on similar problems. We show that the proposed methods lead to substantial speedups and argue that the methods should work well in other applications that require solving a large number of problems with similar structure.