Results 1 - 10 of 42
Hierarchical Bayesian Optimization Algorithm = Bayesian Optimization Algorithm + Niching + Local Structures
2001
Cited by 329 (70 self)
Abstract: The paper describes the hierarchical Bayesian optimization algorithm, which combines the Bayesian optimization algorithm, local structures in Bayesian networks, and a powerful niching technique. The proposed algorithm is able to solve hierarchical traps and other difficult problems very efficiently.
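The niching technique referred to here is usually described in the hBOA literature as restricted tournament replacement (RTR). The following is a minimal sketch of that idea, not the authors' implementation; binary genomes stored as lists and the window-size default are illustrative assumptions.

```python
import random

def rtr_insert(population, fitness, candidate, cand_fit, window=20):
    """Restricted tournament replacement: the new candidate competes only
    against the most similar of `window` randomly drawn individuals, which
    preserves multiple niches instead of letting one optimum take over.
    Assumes window <= len(population) and binary genomes as lists."""
    sample_idx = random.sample(range(len(population)), window)
    # Closest sampled individual by Hamming distance.
    nearest = min(sample_idx,
                  key=lambda i: sum(a != b for a, b in zip(population[i], candidate)))
    if cand_fit > fitness[nearest]:  # replace only if the candidate is fitter
        population[nearest] = candidate
        fitness[nearest] = cand_fit
```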
Escaping Hierarchical Traps with Competent Genetic Algorithms
Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2001), 2001
Cited by 101 (49 self)
Abstract: To solve hierarchical problems, one must be able to learn the linkage, represent partial solutions efficiently, and assure effective niching. We propose the hierarchical ...
Bayesian Optimization Algorithm: From Single Level to Hierarchy
2002
Cited by 101 (19 self)
Abstract: There are four primary goals of this dissertation. First, design a competent optimization algorithm capable of learning and exploiting appropriate problem decomposition by sampling and evaluating candidate solutions. Second, extend the proposed algorithm to enable the use of hierarchical decomposition as opposed to decomposition on only a single level. Third, design a class of difficult hierarchical problems that can be used to test the algorithms that attempt to exploit hierarchical decomposition. Fourth, test the developed algorithms on the designed class of problems and several real-world applications. The dissertation proposes the Bayesian optimization algorithm (BOA), which uses Bayesian networks to model the promising solutions found so far and sample new candidate solutions. BOA is theoretically and empirically shown to be capable of both learning a proper decomposition of the problem and exploiting the learned decomposition to ensure robust and scalable search for the optimum across a wide range of problems. The dissertation then identifies important features that must be incorporated into the basic BOA to solve problems that are not decomposable on a single level, but that can still be solved by decomposition over multiple levels of difficulty. Hierarchical ...
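The "difficult hierarchical problems" designed in this line of work are built from order-k deceptive trap functions. A minimal sketch of the single-level trap and its additively separable concatenation follows; the function names and the k = 5 default are illustrative, and the full hierarchical traps stack such blocks over several levels.

```python
def trap(block, k=5):
    """Order-k deceptive trap: the all-ones block scores best (k), but every
    other block rewards having fewer ones, which misleads local search."""
    u = sum(block)
    return k if u == k else k - 1 - u

def concatenated_traps(genome, k=5):
    """Additively separable test function: a sum of traps over disjoint
    k-bit blocks of the genome."""
    return sum(trap(genome[i:i + k], k) for i in range(0, len(genome), k))
```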
Ant colony optimization for continuous domains
2008
Cited by 72 (5 self)
Abstract: In this paper we present an extension of ant colony optimization (ACO) to continuous domains. We show how ACO, which was initially developed to be a metaheuristic for combinatorial optimization, can be adapted to continuous optimization without any major conceptual change to its structure. We present the general idea, implementation, and results obtained. We compare the results with those reported in the literature for other continuous optimization methods: other ant-related approaches and other metaheuristics initially developed for combinatorial optimization and later adapted to handle the continuous case. We discuss how our extended ACO compares to those algorithms, and we present some analysis of its efficiency and robustness.
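The key conceptual move is to replace the discrete pheromone table with a solution archive from which new real-valued solutions are sampled. A hedged sketch in that spirit follows; the parameters q and xi and the rank-based Gaussian weighting follow common descriptions of continuous ACO, but this is an illustration, not the paper's reference implementation.

```python
import math, random

def aco_r_sample(archive, q=0.1, xi=0.85):
    """Sample one new real-valued solution from a best-first-sorted archive:
    pick a guiding member by rank-based Gaussian weights, then sample each
    coordinate from a Gaussian centred on it. Assumes len(archive) >= 2."""
    k = len(archive)
    weights = [math.exp(-(rank ** 2) / (2 * (q * k) ** 2))
               / (q * k * math.sqrt(2 * math.pi)) for rank in range(k)]
    guide = archive[random.choices(range(k), weights=weights)[0]]
    new = []
    for i in range(len(guide)):
        # Spread proportional to the mean distance to the other archive members.
        sigma = xi * sum(abs(s[i] - guide[i]) for s in archive) / (k - 1)
        new.append(random.gauss(guide[i], sigma))
    return new
```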
Hierarchical BOA Solves Ising Spin Glasses and MAXSAT
In Proc. of the Genetic and Evolutionary Computation Conference (GECCO 2003), number 2724 in LNCS, 2003
Cited by 56 (19 self)
Abstract: Theoretical and empirical evidence exists that the hierarchical Bayesian optimization algorithm (hBOA) can solve challenging hierarchical problems and anything easier. This paper applies hBOA to two important classes of real-world problems: Ising spin-glass systems and maximum satisfiability (MAXSAT). The paper shows how easy it is to apply hBOA to real-world optimization problems. The results indicate that hBOA is capable of solving enormously difficult problems that cannot be solved by other optimizers and still provides competitive or better performance than problem-specific approaches on other problems. The results thus confirm that hBOA is a practical, robust, and scalable technique for solving challenging real-world problems.
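For reference, evaluating a candidate in the Ising spin-glass class is straightforward; a minimal sketch of the energy function follows, with the coupling-dictionary layout being an illustrative assumption.

```python
def ising_energy(spins, couplings):
    """Energy of a +/-1 spin configuration under pairwise couplings J[i, j];
    the optimization task is to find a configuration of minimum energy."""
    return -sum(J * spins[i] * spins[j] for (i, j), J in couplings.items())

# e.g. a 3-spin chain with couplings J01 = +1 and J12 = -1:
print(ising_energy([1, 1, -1], {(0, 1): 1.0, (1, 2): -1.0}))  # -> -2.0
```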
Probabilistic Model Building and Competent Genetic Programming
Genetic Programming Theory and Practice, Chapter 13, 2003
Cited by 47 (10 self)
Abstract: This paper describes a probabilistic model building genetic programming (PMBGP) approach developed on the basis of the extended compact genetic algorithm (eCGA). Unlike traditional genetic programming, which uses fixed recombination operators, the proposed PMBGP adapts linkages. The proposed algorithms ...
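The eCGA that this work builds on models the selected population with a marginal product model: disjoint groups of genes whose joint distributions are estimated and sampled independently. A minimal sketch of such sampling follows; the partition format and function name are illustrative assumptions, and the GP-specific extension is not shown.

```python
from collections import Counter
import random

def sample_from_mpm(partition, selected):
    """Marginal product model (as in eCGA): genes are grouped into disjoint
    blocks, e.g. partition = [[0, 1], [2], [3, 4]]; each block's joint
    distribution is estimated from the selected population and sampled
    independently to build one new individual."""
    child = [None] * len(selected[0])
    for block in partition:
        counts = Counter(tuple(ind[g] for g in block) for ind in selected)
        values, weights = zip(*counts.items())
        chosen = random.choices(values, weights=weights)[0]
        for g, v in zip(block, chosen):
            child[g] = v
    return child
```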
Fitness inheritance in the Bayesian optimization algorithm
2004
Cited by 33 (23 self)
Abstract: This paper describes how fitness inheritance can be used to estimate fitness for a proportion of newly sampled candidate solutions in the Bayesian optimization algorithm (BOA). The goal of estimating fitness for some candidate solutions is to reduce the number of fitness evaluations for problems where fitness evaluation is expensive. Bayesian networks used in BOA to model promising solutions and generate the new ones are extended to allow not only for modeling and sampling candidate solutions, but also for estimating their fitness. The results indicate that fitness inheritance is a promising concept in BOA, because population-sizing requirements for building appropriate models of promising solutions lead to good fitness estimates even if only a small proportion of candidate solutions is evaluated using the actual fitness function. This can lead to a reduction of the number of actual fitness evaluations by a factor of 30 or more.
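The core idea is to pay for the expensive fitness function on only a fraction of new candidates and to estimate the rest from the model. A minimal, hypothetical sketch follows; `estimate_fitness` stands in for the model-based estimator built from BOA's Bayesian network, whose details are in the paper.

```python
import random

def evaluate_with_inheritance(offspring, true_fitness, estimate_fitness, p_eval=0.2):
    """Evaluate only about a proportion p_eval of new candidates with the
    real (expensive) fitness function; estimate the rest from the model."""
    fitnesses = []
    for x in offspring:
        if random.random() < p_eval:
            fitnesses.append(true_fitness(x))      # expensive evaluation
        else:
            fitnesses.append(estimate_fitness(x))  # cheap model-based estimate
    return fitnesses
```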
Parallel estimation of distribution algorithms
2002
Cited by 26 (4 self)
Abstract: The thesis deals with the new evolutionary paradigm based on the concept of Estimation of Distribution Algorithms (EDAs), which use a probabilistic model of promising solutions found so far to obtain new candidate solutions of the optimized problem. There are six primary goals of this thesis: 1. Suggestion of a new formal description of the EDA algorithm. This high-level concept can be used to compare the generality of various probabilistic models by comparing the properties of the underlying mappings. Also, some convergence issues are discussed and theoretical ways for further improvements are proposed. 2. Development of a new probabilistic model and methods capable of dealing with continuous parameters. The resulting Mixed Bayesian Optimization Algorithm (MBOA) uses a set of decision trees to express the probability model. Its main advantage over the mostly used IDEA and EGNA approaches is its backward compatibility with discrete domains, so it is uniquely capable of learning linkage between mixed continuous-discrete genes. MBOA handles the discretization of continuous parameters as an integral part of the learning process, which outperforms the histogram-based ...
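A hedged sketch of the decision-tree idea follows: each variable is sampled by walking a tree whose internal nodes test already-sampled parent variables and whose leaves hold either a categorical distribution or Gaussian parameters. The node layout here is purely illustrative; MBOA's actual model building is described in the thesis.

```python
import random

def sample_variable(tree, assignment):
    """Hypothetical stand-in for an MBOA-style decision-tree model of one
    variable: walk the tree using the values of already-sampled variables,
    then draw from the distribution stored at the leaf (discrete or
    continuous), which is how mixed gene types coexist in one model."""
    node = tree
    while "split" in node:           # internal node: test a parent variable
        var, threshold = node["split"]
        node = node["low"] if assignment[var] <= threshold else node["high"]
    if node["kind"] == "discrete":   # categorical leaf
        values, weights = zip(*node["probs"].items())
        return random.choices(values, weights=weights)[0]
    return random.gauss(node["mean"], node["std"])  # Gaussian leaf
```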
Efficiency enhancement of genetic algorithms via building-block-wise fitness estimation
Proceedings of the IEEE International Conference on Evolutionary Computation, 2004
Cited by 23 (17 self)
Abstract: This paper studies fitness inheritance as an efficiency-enhancement technique for a class of competent genetic algorithms called estimation of distribution algorithms. Probabilistic models of important sub-solutions are developed to estimate the fitness of a proportion of individuals in the population, thereby avoiding computationally expensive function evaluations. The effects of fitness inheritance on the convergence time and population sizing are modeled, and the speed-up obtained through inheritance is predicted. The results show that a fitness-inheritance mechanism which utilizes information on building-block fitnesses provides significant efficiency enhancement. For additively separable problems, fitness inheritance reduces the number of function evaluations to about half and yields a speed-up of about 1.75-2.25.
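A minimal sketch of building-block-wise estimation follows: fitness is approximated as the sum of average observed contributions of the individual's sub-solutions. The table layout and names are illustrative assumptions; in practice the table would be filled from the part of the population evaluated with the real fitness function.

```python
def bb_fitness_estimate(individual, partition, bb_table):
    """Estimate fitness as the sum of average observed fitness contributions
    of the individual's building blocks, avoiding a real evaluation.
    bb_table[b] maps each value combination of block b (a tuple) to its
    average contribution, estimated from truly evaluated individuals."""
    return sum(bb_table[b][tuple(individual[g] for g in block)]
               for b, block in enumerate(partition))
```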
Advancing Continuous IDEAs with Mixture Distributions and Factorization Selection Metrics
Proceedings of the Optimization by Building and Using Probabilistic Models (OBUPM) Workshop at the Genetic and Evolutionary Computation Conference (GECCO-2001), 2001
Cited by 23 (8 self)
Abstract: Evolutionary optimization based on probabilistic models has so far been limited to the use of factorizations in the case of continuous representations. Furthermore, a maximum complexity parameter n was previously required to construct factorizations to prevent unnecessary complexity from being introduced in the factorization. In this paper, we advance these techniques by using clustering and the EM algorithm to allow for mixture distributions.
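A minimal sketch of the mixture idea follows, using plain k-means as a stand-in for the EM procedure described in the paper; one axis-aligned Gaussian is fit per cluster and new candidates are drawn from the resulting mixture. All names and the numpy-based layout are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_gaussian_mixture(selected, n_clusters=3, iters=20):
    """Cluster the selected solutions with k-means (a simple stand-in for
    full EM), then fit one axis-aligned Gaussian per non-empty cluster;
    sampling from this mixture replaces a single factorized model."""
    data = np.asarray(selected, dtype=float)
    centers = data[rng.choice(len(data), n_clusters, replace=False)]
    for _ in range(iters):
        labels = ((data[:, None, :] - centers[None, :, :]) ** 2).sum(-1).argmin(1)
        centers = np.array([data[labels == c].mean(0) if (labels == c).any()
                            else centers[c] for c in range(n_clusters)])
    return [((labels == c).mean(),             # mixing weight
             data[labels == c].mean(0),        # component mean
             data[labels == c].std(0) + 1e-9)  # component std, kept non-zero
            for c in range(n_clusters) if (labels == c).any()]

def sample_candidate(mixture):
    """Draw one new candidate: pick a component by weight, then sample."""
    weights = np.array([w for w, _, _ in mixture])
    _, mu, sigma = mixture[rng.choice(len(mixture), p=weights / weights.sum())]
    return rng.normal(mu, sigma)
```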