Results 11–20 of 87
Model-based search for combinatorial optimization
, 2001
Cited by 46 (13 self)
Abstract: In this paper we introduce model-based search as a unifying framework accommodating some recently proposed heuristics for combinatorial optimization, such as ant colony optimization, stochastic gradient ascent, cross-entropy, and estimation of distribution methods. We discuss similarities as well as distinctive features of each method, propose some extensions, and present a comparative experimental study of these algorithms.
Optimization by learning and simulation of Bayesian and Gaussian networks
, 1999
Cited by 43 (6 self)
Abstract: Estimation of Distribution Algorithms (EDAs) are an example of stochastic heuristics based on populations of individuals, each of which encodes a possible solution to the optimization problem. These populations evolve in successive generations as the search progresses, organized in the same way as in most evolutionary computation heuristics. In contrast to most evolutionary computation paradigms, which consider the crossover and mutation operators essential tools for generating new populations, EDAs replace those operators with the estimation and simulation of the joint probability distribution of the selected individuals. In this work, after reviewing the different EDA approaches for problems of combinatorial optimization as well as for problems of optimization in continuous domains, we propose new approaches based on the theory of probabilistic graphical models to solve problems in both domains. More precisely, we propose to adapt algorit...
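The estimate-then-sample loop this abstract describes can be sketched as a minimal univariate EDA (UMDA-style, with independent bit marginals) on the OneMax problem. This is an illustrative sketch, not the thesis's algorithm; all names and parameters here are invented:

```python
import random

def umda_onemax(n_bits=20, pop_size=100, n_select=50, generations=30, seed=0):
    """Minimal univariate EDA: estimate independent bit marginals from the
    selected individuals, then sample the next population from them."""
    rng = random.Random(seed)
    p = [0.5] * n_bits            # start from the uniform distribution
    best = [0] * n_bits
    for _ in range(generations):
        # Simulation step: sample a population from the current model.
        pop = [[1 if rng.random() < p[i] else 0 for i in range(n_bits)]
               for _ in range(pop_size)]
        # Selection step: keep the fittest half (OneMax = number of ones).
        pop.sort(key=sum, reverse=True)
        selected = pop[:n_select]
        if sum(selected[0]) > sum(best):
            best = selected[0]
        # Estimation step: re-fit the marginals from the selected
        # individuals; this replaces crossover and mutation entirely.
        p = [sum(ind[i] for ind in selected) / n_select for i in range(n_bits)]
    return best

best = umda_onemax()
```

The joint distribution here factorizes fully (no dependencies); the Bayesian-network-based EDAs surveyed below replace this product of marginals with richer factorizations.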
Feature Subset Selection by Bayesian networks: a comparison with genetic and sequential algorithms
Cited by 43 (15 self)
Abstract: In this paper we compare FSS-EBNA, a randomized, population-based, evolutionary algorithm, with two genetic and two sequential search approaches on the well-known Feature Subset Selection (FSS) problem. In FSS-EBNA, the FSS problem, stated as a search problem, uses the EBNA (Estimation of Bayesian Network Algorithm) search engine, an algorithm within the EDA (Estimation of Distribution Algorithm) approach. The EDA paradigm was born from the roots of the GA community, with the aim of explicitly discovering the relationships among the features of the problem rather than disrupting them with genetic recombination operators. The EDA paradigm avoids the use of recombination operators; it guarantees the evolution of the population of solutions and the discovery of these relationships through the factorization of the probability distribution of the best individuals in each generation of the search. In EBNA, this factorization is carried out by a Bayesian network induced by a chea...
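The "FSS stated as a search problem" framing can be illustrated with a toy sketch: a feature subset is a bit mask, and an EDA searches over masks. The scoring function below (rewarding a known-relevant set, penalizing extras) is a stand-in for the wrapper accuracy estimate a real FSS system would use, and the univariate model is a simplification of EBNA's Bayesian network; everything here is hypothetical:

```python
import random

def fss_by_eda(relevant, n_features=12, pop_size=60, n_select=20,
               generations=25, seed=0):
    """Feature subset selection as a search over bit masks, driven by a
    univariate EDA.  `score` is a toy surrogate for classifier accuracy."""
    rng = random.Random(seed)

    def score(mask):
        # Reward relevant features, penalize irrelevant ones.
        return sum(2 if i in relevant else -1
                   for i, bit in enumerate(mask) if bit)

    p = [0.5] * n_features
    for _ in range(generations):
        pop = [[1 if rng.random() < p[i] else 0 for i in range(n_features)]
               for _ in range(pop_size)]
        pop.sort(key=score, reverse=True)
        selected = pop[:n_select]
        p = [sum(m[i] for m in selected) / n_select for i in range(n_features)]
    # Report the features the model has become confident about.
    return [i for i, q in enumerate(p) if q > 0.5]

chosen = fss_by_eda(relevant={0, 3, 7})
```

In a genuine wrapper approach, `score` would train and validate a classifier on the masked feature set, which is why cheap model induction (the "chea..." cut off above) matters.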
Scalability Problems of Simple Genetic Algorithms
 Evolutionary Computation
, 1999
Cited by 38 (4 self)
Abstract: Scalable evolutionary computation has become an intensively studied research topic in recent years. The issue of scalability is predominant in any field of algorithmic design, but it became particularly relevant for the design of competent genetic algorithms once the scalability problems of simple genetic algorithms were understood. Here we present some of the work that has helped provide clear insight into the scalability problems of simple genetic algorithms. In particular, we discuss the important issue of building-block mixing. We show how the need for mixing places a boundary in the GA parameter space that, together with the boundary from the schema theorem, delimits the region where the GA converges reliably to the optimum in problems of bounded difficulty. This region shrinks rapidly with increasing problem size unless the building blocks are tightly linked in the problem coding structure. In addition, we look at how straightforward extensions of the simple genetic a...
Bayesian Optimization Algorithm, Decision Graphs, and Occam's Razor
Proceedings of the Genetic and Evolutionary Computation Conference (GECCO-2001), 519–526. Also IlliGAL
, 2001
Cited by 37 (20 self)
Abstract: This paper discusses the use of various scoring metrics in the Bayesian optimization algorithm (BOA), which uses Bayesian networks to model promising solutions and generate new ones. The use of decision graphs in Bayesian networks to improve the performance of the BOA is proposed. To favor simple models, a complexity measure is incorporated into the Bayesian-Dirichlet metric for Bayesian networks with decision graphs. The presented modifications are compared on a number of interesting problems.
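The idea of a complexity-penalized scoring metric can be sketched with a BIC-style score, which trades log-likelihood against parameter count in the same spirit as the penalized Bayesian-Dirichlet metric the paper discusses (the BD metric itself uses Dirichlet priors; this substitution, and all names here, are illustrative):

```python
import math
import random
from collections import Counter

def bic_score(data, child, parents):
    """BIC-style penalized score for one binary variable given a candidate
    parent set: fitted log-likelihood minus a complexity penalty."""
    n = len(data)
    counts = Counter((tuple(row[p] for p in parents), row[child])
                     for row in data)
    parent_counts = Counter(tuple(row[p] for p in parents) for row in data)
    # Maximum-likelihood log-likelihood of the child given its parents.
    loglik = sum(c * math.log(c / parent_counts[cfg])
                 for (cfg, _), c in counts.items())
    n_params = 2 ** len(parents)   # one Bernoulli parameter per parent config
    return loglik - 0.5 * math.log(n) * n_params

# Data where x1 copies x0: the dependent structure should score higher.
rng = random.Random(0)
data = [(x, x) for x in (rng.randrange(2) for _ in range(200))]
better = bic_score(data, child=1, parents=[0])
worse = bic_score(data, child=1, parents=[])
```

A structure search (greedy edge addition, say) would use such a score to decide which dependencies are worth their parameter cost; decision graphs refine this by letting the penalty grow with the number of distinct leaves rather than with all 2^k parent configurations.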
Fast Probabilistic Modeling for Combinatorial Optimization
AAAI-98
, 1998
Cited by 33 (1 self)
Abstract: Probabilistic models have recently been utilized for the optimization of large combinatorial search problems. However, complex probabilistic models that attempt to capture inter-parameter dependencies can have prohibitive computational costs. The algorithm presented in this paper, termed COMIT, provides a method for using probabilistic models in conjunction with fast search techniques. We show how COMIT can be used with two very different fast search algorithms: hill-climbing and population-based incremental learning (PBIL). The resulting algorithms maintain many of the benefits of probabilistic modeling, with far less computational expense. Extensive empirical results are provided; COMIT has been successfully applied to job-shop scheduling, traveling salesman, and knapsack problems. This paper also presents a review of probabilistic modeling for combinatorial optimization.
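The combination of a probabilistic model with a fast search technique can be sketched as model-seeded hill-climbing: sample starting points from the model, then improve each by local search. Note that COMIT proper uses a tree-structured (dependency-tree) model; the independent marginals and all names below are simplifying assumptions for illustration:

```python
import random

def model_seeded_hillclimb(fitness, marginals, n_restarts=10, steps=200, seed=0):
    """Sample start points from a probabilistic model, then hill-climb each
    with single-bit flips, keeping only improving moves."""
    rng = random.Random(seed)
    n_bits = len(marginals)
    best, best_f = None, float("-inf")
    for _ in range(n_restarts):
        # Model step: draw a starting point from the learned distribution.
        x = [1 if rng.random() < q else 0 for q in marginals]
        f = fitness(x)
        # Fast-search step: greedy single-bit-flip hill-climbing.
        for _ in range(steps):
            i = rng.randrange(n_bits)
            x[i] ^= 1
            f_new = fitness(x)
            if f_new > f:
                f = f_new          # keep the improving move
            else:
                x[i] ^= 1          # undo the worsening move
        if f > best_f:
            best, best_f = x[:], f
    return best, best_f

# OneMax with marginals mildly biased toward ones:
best, best_f = model_seeded_hillclimb(sum, [0.7] * 16)
```

The division of labor is the point: the model only has to propose good basins, while the cheap local search does the fine-grained optimization.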
Expanding From Discrete To Continuous Estimation Of Distribution Algorithms: The IDEA
In Parallel Problem Solving From Nature – PPSN VI
, 2000
Cited by 30 (7 self)
Abstract: The direct application of statistics to stochastic optimization based on iterated density estimation has become more important and more prevalent in evolutionary computation over the last few years. The estimation of densities over selected samples, and the sampling from the resulting distributions, is a combination of the recombination and mutation steps used in evolutionary algorithms. We introduce the framework named IDEA to formalize this notion. By combining continuous probability theory with techniques from existing algorithms, this framework allows us to define new continuous evolutionary optimization algorithms.
1 Introduction
Algorithms in evolutionary optimization guide their search through statistics based on a vector of samples, often called a population. By using this stochastic information, non-deterministic induction is performed in order to attempt to exploit the structure of the search space and thereby aid the search for the optimal solution. In order to perform induct...
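The iterated density estimation idea carries over to continuous spaces by fitting a density to the selected samples and resampling from it each generation. The sketch below uses an axis-aligned Gaussian (independent normals) as the density estimator; IDEA admits far more general estimators, and the names and parameters here are invented for illustration:

```python
import random
import statistics

def gaussian_eda(f, dim=2, pop_size=60, n_select=20, generations=40, seed=1):
    """Continuous EDA: each generation, fit an axis-aligned Gaussian to the
    selected samples and draw the next population from it (minimization)."""
    rng = random.Random(seed)
    mu = [3.0] * dim              # initial search distribution
    sigma = [2.0] * dim
    best_f = float("inf")
    for _ in range(generations):
        pop = [[rng.gauss(mu[i], sigma[i]) for i in range(dim)]
               for _ in range(pop_size)]
        pop.sort(key=f)
        selected = pop[:n_select]
        best_f = min(best_f, f(selected[0]))
        # Density estimation step: refit mean and spread per coordinate.
        for i in range(dim):
            vals = [x[i] for x in selected]
            mu[i] = statistics.fmean(vals)
            sigma[i] = max(statistics.pstdev(vals), 1e-6)  # avoid collapse
    return mu, sigma, best_f

# Minimize the sphere function; the model contracts around good regions.
mu, sigma, best_f = gaussian_eda(lambda x: sum(v * v for v in x))
```

Estimation plus sampling here plays exactly the combined role of recombination and mutation that the abstract describes: new points are drawn from, and only from, the fitted density.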
Real-valued Evolutionary Optimization using a Flexible Probability Density Estimator
 Proceedings of the GECCO 1999 Genetic and Evolutionary Computation Conference
, 1999
Cited by 27 (1 self)
Abstract: Population-Based Incremental Learning (PBIL) is an abstraction of a genetic algorithm which solves optimization problems by explicitly constructing a probabilistic model of the promising regions of the search space. At each iteration the model is used to generate a population of candidate solutions and is itself modified in response to these solutions. Through the extension of PBIL to real-valued search spaces, a more powerful and general algorithmic framework arises which enables the use of arbitrary probability density estimation techniques in evolutionary optimization. To illustrate the usefulness of the framework, we propose and implement an evolutionary algorithm which uses a finite adaptive Gaussian mixture model density estimator. This method offers considerable power and flexibility in the forms of density which can be effectively modeled. We discuss the general applicability of the framework, and suggest that future work should lead to the developmen...
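The generate-then-modify cycle of the original binary PBIL is compact enough to sketch in full: a probability vector is sampled to produce a population, and the vector is then nudged toward the best sample by a learning rate. The problem (OneMax) and parameter values are illustrative choices, not from the paper:

```python
import random

def pbil_onemax(n_bits=16, pop_size=40, lr=0.1, generations=60, seed=0):
    """Classic binary PBIL: maintain a probability vector over bits, sample
    a population from it, and shift the vector toward the best sample."""
    rng = random.Random(seed)
    p = [0.5] * n_bits
    for _ in range(generations):
        pop = [[1 if rng.random() < p[i] else 0 for i in range(n_bits)]
               for _ in range(pop_size)]
        best = max(pop, key=sum)   # OneMax fitness: number of ones
        # The learning-rate update is the whole model-modification step.
        p = [(1 - lr) * p[i] + lr * best[i] for i in range(n_bits)]
    return p

p = pbil_onemax()
```

The real-valued extension described above replaces this vector of independent Bernoulli parameters with an arbitrary density estimator, such as the adaptive Gaussian mixture the authors implement, while keeping the same sample-and-update cycle.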
An Algorithmic Framework For Density Estimation Based Evolutionary Algorithms
, 1999
Cited by 24 (5 self)
Abstract: The direct application of statistics to stochastic optimization in evolutionary computation has become more important and more prevalent over the last few years. With the introduction of the notion of the Estimation of Distribution Algorithm (EDA), a new line of research has been named. The application area so far has mostly been the same as for classic genetic algorithms, namely binary vector-encoded problems. The most important aspect of the new algorithms is the part where probability densities are estimated. In probability theory, a distinction is made between discrete and continuous distributions and methods. Using the rationale for density-estimation-based evolutionary algorithms, we present an algorithmic framework for them, named IDEA. This allows us to define such algorithms for vectors of both continuous and discrete random variables, combining techniques from existing EDAs as well as density estimation theory. The emphasis is on techniques for vectors of continuous random variables, for which we present new algorithms in the field of density-estimation-based evolutionary algorithms, using two different density estimation models.
Grammar Model-based Program Evolution
 In Proceedings of the 2004 IEEE Congress on Evolutionary Computation
, 2004
Cited by 22 (1 self)
Abstract: In evolutionary computation, genetic operators such as mutation and crossover are employed to perturb individuals to generate the next population. However, these fixed, problem-independent genetic operators may destroy sub-solutions, usually called building blocks, instead of discovering and preserving them. One way to overcome this problem is to build a model based on the good individuals, and sample this model to obtain the next population. There is a wide range of such work in Genetic Algorithms