Results 1–10 of 20
An introduction and survey of estimation of distribution algorithms
Swarm and Evolutionary Computation, 2011
"... ..."
(Show Context)
Parameterless hierarchical BOA
Proceedings of the Genetic and Evolutionary Computation Conference (GECCO-2004), Part II, LNCS 3103, 2004
"... An automated technique has recently been proposed to transfer learning in the hierarchical Bayesian optimization algorithm (hBOA) based on distancebased statistics. The technique enables practitioners to improve hBOA efficiency by collecting statistics from probabilistic models obtained in previous ..."
Abstract

Cited by 16 (1 self)
An automated technique has recently been proposed to transfer learning in the hierarchical Bayesian optimization algorithm (hBOA) based on distance-based statistics. The technique enables practitioners to improve hBOA efficiency by collecting statistics from probabilistic models obtained in previous hBOA runs and using the collected statistics to bias future hBOA runs on similar problems. The purpose of this paper is threefold: (1) test the technique on several classes of NP-complete problems, including MAX-SAT, spin glasses and minimum vertex cover; (2) demonstrate that the technique is effective even when previous runs were done on problems of different size; (3) provide empirical evidence that combining transfer learning with other efficiency enhancement techniques can often provide nearly multiplicative speedups.
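The distance-based bias this abstract describes can be sketched in a few lines: tally how often variable pairs at each distance were connected in models from earlier runs, then turn those tallies into a log-prior that nudges model building in later runs. All names, the Laplace smoothing, and the `kappa` weight below are illustrative assumptions, not the paper's actual implementation.

```python
import math
from collections import defaultdict

def collect_distance_stats(previous_models, distance):
    """Tally how often variable pairs at each distance were connected in
    Bayesian networks from previous runs. `previous_models` is a list of
    edge sets and `distance` maps a variable pair to a problem-specific
    distance; both representations are hypothetical here. Variables are
    inferred from the edges themselves, so isolated variables are ignored."""
    connected = defaultdict(int)
    total = defaultdict(int)
    for edges in previous_models:
        variables = sorted({v for e in edges for v in e})
        for i in variables:
            for j in variables:
                if i < j:
                    d = distance(i, j)
                    total[d] += 1
                    if (i, j) in edges or (j, i) in edges:
                        connected[d] += 1
    # Laplace-smoothed probability that variables at distance d interact.
    return {d: (connected[d] + 1) / (total[d] + 2) for d in total}

def biased_edge_prior(stats, i, j, distance, kappa=1.0):
    """Log-prior term for edge (i, j); kappa controls the bias strength
    (a free parameter in this sketch)."""
    p = stats.get(distance(i, j), 0.5)
    return kappa * math.log(p)
```

In an hBOA-style run, a term like `biased_edge_prior` would be added to the structure-scoring gain of each candidate edge, favoring edges between pairs that were frequently connected at the same distance in earlier runs.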
Model Accuracy in the Bayesian Optimization Algorithm, 2010
"... Evolutionary algorithms (EAs) are particularly suited to solve problems for which there is not much information available. From this standpoint, estimation of distribution algorithms (EDAs), which guide the search by using probabilistic models of the population, have brought a new view to evolutiona ..."
Abstract

Cited by 7 (4 self)
Evolutionary algorithms (EAs) are particularly suited to solving problems for which little information is available. From this standpoint, estimation of distribution algorithms (EDAs), which guide the search by using probabilistic models of the population, have brought a new view to evolutionary computation. While solving a given problem with an EDA, the user has access to a set of models that reveal probabilistic dependencies between variables, an important source of information about the problem. However, as the complexity of the models used increases, so does the chance of overfitting and, consequently, of reduced model interpretability. This paper investigates the relationship between the probabilistic models learned by the Bayesian optimization algorithm (BOA) and the underlying problem structure. The purpose of the paper is threefold. First, model building in BOA is analyzed to understand how the problem structure is learned. Second, it is shown how the selection operator can lead to model overfitting in Bayesian EDAs. Third, the scoring metric that guides the search for an adequate model structure is modified to take into account the nonuniform distribution of the mating pool generated by tournament selection. Overall, this paper makes a contribution towards ...
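One way to see why tournament selection distorts structure scoring: the mating pool contains duplicate copies of strong individuals, so a metric that treats it as independent samples overstates the evidence for dependencies. The sketch below illustrates this with a plain BIC-style score and a crude distinct-genotype correction; the paper's actual metric modification is different and more principled.

```python
import math

def bic_score(log_likelihood, num_params, sample_size):
    """Generic BIC-style structure score: model fit minus a complexity
    penalty that grows with the (assumed independent) sample size."""
    return log_likelihood - 0.5 * num_params * math.log(sample_size)

def effective_sample_size(mating_pool):
    """Tournament selection copies strong individuals several times, so
    the mating pool holds fewer independent samples than its raw size;
    counting distinct genotypes is one crude correction."""
    return len({tuple(ind) for ind in mating_pool})

# Toy mating pool in which one genotype was selected three times.
pool = [(0, 1, 1), (0, 1, 1), (1, 1, 0), (0, 1, 1)]
naive = bic_score(-10.0, 4, len(pool))                      # uses n = 4
corrected = bic_score(-10.0, 4, effective_sample_size(pool))  # uses n = 2
```

With duplicates collapsed, the log(n) factor in the penalty shrinks, showing how sensitive the score is to whether pool members are treated as independent evidence.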
Intelligent Bias of Network Structures in the Hierarchical BOA, 2009
"... One of the primary advantages of estimation of distribution algorithms (EDAs) over many other stochastic optimization techniques is that they supply us with a roadmap of how they solve a problem. This roadmap consists of a sequence of probabilistic models of candidate solutions of increasing quality ..."
Abstract

Cited by 6 (4 self)
One of the primary advantages of estimation of distribution algorithms (EDAs) over many other stochastic optimization techniques is that they supply us with a roadmap of how they solve a problem. This roadmap consists of a sequence of probabilistic models of candidate solutions of increasing quality. The first model in this sequence would typically encode the uniform distribution over all admissible solutions whereas the last model would encode a distribution that generates at least one global optimum with high probability. It has been argued that exploiting this knowledge should improve EDA performance when solving similar problems. This paper presents an approach to bias the building of Bayesian network models in the hierarchical Bayesian optimization algorithm (hBOA) using information gathered from models generated during previous hBOA runs on similar problems. The approach is evaluated on trap5 and 2D spin glass problems.
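The kind of structural bias described here can be illustrated by mining edge frequencies from the models of previous runs and adding a frequency-proportional bonus to the score gain of each candidate edge. The representation of models as edge sets and the `strength` parameter are assumptions made for this sketch, not details from the paper.

```python
from collections import Counter

def edge_frequencies(previous_models):
    """Fraction of previous runs in which each directed edge appeared.
    `previous_models` is a list of edge sets from earlier hBOA-style
    runs (a hypothetical representation)."""
    counts = Counter(e for edges in previous_models for e in edges)
    n = len(previous_models)
    return {e: c / n for e, c in counts.items()}

def prior_weighted_gain(base_gain, edge, freq, strength=2.0):
    """Score gain for adding `edge`, plus a bonus proportional to how
    often it was seen before; `strength` trades off prior knowledge
    against current-run data."""
    return base_gain + strength * freq.get(edge, 0.0)
```

During greedy network construction, edges that recurred across previous runs would then be accepted earlier than edges never seen before, which is the intended biasing effect.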
M.: Distance-based bias in model-directed optimization of additively decomposable problems, 2012
"... ar ..."
(Show Context)
Learn from the Past: Improving Model-Directed . . . Distance-based Bias, 2012
"... For many optimization problems it is possible to define a problemspecific distance metric over decision variables that correlates with the strength of interactions between the variables. Examples of such problems include additively decomposable functions, facility location problems, and atomic clus ..."
Abstract

Cited by 1 (0 self)
For many optimization problems it is possible to define a problem-specific distance metric over decision variables that correlates with the strength of interactions between the variables. Examples of such problems include additively decomposable functions, facility location problems, and atomic cluster optimization. However, using such a metric to enhance the efficiency of optimization techniques is often not straightforward. This paper describes a framework that allows optimization practitioners to improve the efficiency of model-directed optimization techniques by combining such a distance metric with information mined from previous optimization runs on similar problems. The framework is demonstrated and empirically evaluated in the context of the hierarchical Bayesian optimization algorithm (hBOA). Experimental results provide strong empirical evidence that the proposed approach yields significant speedups and that it can be effectively combined with other efficiency enhancements. By presenting several examples, the paper also demonstrates how straightforward it is to adapt the proposed framework to other model-directed optimization techniques.
Open Archive Toulouse Archive Ouverte (OATAO), 2012. DOI: 10.1016/j.engappai.2010.01.019
"... OATAO is an open access repository that collects the work of Toulouse researchers and makes it freely available over the web where possible. ..."
Abstract
OATAO is an open access repository that collects the work of Toulouse researchers and makes it freely available over the web where possible.
Combining Search Space Diagnostics and . . .
"... Abstract—Stochastic optimisers such as Evolutionary Algorithms outperform random search due to their ability to exploit gradients in the search landscape, formed by the algorithm’s search operators in combination with the objective function. Research into the suitability of algorithmic approaches to ..."
Abstract
Stochastic optimisers such as Evolutionary Algorithms outperform random search because they can exploit gradients in the search landscape, formed by the algorithm's search operators in combination with the objective function. Research into the suitability of algorithmic approaches to problems has been made more tangible by the direct study and characterisation of the underlying fitness landscapes, for which authors have devised metrics such as the autocorrelation length. In this work, we contribute the Predictive Diagnostic Optimisation method, a new local-search-based algorithm that provides knowledge about the search space while it searches for the global optimum of a problem. It is a contribution to a less-researched area that may be named Diagnostic Optimisation.
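One standard search-space diagnostic mentioned above, the autocorrelation length, can be estimated from the fitness series of a random walk; a rough sketch follows (this is not the paper's Predictive Diagnostic Optimisation method itself). The `fitness` and `neighbor` callables are problem-specific placeholders.

```python
import random

def random_walk_fitness(fitness, neighbor, start, steps, rng=random):
    """Fitness series along a random walk: repeatedly move to a random
    neighbour and record fitness. `fitness` and `neighbor` are
    problem-specific callables supplied by the caller."""
    x, series = start, []
    for _ in range(steps):
        series.append(fitness(x))
        x = neighbor(x, rng)
    return series

def autocorrelation(series, lag=1):
    """Lag-k autocorrelation of the fitness series. For a roughly
    AR(1)-like landscape, the autocorrelation length can then be
    estimated as -1 / ln(rho(1))."""
    n = len(series)
    mean = sum(series) / n
    var = sum((s - mean) ** 2 for s in series) / n
    if var == 0:
        return 1.0  # flat series: perfectly correlated
    cov = sum((series[t] - mean) * (series[t + lag] - mean)
              for t in range(n - lag)) / (n - lag)
    return cov / var
```

For example, on a onemax-style bitstring landscape, `neighbor` would flip one randomly chosen bit and `fitness` would sum the bits; a slowly decaying `autocorrelation` then signals a smooth landscape that local search can exploit.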