Results 1-10 of 17
Analyzing probabilistic models in hierarchical BOA on traps and spin glasses
 Genetic and Evolutionary Computation Conference (GECCO-2007)
, 2007
Abstract

Cited by 25 (17 self)
The hierarchical Bayesian optimization algorithm (hBOA) can solve nearly decomposable and hierarchical problems of bounded difficulty in a robust and scalable manner by building and sampling probabilistic models of promising solutions. This paper analyzes probabilistic models in hBOA on two common test problems: concatenated traps and 2D Ising spin glasses with periodic boundary conditions. We argue that although Bayesian networks with local structures can encode complex probability distributions, analyzing these models in hBOA is relatively straightforward and the results of such analyses may provide practitioners with useful information about their problems. The results show that the probabilistic models in hBOA closely correspond to the structure of the underlying problem, the models do not change significantly in subsequent iterations of BOA, and creating adequate probabilistic models by hand is not straightforward even with complete knowledge of the optimization problem.
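For reference, the concatenated trap test problem mentioned in the abstract can be sketched as follows. This is a minimal illustration assuming order-5 traps; the function names are ours, not the paper's:

```python
def trap5(bits):
    """Deceptive trap of order 5: the optimum is all ones,
    but fitness otherwise increases as the number of ones decreases."""
    u = sum(bits)
    return 5 if u == 5 else 4 - u

def concatenated_traps(x):
    """Sum of trap-5 values over consecutive, non-overlapping 5-bit blocks.
    Assumes len(x) is a multiple of 5."""
    return sum(trap5(x[i:i+5]) for i in range(0, len(x), 5))

# The global optimum is the all-ones string; all-zeros is the
# deceptive attractor that misleads bit-wise search.
assert concatenated_traps([1] * 10) == 10
assert concatenated_traps([0] * 10) == 8
```

Because each 5-bit block is deceptive, algorithms that model the variables independently are drawn toward all zeros, which is why trap functions are a standard test of whether a model-building algorithm recovers the block structure.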
Using previous models to bias structural learning in the hierarchical BOA
, 2008
Abstract

Cited by 20 (11 self)
Estimation of distribution algorithms (EDAs) are stochastic optimization techniques that explore the space of potential solutions by building and sampling explicit probabilistic models of promising candidate solutions. While the primary goal of applying EDAs is to discover the global optimum or at least its accurate approximation, besides this, any EDA provides us with a sequence of probabilistic models, which in most cases hold a great deal of information about the problem. Although using problem-specific knowledge has been shown to significantly improve the performance of EDAs and other evolutionary algorithms, this readily available source of problem-specific information has been practically ignored by the EDA community. This paper takes the first step towards using the probabilistic models obtained by EDAs to speed up the solution of similar problems in the future. More specifically, we propose two approaches to biasing model building in the hierarchical Bayesian optimization algorithm (hBOA) based on knowledge automatically learned from previous hBOA runs on similar problems. We show that the proposed methods lead to substantial speedups and argue that the methods should work well in other applications that require solving a large number of problems with similar structure.
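To make the "build and sample an explicit probabilistic model" loop concrete, here is a minimal sketch of a univariate EDA (UMDA) on the OneMax problem. hBOA replaces the per-bit frequency vector with a Bayesian network with local structures, so this is a deliberate simplification; all names and parameter values are illustrative:

```python
import random

def umda(fitness, n_bits, pop_size=100, n_select=50, n_gens=60, seed=1):
    """Minimal univariate EDA (UMDA): the 'model' is one frequency per bit,
    estimated from selected solutions and sampled to form the next population."""
    rng = random.Random(seed)
    p = [0.5] * n_bits                        # initial model: uniform
    for _ in range(n_gens):
        # Model sampling: draw each bit independently from its frequency.
        pop = [[1 if rng.random() < p[i] else 0 for i in range(n_bits)]
               for _ in range(pop_size)]
        pop.sort(key=fitness, reverse=True)   # truncation selection
        best = pop[:n_select]
        # Model building: marginal frequency of a 1 at each position,
        # clamped away from 0/1 so sampling can still escape fixation.
        p = [min(0.95, max(0.05, sum(x[i] for x in best) / n_select))
             for i in range(n_bits)]
    return max(pop, key=fitness)

# OneMax: fitness is the number of ones, optimum is the all-ones string.
best = umda(sum, 20)
```

The sequence of frequency vectors `p` is the simplest instance of the "sequence of probabilistic models" the abstract refers to; in hBOA the analogous sequence of learned network structures is what the proposed biasing methods mine.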
Parameterless hierarchical BOA
 Proceedings of the Genetic and Evolutionary Computation Conference (GECCO-2004), Part II, LNCS 3103
, 2004
Abstract

Cited by 16 (1 self)
An automated technique has recently been proposed to transfer learning in the hierarchical Bayesian optimization algorithm (hBOA) based on distance-based statistics. The technique enables practitioners to improve hBOA efficiency by collecting statistics from probabilistic models obtained in previous hBOA runs and using the obtained statistics to bias future hBOA runs on similar problems. The purpose of this paper is threefold: (1) test the technique on several classes of NP-complete problems, including MAXSAT, spin glasses, and minimum vertex cover; (2) demonstrate that the technique is effective even when previous runs were done on problems of different size; (3) provide empirical evidence that combining transfer learning with other efficiency enhancement techniques can often provide nearly multiplicative speedups.
Enhancing the Efficiency of the ECGA
 Proceedings of the X Parallel Problem Solving From Nature (PPSN-2008)
, 2008
Abstract

Cited by 5 (3 self)
Evolutionary Algorithms are widely used search and optimization procedures. They have been successfully applied to many problems, and with proper care in the design process they can solve hard problems accurately, efficiently, and reliably. The proper design of the algorithm turns some problems from intractable to tractable. We can go even further, using efficiency enhancements to turn them from tractable to practical. In this paper we show preliminary results of two efficiency enhancements proposed for the Extended Compact Genetic Algorithm (ECGA). First, a model-building enhancement was used to reduce the complexity of the process from O(n^3) to O(n^2), speeding up the algorithm by 1000 times on a 4096-bit problem. Then, a local-search hybridization was used to reduce the population size by at least 32 times, reducing the memory and running time required by the algorithm. These results draw the first steps toward a competent and efficient Genetic Algorithm.
Order or not: Does parallelization of model building in hBOA affect its scalability?
, 2006
Abstract

Cited by 2 (0 self)
It has been shown that model building in the hierarchical Bayesian optimization algorithm (hBOA) can be efficiently parallelized by randomly generating an ancestral ordering of the nodes of the network prior to learning the network structure and allowing only dependencies consistent with the generated ordering. However, it has not been thoroughly shown that this approach to restricting probabilistic models does not affect scalability of hBOA on important classes of problems. This paper demonstrates that although the use of a random ancestral ordering restricts the structure of considered models to allow efficient parallelization of model building, its effects on hBOA performance and scalability are negligible.
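The ordering restriction described above can be sketched as follows. This is an illustrative simplification, not the paper's implementation, and the function names are hypothetical:

```python
import random

def random_ancestral_ordering(n_nodes, seed=0):
    """Random permutation of node indices; position[v] is v's rank in it."""
    order = list(range(n_nodes))
    random.Random(seed).shuffle(order)
    position = {v: i for i, v in enumerate(order)}
    return order, position

def allowed_edges(n_nodes, position):
    """Only edges u -> v with position[u] < position[v] are considered.
    Any subset of these edges is acyclic by construction, so no cycle
    checking is needed and the candidate parent sets of different nodes
    can be scored independently (and hence in parallel)."""
    return [(u, v) for u in range(n_nodes) for v in range(n_nodes)
            if u != v and position[u] < position[v]]

order, pos = random_ancestral_ordering(5)
edges = allowed_edges(5, pos)
# Exactly half of all ordered pairs survive: n*(n-1)/2 = 10 of 20.
assert len(edges) == 10
```

The question the paper studies is whether discarding the other half of the candidate edges, as this restriction does, hurts the quality of the learned networks; its empirical answer is that the effect on hBOA scalability is negligible.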
ECGA vs. BOA in discovering stock market trading experts
 Genetic and Evolutionary Computation Conference (GECCO-2007)
, 2007
Abstract

Cited by 2 (0 self)
This paper presents two evolutionary algorithms, ECGA and BOA, applied to constructing stock market trading expertise, which is built on the basis of a set of specific trading rules analysing financial time series of recent price quotations. A few modifications of ECGA are proposed in order to reduce the computing time and make the algorithm applicable for real-time trading. In experiments carried out on real data from the Paris Stock Exchange, the algorithms were compared in terms of their efficiency in solving the optimization problem, the financial relevance of the discovered investment strategies, and the computing time.
Transfer learning, soft distance-based bias, and the hierarchical BOA
, 2012
Learn from the Past: Improving Model-Directed . . . Distance-based Bias
, 2012
Abstract

Cited by 1 (0 self)
For many optimization problems it is possible to define a problem-specific distance metric over decision variables that correlates with the strength of interactions between the variables. Examples of such problems include additively decomposable functions, facility location problems, and atomic cluster optimization. However, the use of such a metric for enhancing the efficiency of optimization techniques is often not straightforward. This paper describes a framework that allows optimization practitioners to improve the efficiency of model-directed optimization techniques by combining such a distance metric with information mined from previous optimization runs on similar problems. The framework is demonstrated and empirically evaluated in the context of the hierarchical Bayesian optimization algorithm (hBOA). Experimental results provide strong empirical evidence that the proposed approach provides significant speedups and that it can be effectively combined with other efficiency enhancements. The paper demonstrates how straightforward it is to adapt the proposed framework to other model-directed optimization techniques by presenting several examples.
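One common way to turn such a distance metric into a model-building bias is to penalize long-range dependencies through a prior. The sketch below is a hypothetical illustration of that idea; the names, the exponential form, and the kappa parameter are our assumptions, not the paper's method:

```python
import math

def distance_bias(d, kappa=0.8):
    """Hypothetical soft distance-based prior: the prior probability of a
    dependency between two variables decays exponentially with their
    distance d. kappa in (0, 1) controls how strongly long-range
    dependencies are penalized."""
    return kappa ** d

def biased_score(base_score, d, kappa=0.8):
    """Add the log-prior to a model-building score (e.g. a BIC or
    Bayesian-Dirichlet score), so that edges between distant variables
    need stronger support from the data to be added to the model."""
    return base_score + math.log(distance_bias(d, kappa))

# A nearby pair (d=1) is penalized less than a distant pair (d=10),
# so it needs less evidence to enter the learned network.
assert biased_score(0.0, 1) > biased_score(0.0, 10)
```

Statistics mined from previous runs can then replace the fixed decay with empirically observed dependency frequencies at each distance, which is the direction the framework in this entry takes.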
Transforming Evolutionary Search Into Higher-Level Evolutionary Search by Capturing Problem Structure
 IEEE Transactions on Evolutionary Computation
Abstract
The intuitive idea that good solutions to small problems can be reassembled into good solutions to larger problems is widely familiar in many fields, including evolutionary computation. This idea has motivated the building-block hypothesis and model-building optimisation methods that aim to identify and exploit problem structure automatically. Recently, a small number of works make use of such ideas by learning problem structure and using this information in a particular manner: these works use the results of a simple search process in primitive units to identify structural correlations (such as modularity) in the problem that are then used to redefine the variational operators of the search process. This process is applied recursively such that search operates at successively higher scales of organisation, hence multiscale search. Here we show for the first time that there is a simple class of (modular) problems that a multiscale search algorithm can solve in polynomial time that requires super-polynomial time for other methods. We discuss strengths and limitations of the multiscale search approach and point out how it can be developed further. Index Terms: Evolutionary computation, automatic problem decomposition, linkage learning, modularity, scalability.