## Multi-Objective Mixture-based Iterated Density Estimation Evolutionary Algorithms (2001)

Venue: Proceedings of the Genetic and Evolutionary Computation Conference (GECCO-2001), San Francisco, California

Citations: 11 (0 self)

### BibTeX

@INPROCEEDINGS{Thierens01multi-objectivemixture-based,
  author    = {Dirk Thierens and Peter A.N. Bosman},
  title     = {Multi-Objective Mixture-based Iterated Density Estimation Evolutionary Algorithms},
  booktitle = {Proceedings of the Genetic and Evolutionary Computation Conference (GECCO-2001)},
  address   = {San Francisco, California},
  year      = {2001},
  pages     = {663--670},
  publisher = {Morgan Kaufmann}
}

### Abstract

We propose an algorithm for multi-objective optimization using a mixture-based iterated density estimation evolutionary algorithm (MIDEA). The MIDEA algorithm is a probabilistic model-building evolutionary algorithm that constructs at each generation a mixture of factorized probability distributions.
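The model-building loop the abstract describes can be sketched as follows. This is a hedged, minimal illustration, not the authors' algorithm: the function name `midea_sketch`, the naive equal-split clustering, and the per-cluster univariate bit-frequency model are all simplifying assumptions, whereas MIDEA learns richer factorizations per cluster and uses Pareto-based selection for multiple objectives.

```python
import random

random.seed(0)  # deterministic for the illustration

def midea_sketch(fitness, n_bits=20, pop_size=100, tau=0.3, k=2, gens=50):
    """Iterated density estimation loop: each generation, select the
    best fraction tau, split the selection into k clusters, fit one
    factorized model per cluster (here a simple univariate
    bit-frequency model -- an assumption for brevity), and resample
    the whole population from the resulting mixture."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        selected = pop[: int(tau * pop_size)]
        # naive clustering: split the selected set into k equal chunks
        chunk = max(1, len(selected) // k)
        clusters = [selected[i:i + chunk] for i in range(0, len(selected), chunk)][:k]
        # resample: pick a cluster, then sample each bit from its marginal
        new_pop = []
        while len(new_pop) < pop_size:
            c = random.choice(clusters)
            probs = [sum(ind[j] for ind in c) / len(c) for j in range(n_bits)]
            new_pop.append([1 if random.random() < p else 0 for p in probs])
        pop = new_pop
    return max(pop, key=fitness)

best = midea_sketch(sum)  # single-objective OneMax stand-in
```

On OneMax this univariate sketch converges quickly; the point of the mixture in MIDEA is that each cluster can track a different region of the Pareto front, which a single factorized model cannot.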

### Citations

501 | Multiobjective evolutionary algorithms: a comparative case study and the strength Pareto approach
- Zitzler, Thiele
- 1999
Citation Context: ...ously studied in the literature. 4 Experimental results 4.1 Multi-objective 0/1 knapsack problem Our first test function is a discrete multi-objective 0/1 knapsack problem taken from Zitzler and Thiele [18], who introduced it to compare a number of different multi-objective evolutionary algorithms. The problem is additionally interesting because of its real-life practicality and the large string lengths i...

403 | A fast elitist non-dominated sorting genetic algorithm for multi-objective optimization
- Deb, Agrawal, et al.
- 2000
Citation Context: ...It should also be noted that the Pareto front found in figure 9 seems to coincide with the optimal Pareto front, which is not trivial to achieve since the fast elitist non-dominated sorting GA (NSGA-II [6]), the strength Pareto Evolutionary Algorithm (SPEA [18]), and the Pareto-archived evolution strategy (PAES [10]) are all reported to converge to a sub-optimal front ([6]). In the experiments so far, ...

278 | A Survey of Optimization by Building and Using Probabilistic Models
- Pelikan, Goldberg, et al.
- 2002
Citation Context: ...the dependence structure of the problem variables. The exploration operators mutation and crossover are now replaced by generating new samples according to this probabilistic model (for a survey see [16]). In [2] we have given a general algorithmic framework for this paradigm called iterated density estimation evolutionary algorithm (IDEA). In this paper we will propose an algorithm for multi-objecti...

255 | Hierarchical Bayesian Optimization Algorithm. Toward a New Generation of Evolutionary Algorithms
- Pelikan
- 2005
Citation Context: ...arning (see for instance [8]). In a similar effort to learn the structure of the problem representation a number of researchers have taken a more probabilistic view of the evolutionary search process ([1, 2, 7, 9, 11, 13, 15, 14, 17]). The general idea here is to build a probabilistic model of the current parent population and learn the structure of the problem representation by inducing the dependence structure of the problem va...

231 | The compact genetic algorithm
- Harik, Lobo, et al.
- 1999
Citation Context: ...arning (see for instance [8]). In a similar effort to learn the structure of the problem representation a number of researchers have taken a more probabilistic view of the evolutionary search process ([1, 2, 7, 9, 11, 13, 15, 14, 17]). The general idea here is to build a probabilistic model of the current parent population and learn the structure of the problem representation by inducing the dependence structure of the problem va...

215 | A comprehensive survey of evolutionary-based multiobjective optimization techniques
- Coello
- 1999
Citation Context: ...for multi-objective optimization algorithms following the covering strategy. The field of evolutionary multiobjective optimization has indeed seen an explosive growth in recent years (for a survey see [4]). 3 Multi-objective mixture-based IDEA The IDEA is a framework for Iterated Density Estimation Evolutionary Algorithms that uses probabilistic models to guide the evolutionary search [2]. A key chara...

151 | Multi-objective genetic algorithms: Problem difficulties and construction of test problems, Evolutionary Computation
- Deb
- 1999
Citation Context: ...uous function optimization Next to the multi-objective 0/1 knapsack problem we also tested the MIDEA algorithm on multi-objective continuous function optimization problems taken from the literature [5]. First we will look at the mixture of Gaussian pdfs using learned conditional factorizations. We fixed the selection size to $\lfloor \tau n \rfloor = 250$ ($\tau = 0.3$, population size $n = 834$). Clustering is done using the...

120 | The Pareto archived evolution strategy: a new baseline algorithm for Pareto multiobjective optimization
- Knowles, Corne
- 1999
Citation Context: ...$q_j = \max_{i \in \{0,1,\ldots,n_K-1\}} p_{i,j}/w_{i,j}$ beforehand and sorting the $q_j$. The profits, weights and knapsack capacities are chosen as follows: $p_{i,j}$ and $w_{i,j}$ are random integers chosen from the interval [10, 100], while the capacities $c_i$ are set to half the items' total weight in the corresponding knapsack: $c_i = 0.5 \sum_{j=0}^{N_I-1} w_{i,j}$. This results in half of the items being expected in the optimal solutions. We ...
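The instance construction quoted above is concrete enough to sketch in code. The function name `make_knapsack_instance` is hypothetical; the bounds and the capacity rule follow the quoted context.

```python
import random

def make_knapsack_instance(n_knapsacks, n_items, seed=0):
    """Generate a multi-objective 0/1 knapsack instance following the
    quoted construction: profits p[i][j] and weights w[i][j] are random
    integers in [10, 100], and each capacity c[i] is half the total
    item weight of knapsack i, so roughly half the items are expected
    in an optimal solution."""
    rng = random.Random(seed)
    p = [[rng.randint(10, 100) for _ in range(n_items)] for _ in range(n_knapsacks)]
    w = [[rng.randint(10, 100) for _ in range(n_items)] for _ in range(n_knapsacks)]
    c = [0.5 * sum(w[i]) for i in range(n_knapsacks)]
    return p, w, c

p, w, c = make_knapsack_instance(n_knapsacks=2, n_items=100)
```

With two knapsacks each item has two profit values, which is what makes the problem multi-objective: solutions trade off total profit in knapsack 0 against total profit in knapsack 1 under both capacity constraints.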

62 | FDA -- A scalable evolutionary algorithm for the optimization of additively decomposed functions. Evolutionary Computation 7(4)
- Mühlenbein, Mahnig
- 1999
Citation Context: ...arning (see for instance [8]). In a similar effort to learn the structure of the problem representation a number of researchers have taken a more probabilistic view of the evolutionary search process ([1, 2, 7, 9, 11, 13, 15, 14, 17]). The general idea here is to build a probabilistic model of the current parent population and learn the structure of the problem representation by inducing the dependence structure of the problem va...

43 | Optimization by learning and simulation of Bayesian and Gaussian networks
- Larrañaga, Etxeberria, et al.
- 1999

30 | Expanding from discrete to continuous estimation of distribution algorithms: the IDEA
- Bosman, Thierens
- 2000

26 | Estimating dependency structure as a hidden variable
- Meila, Jordan, et al.
- 1997
Citation Context: ...simple enough to be computationally efficient is to model the domain variable interactions with a tree factorization. The model thus becomes a mixture of trees, a probability model recently proposed in [12]. The MIDEA tree can be viewed as a generalization of the optimal dependency tree algorithm [1] towards a mixture model and adapted for multi-objective problems. Interestingly, the use of a mixture ...

23 | Rapid, accurate optimization of difficult problems using fast messy genetic algorithms
- Goldberg, Deb, et al.
- 1993
Citation Context: ...of experimental runs. An alternative to this labour-intensive task is to try to learn the structure of the search landscape automatically, an approach often called linkage learning (see for instance [8]). In a similar effort to learn the structure of the problem representation a number of researchers have taken a more probabilistic view of the evolutionary search process ([1, 2, 7, 9, 11, 13, 15, 14, ...

21 | Genetic algorithms, clustering, and the breaking of symmetry
- Pelikan, Goldberg
- 2000

17 | Mixed IDEAs
- Bosman, Thierens
- 2000
Citation Context: ...creases some metric the most. If no addition of any arc further increases the metric, the final factorization graph has been found. The metric used is the Bayesian Information Criterion (for details see [3]). Without detailed knowledge about the functions it is not possible to tell which structure is optimal. To illustrate the potential of each model we ran a number of experiments on problems previously...

12 | Using Optimal Dependency Trees for Combinatorial Optimization: Learning the Structure of Search Space
- Baluja, Davies
- 1997

5 | Real-valued evolutionary optimization using a probability density estimator
- Gallagher, Fream, et al.
- 1999

5 | The mixture of trees factorized distribution algorithm
- Santana, Ochoa, et al.
- 2001