Results 1 - 5 of 5
Approximating the nondominated front using the Pareto Archived Evolution Strategy
 EVOLUTIONARY COMPUTATION
, 2000
Abstract

Cited by 254 (19 self)
We introduce a simple evolution scheme for multiobjective optimization problems, called the Pareto Archived Evolution Strategy (PAES). We argue that PAES may represent the simplest possible nontrivial algorithm capable of generating diverse solutions in the Pareto optimal set. The algorithm, in its simplest form, is a (1 + 1) evolution strategy employing local search but using a reference archive of previously found solutions in order to identify the approximate dominance ranking of the current and candidate solution vectors. (1 + 1)-PAES is intended to be a baseline approach against which more involved methods may be compared. It may also serve well in some real-world applications when local search seems superior to or competitive with population-based methods. We introduce (1 + λ) and (μ + λ) variants of PAES as extensions to the basic algorithm. Six variants of PAES are compared to variants of the Niched Pareto Genetic Algorithm and the Nondominated Sorting Genetic Algorithm over a diverse suite of six test functions. Results are analyzed and presented using techniques that reduce the attainment surfaces generated from several optimization runs into a set of univariate distributions. This allows standard statistical analysis to be carried out for comparative purposes. Our results provide strong evidence that PAES performs consistently well on a range of multiobjective optimization tasks.
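The archive-based acceptance rule described in this abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the function names and the simple size-capped archive are assumptions, and the adaptive-grid crowding procedure that full PAES uses to break ties and manage archive replacement is omitted.

```python
import random

def dominates(a, b):
    """True iff objective vector a Pareto-dominates b (minimization assumed)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_archive(archive, v, max_size=100):
    """Try to insert v: reject it if any archived vector dominates it,
    otherwise drop newly dominated members and append v (size permitting)."""
    if any(dominates(m, v) for m in archive):
        return False
    archive[:] = [m for m in archive if not dominates(v, m)]
    if len(archive) < max_size:
        archive.append(v)
    return True

def paes_1plus1(evaluate, mutate, x0, iterations=1000):
    """(1 + 1)-PAES skeleton: a single-parent local search whose accept/reject
    decision is arbitrated by a reference archive of nondominated solutions."""
    current, f_current = x0, evaluate(x0)
    archive = [f_current]
    for _ in range(iterations):
        candidate = mutate(current)
        f_cand = evaluate(candidate)
        if dominates(f_current, f_cand):
            continue  # candidate strictly worse: discard
        if update_archive(archive, f_cand):
            current, f_current = candidate, f_cand  # move to the candidate
    return archive
```

On a toy biobjective problem such as minimizing (x², (x − 2)²) over real x, the returned archive accumulates mutually nondominated points approximating the trade-off between x = 0 and x = 2.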
Reducing local optima in single-objective problems by multiobjectivization, in
 Proc. First International Conference on Evolutionary Multicriterion Optimization, EMO’01
, 2001
Abstract

Cited by 39 (4 self)
Abstract. One common characterization of how simple hillclimbing optimization methods can fail is that they become trapped in local optima: a state where no small modification of the current best solution will produce a solution that is better. This measure of 'better' depends on the performance of the solution with respect to the single objective being optimized. In contrast, multiobjective optimization (MOO) involves the simultaneous optimization of a number of objectives. Accordingly, the multiobjective notion of 'better' permits consideration of solutions that may be superior in one objective but not in another. Intuitively, we may say that this gives a hillclimber in multiobjective space more freedom to explore and less likelihood of becoming trapped. In this paper, we investigate this intuition by comparing the performance of simple hillclimber-style algorithms on single-objective problems and multiobjective versions of those same problems. Using an abstract building-block problem we illustrate how 'multiobjectivizing' a single-objective optimization (SOO) problem can remove local optima. Then we investigate small instances of the travelling salesman problem where additional objectives are defined using arbitrary subtours. Results indicate that multiobjectivization can reduce local optima and facilitate improved optimization in some cases. These results enlighten our intuitions about the nature of search in multiobjective optimization and sources of difficulty in single-objective optimization.
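As a minimal sketch of the multiobjectivization idea (assuming an additive cost split into two sub-objectives whose sum equals the original objective; the function names are hypothetical, and the paper's actual constructions use a building-block problem and TSP subtours), a hillclimber on the decomposed problem only rejects moves that are strictly worse in the Pareto sense:

```python
import random

def dominates(a, b):
    """True iff a Pareto-dominates b (minimization assumed)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def mo_hillclimb(f_pair, neighbour, x0, steps=1000):
    """Hillclimber on a multiobjectivized problem: f_pair returns (f1, f2)
    with f1 + f2 equal to the original single objective, so the global
    optimum is preserved, while Pareto acceptance leaves the search free to
    move sideways along trade-offs instead of stalling at a local optimum."""
    x, fx = x0, f_pair(x0)
    for _ in range(steps):
        y = neighbour(x)
        fy = f_pair(y)
        if not dominates(fx, fy):  # accept unless dominated by the incumbent
            x, fx = y, fy
    return x, fx
```

For instance, a bitstring cost can be split into the cost of the first half and the cost of the second half; the single-objective hillclimber and the multiobjective one then share the same optimum but have different acceptance behaviour.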
A Comparison of Diverse Approaches to Memetic Multiobjective Combinatorial Optimization
 Proceedings of the Workshop on Memetic Algorithms WOMA at the Genetic and Evolutionary Computation Conference  GECCO–2000
, 2000
Abstract

Cited by 11 (2 self)
Memetic algorithms (MAs) are, at present, amongst the most successful approximate methods for combinatorial optimization. Recently, their range of application in this domain has been extended, with the introduction of several MAs for problems possessing multiple objectives. In this paper, we consider two of the newest of these MAs, the random directions multiple objective genetic local searcher (RD-MOGLS) of Jaszkiewicz, and the memetic Pareto archived evolution strategy (M-PAES), recently introduced by us. The two algorithms work in different ways: M-PAES employs a form of Pareto ranking in its selection mechanism, as used in several multiobjective evolutionary algorithms (MOEAs); whereas RD-MOGLS uses randomly weighted utility functions to judge solution quality, drawing from multiobjective tabu search and simulated annealing approaches. These two different approaches to memetic multiobjective optimization are briefly described, and their possible strengths and...
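The scalarizing side of this comparison can be illustrated with a short sketch (hypothetical helper names; Jaszkiewicz's actual algorithm differs in detail, e.g. in the exact utility family and how local search is embedded): each local-search phase draws a fresh random weight vector and optimizes the resulting single-valued utility.

```python
import random

def random_weights(k):
    """Draw a weight vector uniformly from the unit simplex:
    sort k - 1 uniform cuts of [0, 1] and take the successive gaps."""
    cuts = sorted(random.random() for _ in range(k - 1))
    return [b - a for a, b in zip([0.0] + cuts, cuts + [1.0])]

def weighted_utility(weights, objectives):
    """Linear scalarization of an objective vector (lower is better
    when all objectives are minimized)."""
    return sum(w * f for w, f in zip(weights, objectives))
```

Because each phase sees a different utility function, successive local searches are pulled toward different regions of the trade-off surface, which is how such methods maintain diversity without explicit Pareto ranking.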
On the Assessment of Multiobjective Approaches to the Adaptive Distributed Database Management Problem
 In M. Schoenauer et al. (Eds.), Parallel Problem Solving from Nature
, 2000
Abstract

Cited by 4 (1 self)
In this paper we assess the performance of three modern multiobjective evolutionary algorithms on a real-world optimization problem related to the management of distributed databases. The algorithms assessed are the Strength Pareto Evolutionary Algorithm (SPEA), the Pareto Archived Evolution Strategy (PAES), and M-PAES, which is a memetic-algorithm-based variant of PAES. The performance of these algorithms is compared using two distinct and sophisticated multiobjective performance comparison techniques, and extensions to these comparison techniques are proposed. The information provided by the different performance assessment techniques is compared, and we find that, to some extent, the ranking of algorithm performance alters according to the comparison metric; however, it is possible to understand these differences in terms of the complex nature of multiobjective comparisons. 1 Introduction In real-world applications, obtaining the complete set of Pareto optimal solutio...
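One simple comparison technique of the kind discussed here is the coverage (C) metric of Zitzler and Thiele, which measures the fraction of one approximation set that is weakly dominated by another; the excerpt does not say which two techniques this paper actually uses, so the sketch below is illustrative only.

```python
def weakly_dominates(a, b):
    """True iff a is at least as good as b in every objective (minimization)."""
    return all(x <= y for x, y in zip(a, b))

def coverage(A, B):
    """C(A, B): fraction of points in B weakly dominated by some point in A.
    C(A, B) = 1 means A covers B entirely; note C(A, B) + C(B, A) need not
    sum to 1, so both directions must be reported."""
    if not B:
        return 1.0
    return sum(any(weakly_dominates(a, b) for a in A) for b in B) / len(B)
```

Reporting both C(A, B) and C(B, A) gives a pairwise, scale-independent comparison of two nondominated fronts, though, as the abstract notes, different metrics can rank the same algorithms differently.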