Results 1–10 of 31
Combining convergence and diversity in evolutionary multiobjective optimization
 Evolutionary Computation
, 2002
Cited by 113 (11 self)
Over the past few years, the research on evolutionary algorithms has demonstrated their niche in solving multiobjective optimization problems, where the goal is to find a number of Pareto-optimal solutions in a single simulation run. Many studies have depicted different ways evolutionary algorithms can progress towards the Pareto-optimal set with a widely spread distribution of solutions. However, none of the multiobjective evolutionary algorithms (MOEAs) has a proof of convergence to the true Pareto-optimal solutions with a wide diversity among the solutions. In this paper, we discuss why a number of earlier MOEAs do not have such properties. Based on the concept of ε-dominance, new archiving strategies are proposed that overcome this fundamental problem and provably lead to MOEAs that have both the desired convergence and distribution properties. A number of modifications to the baseline algorithm are also suggested. The concept of ε-dominance introduced in this paper is practical and should make the proposed algorithms useful to researchers and practitioners alike.
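The additive ε-dominance relation at the heart of these archiving strategies can be sketched in a few lines. This is an illustrative reimplementation for a minimization problem, not the authors' code; the function names, the tuple encoding of objective vectors, and the single scalar `eps` are assumptions.

```python
def eps_dominates(a, b, eps):
    """True if a additively eps-dominates b (minimization):
    shifting every objective of a by eps makes it weakly dominate b."""
    return all(ai - eps <= bi for ai, bi in zip(a, b))

def try_insert(archive, p, eps):
    """Accept p only if no archived point eps-dominates it, then drop
    archived points that p eps-dominates. The eps-spacing this enforces
    is what bounds the archive size in the convergence/diversity proofs."""
    if any(eps_dominates(q, p, eps) for q in archive):
        return archive
    return [q for q in archive if not eps_dominates(p, q, eps)] + [p]
```

A point very close to an archived one is rejected, which keeps the archive both finite and well spread.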
Indicator-based selection in multiobjective search
 In Proc. 8th International Conference on Parallel Problem Solving from Nature (PPSN VIII)
, 2004
Cited by 88 (6 self)
This paper discusses how preference information of the decision maker can in general be integrated into multiobjective search. The main idea is to first define the optimization goal in terms of a binary performance measure (indicator) and then to directly use this measure in the selection process. To this end, we propose a general indicator-based evolutionary algorithm (IBEA) that can be combined with arbitrary indicators. In contrast to existing algorithms, IBEA can be adapted to the preferences of the user and moreover does not require any additional diversity preservation mechanism such as fitness sharing to be used. It is shown on several continuous and discrete benchmark problems that IBEA can substantially improve on the results generated by two popular algorithms, namely NSGA-II and SPEA2, with respect to different performance measures.
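A minimal sketch of indicator-based fitness assignment with the additive ε-indicator, one of the indicators commonly paired with IBEA. The scaling constant `kappa`, the function names, and the tuple encoding are illustrative assumptions, not the paper's exact implementation.

```python
import math

def eps_indicator(a, b):
    """Additive epsilon-indicator I(a, b): the smallest shift of a
    that makes it weakly dominate b (minimization)."""
    return max(ai - bi for ai, bi in zip(a, b))

def ibea_fitness(pop, kappa=0.05):
    """Each individual accumulates exponentially scaled penalties from
    all others; the individual with the smallest (most negative)
    fitness is the first removed in environmental selection."""
    return [sum(-math.exp(-eps_indicator(y, x) / kappa)
                for y in pop if y is not x)
            for x in pop]
```

Because the indicator itself drives selection, no separate diversity-preservation mechanism is needed, which is the point the abstract makes.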
Approximating the volume of unions and intersections of high-dimensional geometric objects
, 2008
Faster S-Metric Calculation by Considering Dominated Hypervolume as Klee’s Measure Problem
, 2006
Cited by 19 (2 self)
The dominated hypervolume (or S-metric) is a commonly accepted quality measure for comparing approximations of Pareto fronts generated by multiobjective optimizers. Since optimizers exist, namely evolutionary algorithms, that use the S-metric internally several times per iteration, a faster determination of the S-metric value is of essential importance. This paper describes how to consider the S-metric as a special case of a more general geometrical problem called Klee’s measure problem (KMP). For the KMP, an algorithm exists with run time O(n log n + n^(d/2) log n) for n points of d ≥ 3 dimensions. This complex algorithm is adapted to the special case of calculating the S-metric. Conceptual simplifications of the implementation save a factor of O(log n) and establish an upper bound of O(n log n + n^(d/2)) for the S-metric calculation, improving the previously known bound of O(n^(d−1)).
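The general KMP-based algorithm is involved, but the d = 2 special case of the S-metric reduces to a simple O(n log n) sweep, sketched here for minimization with a reference point worse than all points in both objectives (the function name and point encoding are illustrative):

```python
def hypervolume_2d(points, ref):
    """S-metric of a mutually nondominated 2-D point set (minimization),
    measured against reference point `ref`."""
    pts = sorted(points)        # ascending in f1, hence descending in f2
    hv, prev_x = 0.0, ref[0]
    for x, y in reversed(pts):  # sweep right to left, one f1-strip per point
        hv += (prev_x - x) * (ref[1] - y)
        prev_x = x
    return hv
```

Each point contributes one rectangle whose width is the gap to the next-larger f1 value, which is why sorting dominates the running time in two dimensions.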
Bounded Archiving using the Lebesgue Measure
, 2003
Cited by 17 (1 self)
Many modern multiobjective evolutionary algorithms (MOEAs) store the points discovered during optimization in an external archive, separate from the main population, as a source of innovation and/or for presentation at the end of a run. Maintaining a bound on the size of the archive may be desirable or necessary for several reasons, but choosing which points to discard and which to keep in the archive, as they are discovered, is not trivial. In this paper we briefly review the state-of-the-art in bounded archiving, and present a new method based on locally maximizing the hypervolume dominated by the archive. The new archiver is shown to outperform existing methods, on several problem instances, with respect to the quality of the archive obtained when judged using three distinct quality measures.
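The core idea, discarding the point that contributes the least dominated hypervolume once the size bound is exceeded, can be sketched for two objectives as below. The greedy one-at-a-time removal and all names here are illustrative simplifications under stated assumptions (minimization, mutually nondominated points), not the paper's exact archiver.

```python
def hv2d(points, ref):
    """2-D hypervolume of a mutually nondominated set (minimization)."""
    pts, hv, prev_x = sorted(points), 0.0, ref[0]
    for x, y in reversed(pts):
        hv += (prev_x - x) * (ref[1] - y)
        prev_x = x
    return hv

def bounded_insert(archive, p, max_size, ref):
    """Add p; if the bound is exceeded, discard the point whose
    removal costs the least dominated hypervolume."""
    archive = archive + [p]
    if len(archive) <= max_size:
        return archive
    total = hv2d(archive, ref)
    # contribution of point i = hv(archive) - hv(archive without i)
    worst = min(range(len(archive)),
                key=lambda i: total - hv2d(archive[:i] + archive[i + 1:], ref))
    return archive[:worst] + archive[worst + 1:]
```

A point deep in a sparsely covered region has a large contribution and so survives, which is how hypervolume-based archiving promotes both quality and spread.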
Archiving with Guaranteed Convergence and Diversity in Multi-Objective Optimization
 In Proceedings of the Genetic and Evolutionary Computation Conference
, 2002
Cited by 16 (4 self)
Over the past few years, the research on evolutionary algorithms has demonstrated their niche in solving multiobjective optimization problems, where the goal is to find a number of Pareto-optimal solutions in a single simulation run. However, none of the multiobjective evolutionary algorithms (MOEAs) has a proof of convergence to the true Pareto-optimal solutions with a wide diversity among the solutions. In this paper we discuss why a number of earlier MOEAs do not have such properties. A new archiving strategy is proposed that maintains a subset of the generated solutions. It guarantees convergence and diversity according to well-defined criteria, i.e. ε-dominance and ε-Pareto optimality.
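The ε-Pareto archiving rule is built on a box (grid) discretization of objective space, which can be sketched as follows. This is a simplified illustration for minimization with objective values ≥ 1 on a multiplicative grid; the helper names and the exact update rule shown are assumptions, not the paper's verbatim algorithm.

```python
import math

def box(p, eps):
    """Box index of p on a multiplicative (1 + eps) grid; the archive
    keeps at most one point per nondominated box."""
    return tuple(math.floor(math.log(f) / math.log(1 + eps)) for f in p)

def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and a != b

def update(archive, p, eps):
    """Simplified eps-Pareto update: reject p if its box is dominated,
    or occupied by a point it does not dominate; otherwise insert p
    and purge points in boxes that p's box dominates."""
    bp = box(p, eps)
    boxes = {box(q, eps): q for q in archive}
    if any(dominates(b, bp) for b in boxes):
        return archive
    if bp in boxes and not dominates(p, boxes[bp]):
        return archive
    return [q for q in archive
            if not dominates(bp, box(q, eps)) and box(q, eps) != bp] + [p]
```

Because only finitely many boxes can be mutually nondominated in a bounded region, the archive size is bounded, which is what makes the convergence and diversity guarantees provable.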
Approximating the least hypervolume contributor: NP-hard in general, but fast in practice
, 2008
Multiplicative Approximations and the Hypervolume Indicator
Cited by 11 (6 self)
Indicator-based algorithms have become a very popular approach to solve multiobjective optimization problems. In this paper, we contribute to the theoretical understanding of algorithms maximizing the hypervolume for a given problem by distributing µ points on the Pareto front. We examine this common approach with respect to the achieved multiplicative approximation ratio for a given multiobjective problem and relate it to a set of µ points on the Pareto front that achieves the best possible approximation ratio. For the class of linear fronts and a class of concave fronts, we prove that the hypervolume gives the best possible approximation ratio. In addition, we examine Pareto fronts of different shapes by numerical calculations and show that the approximation computed by the hypervolume may differ from the optimal approximation ratio.
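For finite point sets, the multiplicative approximation ratio studied here can be computed directly. A minimal sketch, assuming minimization with strictly positive objective values (the function name is illustrative):

```python
def approx_ratio(approx, front):
    """Smallest alpha >= 1 such that every front point p has some
    a in approx with a_i <= alpha * p_i in every objective
    (minimization, positive objectives)."""
    return max(min(max(ai / pi for ai, pi in zip(a, p)) for a in approx)
               for p in front)
```

Comparing the ratio of a hypervolume-optimal µ-point subset against the best achievable ratio for the same µ is exactly the question the paper formalizes.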
No Free Lunch and Free Leftovers Theorems for Multiobjective Optimisation Problems
 Evolutionary Multi-Criterion Optimization (EMO 2003), Second International Conference
, 2003
Cited by 9 (2 self)
The classic NFL theorems are invariably cast in terms of single objective optimization problems. We confirm that the classic NFL theorem holds for general multiobjective fitness spaces, and show how this follows from a 'single-objective' NFL theorem. We also show that, given any particular Pareto Front, an NFL theorem holds for the set of all multiobjective problems which have that Pareto Front. It follows that, given any 'shape' or class of Pareto fronts, an NFL theorem holds for the set of all multiobjective problems in that class. These findings have salience in test function design. Such NFL results are cast in the typical context of absolute performance, assuming a performance metric which returns a value based on the result produced by a single algorithm. But, in multiobjective search...
On the complexity of computing the hypervolume indicator
 IEEE Trans. Evolutionary Computation
Cited by 8 (1 self)
The goal of multiobjective optimization is to find a set of best compromise solutions for typically conflicting objectives. Due to the complex nature of most real-life problems, only an approximation to such an optimal set can be obtained within reasonable (computing) time. To compare such approximations, and thereby the performance of multiobjective optimizers providing them, unary quality measures are usually applied. Among these, the hypervolume indicator (or S-metric) is of particular relevance due to its good properties. Moreover, this indicator has been successfully integrated into stochastic optimizers, such as evolutionary algorithms, where it serves as a guidance criterion for searching the parameter space. Recent results show that computing the hypervolume indicator can be seen as solving a specialized version...