Results 1–10 of 239
Covariance Matrix Adaptation for Multiobjective Optimization
 Evolutionary Computation
Abstract

Cited by 113 (13 self)
The covariance matrix adaptation evolution strategy (CMA-ES) is one of the most powerful evolutionary algorithms for real-valued single-objective optimization. In this paper, we develop a variant of the CMA-ES for multi-objective optimization (MOO). We first introduce a single-objective, elitist CMA-ES using plus-selection and step size control based on a success rule. This algorithm is compared to the standard CMA-ES. The elitist CMA-ES turns out to be slightly faster on unimodal functions, but is more prone to getting stuck in suboptimal local minima. In the new multi-objective CMA-ES (MO-CMA-ES), a population of individuals that adapt their search strategy as in the elitist CMA-ES is maintained. These are subject to multi-objective selection. The selection is based on non-dominated sorting using either the crowding distance or the contributing hypervolume as the second sorting criterion. Both the elitist single-objective CMA-ES and the MO-CMA-ES inherit important invariance properties, in particular invariance against rotation of the search space, from the original CMA-ES. The benefits of the new MO-CMA-ES in comparison to the well-known NSGA-II and to NSDE, a multi-objective differential evolution algorithm, are shown experimentally.
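The "step size control based on a success rule" mentioned in the abstract can be illustrated with a minimal elitist (1+1)-ES sketch. The multiplicative constants below are illustrative choices, not the paper's exact update:

```python
import random

def sphere(x):
    # Toy unimodal objective: f(x) = sum of squares.
    return sum(v * v for v in x)

def one_plus_one_es(f, x0, sigma=1.0, iters=500, seed=0):
    # Elitist (1+1)-ES with success-rule step size adaptation:
    # enlarge sigma after a successful mutation, shrink it after a
    # failure (the factors 1.5 and 0.85 are hypothetical settings).
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(iters):
        y = [xi + sigma * rng.gauss(0.0, 1.0) for xi in x]
        fy = f(y)
        if fy <= fx:          # success: offspring replaces parent
            x, fx = y, fy
            sigma *= 1.5
        else:                 # failure: parent survives, step shrinks
            sigma *= 0.85
    return x, fx
```

The two factors fix the success rate at which the step size is stationary, so sigma roughly tracks the distance to the optimum on well-scaled unimodal functions.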
The CMA Evolution Strategy: A Comparing Review
 STUDFUZZ
, 2006
Abstract

Cited by 101 (29 self)
Derived from the concept of self-adaptation in evolution strategies, the CMA (Covariance Matrix Adaptation) adapts the covariance matrix of a multivariate normal search distribution. The CMA was originally designed to perform well with small populations. In this review, the argument starts out with large population sizes, reflecting recent extensions of the CMA algorithm. Commonalities with and differences from continuous Estimation of Distribution Algorithms are analyzed. The aspects of reliability of the estimation, overall step size control, and independence from the coordinate system (invariance) become particularly important for small population sizes. Consequently, performing the adaptation task with small populations is more intricate.
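The core loop the review analyzes, sampling from a multivariate normal and re-estimating its covariance from selected steps, can be sketched in two dimensions. This is a simplified rank-mu-style update only; real CMA-ES adds recombination weights, evolution paths, and step size control, all omitted here, and the learning rate is a hypothetical choice:

```python
import math
import random

def sample(rng, mean, C, sigma):
    # One draw from N(mean, sigma^2 * C) via a 2x2 Cholesky factor.
    a, b, c = C[0][0], C[0][1], C[1][1]
    l11 = math.sqrt(a)
    l21 = b / l11
    l22 = math.sqrt(max(c - l21 * l21, 1e-12))
    z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
    return [mean[0] + sigma * l11 * z1,
            mean[1] + sigma * (l21 * z1 + l22 * z2)]

def cma_like_step(rng, f, mean, C, sigma, lam=12, mu=6, lr=0.2):
    # One generation: sample lam offspring, keep the mu best, move the
    # mean to their average, and blend the covariance towards the
    # empirical covariance of the selected (normalized) steps.
    pop = sorted((sample(rng, mean, C, sigma) for _ in range(lam)), key=f)
    sel = pop[:mu]
    new_mean = [sum(x[i] for x in sel) / mu for i in (0, 1)]
    steps = [[(x[i] - mean[i]) / sigma for i in (0, 1)] for x in sel]
    emp = [[sum(s[i] * s[j] for s in steps) / mu for j in (0, 1)]
           for i in (0, 1)]
    new_C = [[(1 - lr) * C[i][j] + lr * emp[i][j] for j in (0, 1)]
             for i in (0, 1)]
    return new_mean, new_C
```

Iterating this step on a quadratic objective moves the mean towards the optimum while the covariance adapts to the selected steps; the review's point is precisely that making this estimation reliable with small lam and mu is the hard part.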
Autonomous self-assembly in swarm-bots
 IEEE Trans. Robot
, 2006
Abstract

Cited by 95 (40 self)
Summary. Multi-robot systems have been studied in tasks that require the robots to be physically linked. In such a configuration, a group of robots may navigate a terrain that proves too difficult for a single robot. Conversely, many collective tasks can be accomplished more efficiently by a group of independent robots. This paper is about the swarm-bot, a robotic system that can operate in both configurations and autonomously switch from one to the other. We examine the performance of a single robot and of groups of robots self-assembling with an object or another robot. We assess the robustness of the system with respect to different types of rough terrain. Finally, we evaluate the performance of swarms of 16 physical robots. At present, for self-assembly in autonomous mobile robotics, the swarm-bot is the state of the art with respect to reliability, robustness, and speed.
On the computation of all global minimizers through particle swarm optimization
 IEEE Transactions on Evolutionary Computation
, 2004
Abstract

Cited by 79 (18 self)
This paper presents approaches for effectively computing all global minimizers of an objective function. The approaches include transformations of the objective function through the recently proposed deflection and stretching techniques, as well as a repulsion source at each detected minimizer. These techniques are incorporated into the particle swarm optimization (PSO) method, resulting in an efficient algorithm that is able to avoid previously detected solutions and thus detect all global minimizers of a function. Experimental results on benchmark problems originating from the fields of global optimization, dynamical systems, and game theory are reported, and conclusions are drawn. Index Terms—Deflection technique, detecting all minimizers, dynamical systems, Nash equilibria, particle swarm optimization (PSO), periodic orbits, stretching technique.
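The deflection idea can be sketched as an objective transform: each previously detected minimizer becomes a pole that repels subsequent searches. This is a sketch of the general idea only; the paper's exact formulation and parameter choices may differ:

```python
import math

def deflect(f, minimizers, lam=1.0):
    # Deflection transform: divide f by tanh(lam * ||x - x*||) for
    # every detected minimizer x*, so F blows up at each x* and a new
    # search (e.g. a fresh PSO run) is pushed away from it.
    # Assumes f is positive near the detected minimizers; shift f
    # upwards beforehand if necessary.
    def F(x):
        val = f(x)
        for xstar in minimizers:
            d = math.dist(x, xstar)
            val /= math.tanh(lam * d) if d > 0 else 1e-12
        return val
    return F
```

For example, with f(x) = (x^2 - 1)^2 + 0.1, which has minimizers at x = -1 and x = +1, deflecting the one found at -1 leaves +1 as the remaining attractor for the next run.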
Evolutionary tuning of multiple SVM parameters
 In Proc. of the 12th European Symposium on Artificial Neural Networks (ESANN 2004)
, 2004
Abstract

Cited by 74 (5 self)
The problem of model selection for support vector machines (SVMs) is considered. We propose an evolutionary approach to determine multiple SVM hyperparameters: the covariance matrix adaptation evolution strategy (CMA-ES) is used to determine the kernel from a parameterized kernel space and to control the regularization. Our method is applicable to optimizing non-differentiable kernel functions and arbitrary model selection criteria. We demonstrate on benchmark datasets that the CMA-ES improves on the results achieved by grid search even when applied to only a few hyperparameters. Further, we show that the CMA-ES is able to handle many more kernel parameters than grid search, and that tuning the scaling and rotation of Gaussian kernels can lead to better results than standard Gaussian kernels with a single bandwidth parameter. In particular, greater flexibility of the kernel can reduce the number of support vectors. Key words: support vector machines, model selection, evolutionary algorithms
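The setup can be sketched as follows: hyperparameters such as C and gamma are encoded in log-space (so mutations act multiplicatively and positivity is free), and an ES minimizes a validation-error objective. Everything below is a toy stand-in: the real method uses CMA-ES rather than this simple mutate-and-select loop, and the `validation_error` surface is a hypothetical smooth proxy for an actual SVM cross-validation run:

```python
import math
import random

def decode(genome):
    # Hypothetical log-space encoding of two SVM hyperparameters.
    return {"C": math.exp(genome[0]), "gamma": math.exp(genome[1])}

def validation_error(params):
    # Stand-in for k-fold cross-validation error of a trained SVM:
    # a smooth toy bowl with its (arbitrary) optimum at C=10, gamma=0.1.
    return ((math.log(params["C"]) - math.log(10.0)) ** 2
            + (math.log(params["gamma"]) - math.log(0.1)) ** 2)

def tune(iters=200, step=0.3, seed=0):
    # Elitist mutate-and-select loop over the log-space genome.
    rng = random.Random(seed)
    g = [0.0, 0.0]                      # start at C = gamma = 1
    best = validation_error(decode(g))
    for _ in range(iters):
        cand = [v + step * rng.gauss(0, 1) for v in g]
        err = validation_error(decode(cand))
        if err <= best:
            g, best = cand, err
    return decode(g), best
```

Because the objective need only be evaluable, not differentiable, the same loop works unchanged for non-differentiable kernels or discrete model selection criteria, which is the point the abstract makes.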
Adaptation in Evolutionary Computation: A Survey
 In Proceedings of the Fourth International Conference on Evolutionary Computation (ICEC 97)
, 1997
Abstract

Cited by 66 (6 self)
Adaptation of parameters and operators is one of the most important and promising areas of research in evolutionary computation: it tunes the algorithm to the problem while solving the problem. In this paper we develop a classification of adaptation on the basis of the mechanisms used, and the level at which adaptation operates within the evolutionary algorithm. The classification covers all forms of adaptation in evolutionary computation and suggests further research.
Towards a Theory of 'Evolution Strategies': Some Asymptotical Results from the (1,+λ)-Theory
 Evolutionary Computation
, 1993
Abstract

Cited by 55 (23 self)
A method for the determination of the progress rate and the probability of success for the Evolution Strategy (ES) is presented. The new method is based on the asymptotical behavior of the distribution and yields exact results in the case of infinite-dimensional parameter spaces. The technique is demonstrated for the (1,+λ)-ES using a spherical model, including noisy quality functions. The results are used to discuss the convergence behavior of the ES. Keywords: Evolution Strategy (ES), (1,+λ)-ES, spherical model, noisy fitness, theory, rate of progress, optimization, mutation-selection principle. 1 Introduction For more than 25 years the Evolution Strategy (ES) has been living as a method of growing interest for technical and numerical optimization problems [1, 2, 3, 4]. Simple as its algorithm is (cf. Section 1.1), the mathematical analysis of the ES, however, seems to be a very hard task, and up to now there are only a few exact results. E.g., the analysis of the opti...
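The flavor of these asymptotical results can be hinted at with the standard normalized progress rate of the (1,λ)-ES on the noise-free sphere model, stated here from the general ES literature in this normalization, not quoted from the paper itself:

```latex
% Normalized progress rate in the limit of infinite search space
% dimension N, with R the distance to the optimum and c_{1,lambda}
% the progress coefficient (the expectation of the largest of
% lambda standard normal variates):
\varphi^{*} \;=\; c_{1,\lambda}\,\sigma^{*} \;-\; \frac{\sigma^{*2}}{2},
\qquad
\sigma^{*} \;=\; \frac{\sigma N}{R}
```

The linear gain term and the quadratic loss term trade off against each other, which is what makes an optimal mutation strength, and hence step size control, exist at all.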
Natural Evolution Strategies
Abstract

Cited by 41 (22 self)
This paper presents Natural Evolution Strategies (NES), a novel algorithm for performing real-valued 'black box' function optimization: optimizing an unknown objective function where algorithm-selected function measurements constitute the only information accessible to the method. Natural Evolution Strategies search the fitness landscape using a multivariate normal distribution with a self-adapting mutation matrix to generate correlated mutations in promising regions. NES shares this property with Covariance Matrix Adaptation (CMA), an evolution strategy (ES) which has been shown to perform well on a variety of high-precision optimization tasks. The Natural Evolution Strategies algorithm, however, is simpler, less ad hoc, and more principled. Self-adaptation of the mutation matrix is derived using a Monte Carlo estimate of the natural gradient towards better expected fitness. By following the natural gradient instead of the 'vanilla' gradient, we can ensure efficient update steps while preventing early convergence due to overly greedy updates, resulting in reduced sensitivity to local suboptima. We show that NES has competitive performance with CMA on unimodal tasks, while outperforming it on several multimodal tasks that are rich in deceptive local optima.
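The Monte Carlo gradient estimate at the heart of this family of methods can be sketched in one dimension. Note the caveat: this sketch follows the plain 'vanilla' gradient of expected fitness with respect to the distribution mean; NES proper adapts a full mutation matrix and follows the natural gradient instead:

```python
import random

def nes_step(f, mu, sigma, n=100, lr=0.1, seed=0):
    # Score-function (log-likelihood) gradient estimate:
    #   grad_mu E[f(x)] = E[ f(x) * (x - mu) / sigma^2 ],
    # with x ~ N(mu, sigma^2), followed by one descent step on mu.
    rng = random.Random(seed)
    xs = [rng.gauss(mu, sigma) for _ in range(n)]
    baseline = sum(f(x) for x in xs) / n      # variance reduction
    grad = sum((f(x) - baseline) * (x - mu) / sigma ** 2
               for x in xs) / n
    return mu - lr * grad                     # minimize: step downhill
```

Iterating this on f(x) = (x - 3)^2 drags the mean of the search distribution towards 3 using only sampled function values, which is exactly the 'black box' setting described above.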
Noisy optimization with evolution strategies
 SIAM Journal on Optimization
Abstract

Cited by 39 (6 self)
Evolution strategies are general, nature-inspired heuristics for search and optimization. Supported both by empirical evidence and by recent theoretical findings, there is a common belief that evolution strategies are robust and reliable, and frequently they are the method of choice if neither derivatives of the objective function are at hand nor differentiability and numerical accuracy can be assumed. However, despite their widespread use, there is little exchange between members of the "classical" optimization community and people working in the field of evolutionary computation. It is our belief that both sides would benefit from such an exchange. In this paper, we present a brief outline of evolution strategies and discuss some of their properties in the presence of noise. We then empirically demonstrate that for a simple but nonetheless nontrivial noisy objective function, an evolution strategy outperforms other optimization algorithms designed to cope with noise. The environment in which the algorithms are tested is deliberately chosen to afford a transparency of results that reveals the strengths and shortcomings of the strategies, making it possible to draw conclusions with regard to the design of better optimization algorithms for noisy environments.