Results 1–10 of 15
Informed operators: Speeding up genetic-algorithm-based design optimization using reduced models
In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO), 2000
Abstract

Cited by 23 (9 self)
In this paper we describe a method for improving genetic-algorithm-based optimization using informed genetic operators. The idea is to make the genetic operators, such as mutation and crossover, more informed using reduced models. In every place where a random choice is made, for example when a point is mutated, instead of generating just one random mutation we generate several, rank them using a reduced model, then take the best to be the result of the mutation. The proposed method is particularly suitable for search spaces with expensive evaluation functions, such as arise in engineering design. Empirical results in several engineering design domains demonstrate that the proposed method can significantly speed up the GA optimizer.
1 Introduction
This paper concerns the application of Genetic Algorithms (GAs) in realistic engineering design domains. In such domains a design is represented by a number of continuous design parameters, so that potential solutions are vec...
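The informed-mutation idea summarized in this abstract (draw several candidate mutations, rank them with a cheap reduced model, keep the best) can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the name `reduced_model`, the Gaussian perturbation, and the candidate count are all assumptions for the sketch.

```python
import random

def informed_mutation(point, reduced_model, n_candidates=5, sigma=0.1):
    """Informed mutation: instead of accepting one random mutation,
    draw several candidate mutations, rank them with a cheap reduced
    model, and return the best-ranked one (lower value = better)."""
    candidates = [
        [x + random.gauss(0.0, sigma) for x in point]
        for _ in range(n_candidates)
    ]
    return min(candidates, key=reduced_model)
```

Note that the expensive true fitness is never called here; only the inexpensive reduced model pays the ranking cost, which is why the scheme pays off precisely when real evaluations are expensive.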
Constrained Multi-Objective Optimization Using Steady State Genetic Algorithms
In Proceedings of the Genetic and Evolutionary Computation Conference, 2003
Abstract

Cited by 16 (0 self)
In this paper we propose two novel approaches for solving constrained multi-objective optimization problems using steady-state GAs.
Surrogate-Based Evolutionary Algorithm for Engineering Design Optimization
In Proceedings of the Eighth International Conference on Cybernetics, Informatics and Systemics (ICCIS 2005), ISBN
Abstract

Cited by 5 (4 self)
Abstract—Optimization is often a critical issue for most system design problems. Evolutionary Algorithms are population-based, stochastic search techniques, widely used as efficient global optimizers. However, finding optimal solutions to complex, high-dimensional, multimodal problems often requires highly computationally expensive function evaluations and is hence practically prohibitive. The Dynamic Approximate Fitness based Hybrid EA (DAFHEA) model presented in our earlier work [14] reduced computation time by controlled use of meta-models to partially replace the actual function evaluation with approximate function evaluation. However, the underlying assumption in DAFHEA is that the training samples for the meta-model are generated from a single uniform model. Situations such as model formation involving variable input dimensions and noisy data certainly cannot be covered by this assumption. In this paper we present an enhanced version of DAFHEA that incorporates a multiple-model based learning approach for the SVM approximator. DAFHEA-II (the enhanced version of the DAFHEA framework) also overcomes the high computational expense involved with the additional clustering requirements of the original DAFHEA framework. The proposed framework has been tested on several benchmark functions, and the empirical results illustrate the advantages of the proposed technique.
Keywords—Evolutionary algorithm, Fitness function, Optimization, Meta-model, Stochastic method.
Comparison of methods for developing dynamic reduced models for design optimization
In Proceedings of the Congress on Evolutionary Computation (CEC'2002), 2002
Abstract

Cited by 4 (1 self)
Abstract In this paper we compare three methods for forming reduced models to speed up genetic-algorithm-based optimization. The methods work by forming functional approximations of the fitness function, which are used to speed up the GA optimization by making the genetic operators more informed. Empirical results in several engineering design domains are presented.
Using data-mining techniques to help metaheuristics: A short survey
In Hybrid Metaheuristics, Third International Workshop, HM 2006, volume 4030 of Lecture Notes in Computer Science, 2006
Abstract

Cited by 4 (0 self)
Abstract. Hybridizing metaheuristic approaches has become a common way to improve the efficiency of optimization methods. Many hybridizations deal with the combination of several optimization methods. In this paper we are interested in another type of hybridization, where data-mining approaches are combined within an optimization process. Hence, we propose to study the benefits of combining metaheuristics and data-mining through a short survey that enumerates the different opportunities for such combinations, based on examples from the literature.
Meta-Model-Based EA for Complex Optimization
Abstract

Cited by 4 (1 self)
Evolutionary Algorithms are population-based, stochastic search techniques, widely used as efficient global optimizers. However, many real-life optimization problems require finding optimal solutions to complex, high-dimensional, multimodal problems involving computationally very expensive fitness function evaluations. Use of evolutionary algorithms in such problem domains is thus practically prohibitive. An attractive alternative is to build meta-models, or approximations of the actual fitness functions, to be evaluated instead. These meta-models are orders of magnitude cheaper to evaluate than the actual function. Many regression and interpolation tools are available to build such meta-models. This paper briefly discusses the architectures and use of such meta-modeling tools in an evolutionary optimization context. We further present two evolutionary algorithm frameworks which involve the use of meta-models for fitness function evaluation. The first framework, the Dynamic Approximate Fitness based Hybrid EA (DAFHEA) model [14], reduces computation time by controlled use of meta-models (in this case an approximate model generated by Support Vector Machine regression) to partially replace the actual function evaluation with approximate function evaluation. However, the underlying assumption in DAFHEA is that the training samples for the meta-model are generated from a single uniform model; this does not take into account uncertain scenarios involving noisy fitness functions. The second model, DAFHEA-II, an enhanced version of the original DAFHEA framework, incorporates a multiple-model based learning approach for the support vector machine approximator to handle noisy functions [15]. Empirical results obtained by evaluating the frameworks on several benchmark functions demonstrate their efficiency.
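The "controlled use of meta-models" idea these abstracts share can be illustrated with a rough sketch (this is a generic surrogate-assisted scheme, not DAFHEA's actual algorithm; the function names and the `exact_fraction` parameter are assumptions for the sketch): rank the population with a cheap surrogate, then spend expensive true evaluations only on the most promising fraction.

```python
def surrogate_assisted_ranking(population, expensive_fitness, surrogate,
                               exact_fraction=0.25):
    """Rank a population with a cheap surrogate model, then re-evaluate
    only the top fraction with the expensive true fitness function
    (minimization: lower score is better).

    Returns a list of (individual, score) pairs: exact scores for the
    elite, surrogate scores for the rest."""
    ranked = sorted(population, key=surrogate)
    n_exact = max(1, int(exact_fraction * len(population)))
    elite = [(ind, expensive_fitness(ind)) for ind in ranked[:n_exact]]
    rest = [(ind, surrogate(ind)) for ind in ranked[n_exact:]]
    return elite + rest
```

With a population of 40 and `exact_fraction=0.25`, only 10 expensive evaluations are made per generation instead of 40, which is the source of the speed-ups these frameworks report.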
Reduced computation for evolutionary optimization in noisy environment
In Proc. GECCO, 2008
Abstract

Cited by 3 (2 self)
ABSTRACT Evolutionary Algorithms' (EAs') application to real-world optimization problems often involves expensive fitness function evaluation. Naturally this has a crippling effect on the performance of any population-based search technique such as an EA. Estimating the fitness of individuals instead of actually evaluating them is a workable approach to deal with this situation.
INTRODUCTION Many real-world optimization problems involve very expensive function evaluation, making it impractical for a population-based search technique such as an EA to be used in such problem domains. In such problems, the runtime for a single function evaluation can range from a fraction of a second to hours of supercomputer time. A suitable alternative is to use approximation instead of actual function evaluation to substantially reduce the computation time [8, 10, 11]. The use of approximate models to speed up optimization dates all the way back to the sixties. The DAFHEA (dynamic approximate fitness based hybrid evolutionary algorithm) framework proposed in our earlier research ...
Expensive Optimization, Uncertain Environment: An EA-Based Solution
Abstract

Cited by 1 (0 self)
Real-life optimization problems often require finding optimal solutions to complex, high-dimensional, multimodal problems involving computationally very expensive fitness function evaluations. Use of any population-based iterative technique such as an evolutionary algorithm in such problem domains is thus practically prohibitive. A feasible alternative is to build surrogates, or approximations of the actual fitness functions, to be evaluated instead. Naturally these surrogate or meta-models are orders of magnitude cheaper to evaluate than the actual function. This paper presents two evolutionary algorithm frameworks which involve surrogate-based fitness function evaluation. The first framework, the Dynamic Approximate Fitness based Hybrid EA (DAFHEA) model [1], reduces computation time by controlled use of meta-models (in this case an approximate model generated by Support Vector Machine regression) to partially replace the actual function evaluation with approximate function evaluation. However, the underlying assumption in DAFHEA is that the training samples for the meta-model are generated from a single uniform model; this does not take into account problem domains involving an uncertain environment. The second model, DAFHEA-II, an enhanced version of the original DAFHEA framework, incorporates a multiple-model based learning approach for the support vector machine approximator to handle an uncertain environment [2]. Empirical evaluation results are presented based on application of the frameworks to commonly used benchmark functions.