Results 1 - 10 of 14
Informed operators: Speeding up genetic-algorithm-based design optimization using reduced models
 In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO), 2000
Abstract

Cited by 18 (8 self)
In this paper we describe a method for improving genetic-algorithm-based optimization using informed genetic operators. The idea is to make the genetic operators, such as mutation and crossover, more informed using reduced models. In every place where a random choice is made, for example when a point is mutated, instead of generating just one random mutation we generate several, rank them using a reduced model, and take the best to be the result of the mutation. The proposed method is particularly suitable for search spaces with expensive evaluation functions, such as arise in engineering design. Empirical results in several engineering design domains demonstrate that the proposed method can significantly speed up the GA optimizer.

1 Introduction

This paper concerns the application of Genetic Algorithms (GAs) in realistic engineering design domains. In such domains a design is represented by a number of continuous design parameters, so that potential solutions are vec...
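The informed-mutation idea in this abstract can be sketched in a few lines (a minimal illustration, assuming real-valued design vectors, Gaussian perturbation, and minimization; `reduced_model` stands in for any cheap approximation of the expensive fitness and is not part of the original paper's code):

```python
import random

def informed_mutation(individual, reduced_model, n_candidates=5, sigma=0.1):
    """Generate several candidate mutations, rank them with the cheap
    reduced model, and return the best (assumed minimization)."""
    candidates = [
        [x + random.gauss(0.0, sigma) for x in individual]
        for _ in range(n_candidates)
    ]
    # The expensive true fitness is never called here; only the surrogate.
    return min(candidates, key=reduced_model)

# Example with a toy surrogate (the sphere function):
surrogate = lambda v: sum(x * x for x in v)
child = informed_mutation([1.0, -2.0, 0.5], surrogate, n_candidates=8)
```

The same best-of-k filtering can wrap crossover or initialization: wherever the GA would make one random choice, draw several and let the surrogate pick.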
Constrained Multi-Objective Optimization Using Steady State Genetic Algorithms
 In Proceedings of the Genetic and Evolutionary Computation Conference, 2003
Abstract

Cited by 10 (0 self)
In this paper we propose two novel approaches for solving constrained multi-objective optimization problems using steady-state GAs.
Surrogate based Evolutionary Algorithm for Engineering Design Optimization
 In Proceedings of the Eighth International Conference on Cybernetics, Informatics and Systemics (ICCIS 2005), ISBN
Abstract

Cited by 4 (3 self)
Abstract—Optimization is often a critical issue for most system design problems. Evolutionary Algorithms are population-based, stochastic search techniques, widely used as efficient global optimizers. However, finding optimal solutions to complex, high-dimensional, multimodal problems often requires highly computationally expensive function evaluations and hence is practically prohibitive. The Dynamic Approximate Fitness based Hybrid EA (DAFHEA) model presented in our earlier work [14] reduced computation time by controlled use of metamodels to partially replace the actual function evaluation with approximate function evaluation. However, the underlying assumption in DAFHEA is that the training samples for the metamodel are generated from a single uniform model. Situations such as model formation involving variable input dimensions and noisy data cannot be covered by this assumption. In this paper we present an enhanced version of DAFHEA that incorporates a multiple-model based learning approach for the SVM approximator. DAFHEA-II (the enhanced version of the DAFHEA framework) also overcomes the high computational expense involved with the additional clustering requirements of the original DAFHEA framework. The proposed framework has been tested on several benchmark functions, and the empirical results illustrate the advantages of the proposed technique.

Keywords—Evolutionary algorithm, Fitness function, Optimization, Metamodel, Stochastic method.
Comparison of methods for developing dynamic reduced models for design optimization
 In Proceedings of the Congress on Evolutionary Computation (CEC 2002), 2002
Abstract

Cited by 2 (1 self)
In this paper we compare three methods for forming reduced models to speed up genetic-algorithm-based optimization. The methods work by forming functional approximations of the fitness function, which are used to speed up the GA optimization by making the genetic operators more informed. Empirical results in several engineering design domains are presented.
Meta Model Based EA for Complex Optimization
Abstract

Cited by 2 (0 self)
Evolutionary Algorithms are population-based, stochastic search techniques, widely used as efficient global optimizers. However, many real-life optimization problems require finding optimal solutions to complex, high-dimensional, multimodal problems involving computationally very expensive fitness function evaluations. Use of evolutionary algorithms in such problem domains is thus practically prohibitive. An attractive alternative is to build meta-models, i.e. approximations of the actual fitness functions to be evaluated. These meta-models are orders of magnitude cheaper to evaluate than the actual function. Many regression and interpolation tools are available to build such meta-models. This paper briefly discusses the architectures and use of such meta-modeling tools in an evolutionary optimization context. We further present two evolutionary algorithm frameworks which involve the use of meta-models for fitness function evaluation. The first framework, the Dynamic Approximate Fitness based Hybrid EA (DAFHEA) model [14], reduces computation time by controlled use of meta-models (in this case an approximate model generated by Support Vector Machine regression) to partially replace the actual function evaluation with approximate function evaluation. However, the underlying assumption in DAFHEA is that the training samples for the meta-model are generated from a single uniform model. This does not take into account uncertain scenarios involving noisy fitness functions. The second model, DAFHEA-II, an enhanced version of the original DAFHEA framework, incorporates a multiple-model based learning approach for the support vector machine approximator to handle noisy functions [15]. Empirical results obtained by evaluating the frameworks using several benchmark functions demonstrate their efficiency.
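The "controlled use of meta-models" that these abstracts describe can be sketched for a single generation as follows. This is a simplified stand-in, not the published DAFHEA algorithm: the control strategy (spend the exact budget on the surrogate's top-ranked individuals), the function names, and the `exact_fraction` parameter are all assumptions for illustration, and minimization is assumed throughout:

```python
def surrogate_assisted_step(population, true_fitness, surrogate_fit,
                            exact_fraction=0.2):
    """Score one generation with mixed exact/approximate evaluation:
    rank everyone with the cheap surrogate, evaluate the most promising
    fraction with the expensive true fitness, and use surrogate values
    for the rest."""
    n_exact = max(1, int(exact_fraction * len(population)))
    ranked = sorted(population, key=surrogate_fit)
    scores = {}
    for ind in ranked[:n_exact]:
        scores[tuple(ind)] = true_fitness(ind)   # expensive call, few of these
    for ind in ranked[n_exact:]:
        scores[tuple(ind)] = surrogate_fit(ind)  # cheap call, many of these
    return scores
```

With `exact_fraction=0.2`, a population of 50 costs 10 expensive evaluations per generation instead of 50; the exact values can also be fed back as fresh training data to keep the surrogate current.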
Expensive Optimization, Uncertain Environment: An EA-Based Solution
Abstract

Cited by 1 (0 self)
Real-life optimization problems often require finding optimal solutions to complex, high-dimensional, multimodal problems involving computationally very expensive fitness function evaluations. Use of any population-based iterative technique, such as an evolutionary algorithm, in such problem domains is thus practically prohibitive. A feasible alternative is to build surrogates, i.e. approximations of the actual fitness functions to be evaluated. Naturally, these surrogate or meta-models are orders of magnitude cheaper to evaluate than the actual function. This paper presents two evolutionary algorithm frameworks which involve surrogate-based fitness function evaluation. The first framework, the Dynamic Approximate Fitness based Hybrid EA (DAFHEA) model [1], reduces computation time by controlled use of meta-models (in this case an approximate model generated by Support Vector Machine regression) to partially replace the actual function evaluation with approximate function evaluation. However, the underlying assumption in DAFHEA is that the training samples for the meta-model are generated from a single uniform model. This does not take into account problem domains involving an uncertain environment. The second model, DAFHEA-II, an enhanced version of the original DAFHEA framework, incorporates a multiple-model based learning approach for the support vector machine approximator to handle uncertain environments [2]. Empirical evaluation results are presented based on application of the frameworks to commonly used benchmark functions.
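The papers above fit their surrogates with SVM regression; to keep an example dependency-free, the same role can be played by any cheap interpolator built from cached exact evaluations. The sketch below uses inverse-distance weighting as a generic stand-in (this substitute, the function name, and the `power` parameter are assumptions for illustration, not the DAFHEA implementation):

```python
import math

def idw_surrogate(samples, power=2.0):
    """Build a cheap surrogate from cached exact evaluations using
    inverse-distance weighting. `samples` is a list of (point, value)
    pairs obtained from the expensive true fitness."""
    def predict(x):
        num, den = 0.0, 0.0
        for pt, val in samples:
            d = math.dist(x, pt)
            if d == 0.0:
                return val  # exact hit: return the cached value directly
            w = 1.0 / d ** power
            num += w * val
            den += w
        return num / den
    return predict
```

Each generation's exact evaluations can simply be appended to `samples`, so the surrogate sharpens around the regions the EA is currently exploring.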
Meta Model Based EA for Complex Optimization
 In International Journal of Computational Intelligence, 4(1), 2008
Abstract
Evolutionary Algorithms are population-based, stochastic search techniques, widely used as efficient global optimizers. However, many real-life optimization problems require finding optimal solutions to complex, high-dimensional, multimodal problems involving computationally very expensive fitness function evaluations. Use of evolutionary algorithms in such problem domains is thus practically prohibitive. An attractive alternative is to build meta-models, i.e. approximations of the actual fitness functions to be evaluated. These meta-models are orders of magnitude cheaper to evaluate than the actual function. Many regression and interpolation tools are available to build such meta-models. This paper briefly discusses the architectures and use of such meta-modeling tools in an evolutionary optimization context. We further present two evolutionary algorithm frameworks which involve the use of meta-models for fitness function evaluation. The first framework, the Dynamic Approximate Fitness based Hybrid EA (DAFHEA) model [14], reduces computation time by controlled use of meta-models (in this case an approximate model generated by Support Vector Machine regression) to partially replace the actual function evaluation with approximate function evaluation. However, the underlying assumption in DAFHEA is that the training samples for the meta-model are generated from a single uniform model. This does not take into account uncertain scenarios involving noisy fitness functions. The second model, DAFHEA-II, an enhanced version of the original DAFHEA framework, incorporates a multiple-model based learning approach for the support vector machine approximator to handle noisy functions [15]. Empirical results obtained by evaluating the frameworks using several benchmark functions demonstrate their efficiency.