Results 1-10 of 106
Multi-Objective Genetic Algorithms: Problem Difficulties and Construction of Test Problems
 Evolutionary Computation
, 1999
Cited by 167 (12 self)
Abstract:
In this paper, we study the problem features that may cause a multi-objective genetic algorithm (GA) difficulty in converging to the true Pareto-optimal front. Identification of such features helps us develop difficult test problems for multi-objective optimization. Multi-objective test problems are constructed from single-objective optimization problems, thereby allowing known difficult features of single-objective problems (such as multimodality, isolation, or deception) to be directly transferred to the corresponding multi-objective problem. In addition, test problems having features specific to multi-objective optimization are also constructed. More importantly, these difficult test problems will enable researchers to test their algorithms for specific aspects of multi-objective optimization.
Keywords: genetic algorithms, multi-objective optimization, niching, Pareto-optimality, problem difficulties, test problems.
1 Introduction. After a decade since the pioneering wor...
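The construction this abstract describes, lifting a single-objective function into a two-objective test problem, can be sketched as follows. This is a minimal illustration in the spirit of the paper's tunable test problems; the specific g and h below are assumptions chosen for the example, not the paper's exact definitions:

```python
import math

def f1(x):
    # First objective: depends only on the first decision variable.
    return x[0]

def g(x):
    # Single-objective "difficulty" function over the remaining variables;
    # making g multimodal or deceptive transfers that difficulty to the
    # multi-objective problem (illustrative linear g here).
    return 1.0 + 9.0 * sum(x[1:]) / (len(x) - 1)

def h(F1, G):
    # Shape function controlling the form of the Pareto-optimal front
    # (this choice yields a convex front).
    return 1.0 - math.sqrt(F1 / G)

def f2(x):
    # Second objective: f2 = g(x) * h(f1(x), g(x)).
    return g(x) * h(f1(x), g(x))

# Where g attains its minimum (x[1:] all zero, so g = 1),
# the front is f2 = 1 - sqrt(f1).
x_front = [0.5, 0.0, 0.0, 0.0]
print(f1(x_front), f2(x_front))
```

Tuning g and h independently is what lets such generators dial in specific difficulties (multimodality in g, front shape in h) one at a time.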
An Efficient Constraint Handling Method for Genetic Algorithms
 Computer Methods in Applied Mechanics and Engineering
, 1998
Cited by 137 (12 self)
Abstract:
Many real-world search and optimization problems involve inequality and/or equality constraints and are thus posed as constrained optimization problems. In trying to solve constrained optimization problems using genetic algorithms (GAs) or classical optimization methods, penalty function methods have been the most popular approach because of their simplicity and ease of implementation. However, since the penalty function approach is generic and applicable to any type of constraint (linear or nonlinear), its performance is not always satisfactory. Thus, researchers have developed sophisticated penalty functions specific to the problem at hand and the search algorithm used for optimization. However, the most difficult aspect of the penalty function approach is to find the appropriate penalty parameters needed to guide the search towards the constrained optimum. In this paper, the GA's population-based approach and its ability to make pairwise comparisons in the tournament selection operator are explo...
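The abstract is truncated, but a penalty-parameter-free pairwise comparison of the kind it alludes to can be sketched as follows. This is our own illustrative reconstruction using feasibility-first tournament rules; the helper names and the dict representation are assumptions:

```python
def constraint_violation(g_values):
    # Total violation of inequality constraints g_i(x) <= 0.
    # (Illustrative helper; equality constraints could be folded in
    # as |h(x)| - eps <= 0.)
    return sum(max(0.0, g) for g in g_values)

def tournament_winner(a, b):
    # a, b: dicts with 'f' (objective, minimized) and 'g' (constraint values).
    # Comparison needs no penalty parameter:
    #   1. a feasible solution beats an infeasible one;
    #   2. between two feasible solutions, the better objective wins;
    #   3. between two infeasible solutions, the smaller violation wins.
    va, vb = constraint_violation(a['g']), constraint_violation(b['g'])
    if va == 0.0 and vb == 0.0:
        return a if a['f'] <= b['f'] else b
    if va == 0.0:
        return a
    if vb == 0.0:
        return b
    return a if va <= vb else b

feasible = {'f': 3.0, 'g': [-1.0, -0.5]}
infeasible = {'f': 1.0, 'g': [2.0, -0.5]}
print(tournament_winner(feasible, infeasible)['f'])  # feasible wins despite worse f
```

Because solutions are only ever compared pairwise, objective values and violations never need to be added together, which is exactly what removes the need for a penalty parameter.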
An Indexed Bibliography of Genetic Algorithms in Power Engineering
, 1995
Cited by 79 (10 self)
Abstract:
s: Jan. 1992 - Dec. 1994
- CTI: Current Technology Index: Jan./Feb. 1993 - Jan./Feb. 1994
- DAI: Dissertation Abstracts International: Vol. 53 No. 1 - Vol. 55 No. 4 (1994)
- EEA: Electrical & Electronics Abstracts: Jan. 1991 - Dec. 1994
- P: Index to Scientific & Technical Proceedings: Jan. 1986 - Feb. 1995 (except Nov. 1994)
- EI A: The Engineering Index Annual: 1987 - 1992
- EI M: The Engineering Index Monthly: Jan. 1993 - Dec. 1994
The following GA researchers have already kindly supplied their complete autobibliographies and/or proofread references to their papers: Dan Adler, Patrick Argos, Jarmo T. Alander, James E. Baker, Wolfgang Banzhaf, Ralf Bruns, I. L. Bukatova, Thomas Back, Yuval Davidor, Dipankar Dasgupta, Marco Dorigo, Bogdan Filipic, Terence C. Fogarty, David B. Fogel, Toshio Fukuda, Hugo de Garis, Robert C. Glen, David E. Goldberg, Martina Gorges-Schleuter, Jeffrey Horn, Aristides T. Hatjimihail, Mark J. Jakiela, Richard S. Judson, Akihiko Konaga...
Evolutionary Algorithms for Multi-Criterion Optimization in Engineering Design
, 1999
Cited by 43 (0 self)
Abstract:
In this paper, we briefly outline the principles of multi-objective optimization. Thereafter, we discuss why classical search and optimization methods are not adequate for multi-criterion optimization by discussing the working of two popular methods. We then outline several evolutionary methods for handling multi-criterion optimization problems. Of them, we discuss one implementation (the non-dominated sorting GA, or NSGA [38]) in somewhat greater detail. Thereafter, we demonstrate the working of the evolutionary methods by applying NSGA to three test problems having constraints and a discontinuous Pareto-optimal region. We also show the efficacy of evolutionary algorithms in engineering design problems by solving a welded beam design problem. The results show that evolutionary methods can find widely different yet near-Pareto-optimal solutions in such problems. Based on the above studies, this paper also suggests a number of immediate future studies which would make this emerging field more mature and applicable in practice.
1.2 PRINCIPLES OF MULTI-CRITERION OPTIMIZATION
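Non-dominated sorting, at the heart of the NSGA implementation discussed above, rests on the Pareto-dominance relation. A minimal sketch of dominance and of extracting a single non-dominated front (an illustrative implementation, not the paper's code):

```python
def dominates(a, b):
    # a dominates b (minimization): a is no worse in every objective
    # and strictly better in at least one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated_front(points):
    # The first front: points not dominated by any other point.
    # (NSGA repeats this, peeling off successive fronts and ranking them.)
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

pts = [(1.0, 4.0), (2.0, 2.0), (3.0, 3.0), (4.0, 1.0)]
print(nondominated_front(pts))  # (3.0, 3.0) is dominated by (2.0, 2.0)
```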
A Combined Genetic Adaptive Search (GeneAS) for Engineering Design
 Computer Science and Informatics
, 1996
Cited by 28 (5 self)
Abstract:
In this paper, a flexible yet efficient algorithm for solving engineering design optimization problems is presented. The algorithm is developed based on both binary-coded and real-coded genetic algorithms (GAs). Since both GAs are used, problems involving discrete, continuous, and zero-one variables are handled quite efficiently. The algorithm restricts its search only to the permissible values of the variables, thereby reducing the search effort in converging to the optimum solution. The efficiency and ease of application of the proposed method are demonstrated by solving three different mechanical component design problems borrowed from the optimization literature. The proposed technique is compared with binary-coded genetic algorithms, the augmented Lagrange multiplier method, the branch-and-bound method, and the Hooke-Jeeves pattern search method. In all cases, the solutions obtained using the proposed technique are superior to those obtained with the other methods. These results are encour...
Understanding interactions among genetic algorithm parameters
 in Foundations of Genetic Algorithms 5
, 1999
Cited by 26 (3 self)
Abstract:
Genetic algorithms (GAs) are multi-dimensional and stochastic search methods involving complex interactions among their parameters. For the last two decades, researchers have been trying to understand the mechanics of GA parameter interactions by using various techniques: careful 'functional' decomposition of parameter interactions, empirical studies, and Markov chain analysis. Although the complexities in these interactions are getting clearer with such analyses, it still remains an open question in the mind of a newcomer to the field, or of a GA practitioner, as to what values of GA parameters (such as population size, choice of GA operators, operator probabilities, and others) to use in an arbitrary problem. In this paper, we investigate the performance of simple tripartite GAs on a number of simple to complex test problems from a practical standpoint. Since in a real-world situation the overall time to run a GA is more or less dominated by the time consumed by objective function evaluations, we compare different GAs for a fixed number of function evaluations. Based on probability calculations and simulation results, it is observed that for solving simple problems (unimodal or small-modality problems) the mutation operator plays an important role, although GAs with the crossover operator alone can also solve these problems. However, the two operators (when applied alone) have two different working zones for the population size. For complex problems involving massive multimodality and misleadingness (deception), the crossover operator is the key search operator. Based on these studies, it is recommended that when in doubt, the use of the crossover operator with an adequate population size is a reliable approach.
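The kind of comparison described above, operator configurations judged under a fixed budget of function evaluations, can be sketched with a toy steady-state GA on a unimodal bit-counting problem. Everything here (population size, budget, replacement scheme) is an assumption for illustration, not the paper's experimental setup:

```python
import random

def onemax(bits):
    # Unimodal test function: number of ones in the string.
    return sum(bits)

def run_ga(n=30, pop_size=20, budget=2000,
           use_crossover=True, use_mutation=True, seed=0):
    # Steady-state GA with a fixed budget of objective-function
    # evaluations, so different operator setups get equal compute.
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    fits = [onemax(ind) for ind in pop]
    evals = pop_size
    while evals < budget:
        # Binary tournament selection of two parents.
        def pick():
            i, j = rng.randrange(pop_size), rng.randrange(pop_size)
            return pop[i] if fits[i] >= fits[j] else pop[j]
        p1, p2 = pick(), pick()
        if use_crossover:
            cut = rng.randrange(1, n)           # one-point crossover
            child = p1[:cut] + p2[cut:]
        else:
            child = list(p1)
        if use_mutation:
            child = [b ^ (rng.random() < 1.0 / n) for b in child]  # bit-flip
        f = onemax(child)
        evals += 1
        # Replace the worst individual (steady-state replacement).
        w = fits.index(min(fits))
        pop[w], fits[w] = child, f
    return max(fits)

print(run_ga(use_crossover=False))  # mutation-only GA
print(run_ga(use_mutation=False))   # crossover-only GA
```

With this harness one can vary the population size per operator setup and observe the distinct "working zones" the abstract refers to.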
Nonlinear goal programming using multiobjective genetic algorithms
 Journal of the Operational Research Society
, 2001
Cited by 15 (0 self)
Abstract:
Goal programming is a technique often used in engineering design activities, primarily to find a compromise solution which will simultaneously satisfy a number of design goals. In solving goal programming problems, classical methods reduce the multiple goal-attainment problem to a single objective of minimizing a weighted sum of deviations from goals. Moreover, in tackling nonlinear goal programming problems, classical methods use successive linearization techniques, which are sensitive to the chosen starting solution. In this paper, we pose the goal programming problem as a multi-objective optimization problem of minimizing the deviations from the individual goals. This procedure eliminates the need for the extra constraints required by classical formulations and also eliminates the need for any user-defined weight factor for each goal. The proposed technique can also solve goal programming problems having a non-convex trade-off region, which are difficult to solve using classical methods. The efficacy of the proposed method is demonstrated by solving a number of nonlinear test problems and by solving an engineering design problem. The results suggest that the proposed approach is a unique, effective, and practical tool for solving goal programming problems.
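The recasting the abstract describes, one deviation-minimization objective per goal instead of a weighted sum, can be sketched as follows. The goals and objective functions below are hypothetical examples, not taken from the paper:

```python
def deviation(f_value, target, kind):
    # Deviation from a goal, by goal type:
    #   "le": goal f(x) <= target, "ge": goal f(x) >= target, "eq": f(x) = target.
    if kind == "le":
        return max(0.0, f_value - target)
    if kind == "ge":
        return max(0.0, target - f_value)
    return abs(f_value - target)  # "eq"

# Two hypothetical design goals on a single decision variable x:
#   goal 1: f1(x) = x**2       should be <= 4
#   goal 2: f2(x) = (x - 3)**2 should be <= 1
# Each goal becomes one objective to minimize; a multi-objective GA
# then searches the trade-off among deviations with no weight factors.
def deviations(x):
    return (deviation(x ** 2, 4.0, "le"),
            deviation((x - 3.0) ** 2, 1.0, "le"))

print(deviations(2.0))  # x = 2 meets both goals: (0.0, 0.0)
```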
Test-Case Generator for Nonlinear Continuous Parameter Optimization Techniques
 IEEE Transactions on Evolutionary Computation
, 2000
Cited by 13 (0 self)
Abstract:
The experimental results reported in many papers suggest that making an appropriate a priori choice of an evolutionary method for a nonlinear parameter optimization problem remains an open question. It seems that the most promising approach at this stage of research is experimental, involving the design of a scalable test suite of constrained optimization problems in which many features could be easily tuned. Then it would be possible to evaluate the merits and drawbacks of the available methods as well as test new methods efficiently. In this paper we propose such a test-case generator for constrained parameter optimization techniques. This generator is capable of creating various test problems with different characteristics, like (1) problems with different relative sizes of the feasible region in the search space; (2) problems with different numbers and types of constraints; (3) problems with a convex or non-convex objective function, possibly with multiple optima; (4) problems with highly non-...
Differential Evolution for the Optimal Design of Heat Exchangers
Cited by 13 (10 self)
Abstract:
This paper presents the application of Differential Evolution (DE) for the optimal design of shell-and-tube heat exchangers. A primary objective in heat exchanger (HE) design is the estimation of the minimum heat transfer area required for a given heat duty, as it governs the overall cost of the heat exchanger. However, a large number of discrete combinations of the design variables are possible. Hence the design engineer needs an efficient strategy for searching for the global minimum heat exchanger cost. In the present study, for the first time, DE, an improved version of Genetic Algorithms (GAs), has been successfully applied to 161,280 design configurations obtained by varying the design variables: tube outer diameter, tube pitch, tube length, number of tube passes, baffle spacing, and baffle cut. Bell's method is used to find the heat transfer area for a given design configuration. For the case study taken up, it is observed that DE, an exceptionally simple evolution strategy, is significantly faster compared to GA and is also much more likely to find a function's true global optimum.
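The generic DE scheme underlying the study above combines mutation by scaled vector differences, binomial crossover, and greedy one-to-one selection. A textbook-style sketch of one DE/rand/1/bin generation on a simple test objective (not the paper's heat-exchanger model, whose variables are discrete catalogue values):

```python
import random

def de_step(pop, fitness, F=0.5, CR=0.9, rng=random):
    # One generation of classic DE/rand/1/bin (minimization):
    # for each target vector, build a mutant a + F*(b - c) from three
    # distinct others, cross it binomially with the target, and keep
    # the trial only if it is no worse than the target.
    n = len(pop[0])
    new_pop = []
    for i, target in enumerate(pop):
        a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
        mutant = [a[k] + F * (b[k] - c[k]) for k in range(n)]
        jrand = rng.randrange(n)  # guarantee at least one mutant component
        trial = [mutant[k] if (rng.random() < CR or k == jrand) else target[k]
                 for k in range(n)]
        new_pop.append(trial if fitness(trial) <= fitness(target) else target)
    return new_pop

sphere = lambda x: sum(v * v for v in x)  # simple test objective
pop = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(12)]
for _ in range(50):
    pop = de_step(pop, sphere)
print(min(sphere(p) for p in pop))  # best cost shrinks toward 0
```

The greedy selection step means the best cost in the population can never increase, which is part of what makes DE so simple to reason about.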
Optimizing Engineering Designs Using a Combined Genetic Search
 Proceedings of the Sixth International Conference on Genetic Algorithms
, 1995
Cited by 10 (0 self)
Abstract:
In the optimization of engineering designs, traditional search and optimization methods face at least two difficulties: (i) since each is specialized in solving a particular type of problem, one method does not work well on different types of problems; (ii) most of them are designed to work on continuous search spaces. Since different optimal engineering design problems give rise to objective and constraint functions of varying degrees of nonlinearity, and since most engineering design problems involve mixed variables (zero-one, discrete, and continuous), designers often face difficulty in using the traditional methods. In this paper, a combined genetic search technique (GeneAS) is suggested to solve mixed-integer programming problems often encountered in engineering design activities. GeneAS uses a combination of binary-coded and real-coded GAs to handle different types of variables. In handling discrete variables, GeneAS restricts its search only to the permissible values of the variabl...
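One simple way to realize the "search only over permissible values" idea described in both GeneAS abstracts is to mutate a discrete gene only onto a catalogue of allowed values, so the search never generates an impermissible design. The sketch below is our own illustration with hypothetical catalogue values; GeneAS itself achieves this by combining binary- and real-coded GAs:

```python
import random

# Hypothetical mixed-variable chromosome: one discrete variable restricted
# to a catalogue of permissible values, plus one continuous variable.
PERMISSIBLE_DIAMETERS = [10.0, 12.5, 16.0, 20.0, 25.0]  # e.g. stock sizes

def random_individual(rng=random):
    return {
        'diameter': rng.choice(PERMISSIBLE_DIAMETERS),  # discrete gene
        'length': rng.uniform(0.5, 2.0),                # continuous gene
    }

def mutate(ind, rate=0.5, rng=random):
    # The discrete gene mutates only onto catalogue values, so the
    # search effort is never wasted on impermissible designs; the
    # continuous gene gets a small clamped Gaussian perturbation.
    child = dict(ind)
    if rng.random() < rate:
        child['diameter'] = rng.choice(PERMISSIBLE_DIAMETERS)
    if rng.random() < rate:
        child['length'] = min(2.0, max(0.5, child['length'] + rng.gauss(0, 0.1)))
    return child

ind = random_individual()
child = mutate(ind)
print(child['diameter'] in PERMISSIBLE_DIAMETERS)  # always True by construction
```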