Results 1–9 of 9
Implicit Niching in a Learning Classifier System: Nature's Way
 EVOLUTIONARY COMPUTATION
, 1994
Abstract

Cited by 56 (9 self)
We approach the difficult task of analyzing the complex behavior of even the simplest learning classifier system (LCS) by isolating one crucial subfunction in the LCS learning algorithm: covering through niching. The LCS must maintain a population of diverse rules that together solve a problem (e.g., classify examples). To maintain a diverse population while applying the GA's selection operator, the LCS must incorporate some kind of niching mechanism. The natural way to accomplish niching in an LCS is to force competing rules to share resources (i.e., rewards). This implicit LCS fitness sharing is similar to the explicit fitness sharing used in many niched GAs. Indeed, the LCS implicit sharing algorithm can be mapped onto explicit fitness sharing with a one-to-one correspondence between algorithm components. This mapping is important because several studies of explicit fitness sharing, and of niching in GAs generally, have produced key insights and analytical tools for understanding th...
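As a rough illustration of the explicit fitness sharing that the abstract maps the LCS mechanism onto, the sketch below divides each individual's raw fitness by its niche count, using a triangular sharing function of radius `sigma`. The one-dimensional encoding, function names, and parameter values are invented for illustration and are not from the paper.

```python
def niche_count(pop, x, sigma):
    """Sum of triangular sharing contributions from all individuals within sigma."""
    return sum(max(0.0, 1.0 - abs(x - y) / sigma) for y in pop)

def shared_fitness(pop, raw, sigma):
    """Explicit fitness sharing: divide each raw fitness by its niche count."""
    return [raw(x) / niche_count(pop, x, sigma) for x in pop]

# Two equal peaks at 0.25 and 0.75; three individuals crowd the left peak.
f = lambda x: max(0.0, 1.0 - 4.0 * min(abs(x - 0.25), abs(x - 0.75)))
pop = [0.24, 0.26, 0.25, 0.75]
shared = shared_fitness(pop, f, sigma=0.1)
# The lone individual at 0.75 keeps its full raw fitness, while the crowded
# individuals share theirs; this is what preserves both niches under selection.
```

Without the division by the niche count, selection would eventually concentrate the whole population on one peak.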
Constrained GA optimization
 In Proc. of 5th Int'l Conf. on Genetic Algorithms
, 1993
Abstract

Cited by 50 (7 self)
We present a general method of handling constraints in genetic optimization, based on the Behavioural Memory paradigm. Instead of requiring the problem-dependent design of either repair operators (projection onto the feasible region) or a penalty function (a weighted sum of the constraint violations and the objective function), we sample the feasible region by evolving from an initially random population, successively applying a series of different fitness functions which embody constraint satisfaction. The final step is the optimization of the objective function restricted to the feasible region. The success of the whole process is highly dependent on the genetic diversity maintained during the first steps, ensuring a uniform sampling of the feasible region. This method succeeded on some truss structure optimization problems where the other genetic techniques for handling the constraints failed to give good results. Moreover, in some domains, such as automatic generation of software test dat...
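The staged schedule described above can be caricatured as follows. The 2-D problem, the constraint and objective functions, and the tiny elitist evolution loop are all invented for illustration; the sketch also omits the details of the paper's actual stage-switching criterion.

```python
import random

def evolve(pop, fitness, steps, rng, sigma=0.1):
    """Tiny elitist (mu + mu) loop used for each stage of the schedule."""
    for _ in range(steps):
        children = [[v + rng.gauss(0, sigma) for v in x] for x in pop]
        pop = sorted(pop + children, key=fitness, reverse=True)[:len(pop)]
    return pop

rng = random.Random(0)
pop = [[rng.uniform(-2, 2), rng.uniform(-2, 2)] for _ in range(20)]

# Invented 2-D problem: two constraints, then a quadratic objective.
g1 = lambda x: -max(0.0, x[0] + x[1] - 1.0)              # want x0 + x1 <= 1
g2 = lambda x: -max(0.0, -x[0])                          # want x0 >= 0
obj = lambda x: -(x[0] - 0.3) ** 2 - (x[1] - 0.3) ** 2   # maximize near (0.3, 0.3)

pop = evolve(pop, g1, 30, rng)    # stage 1: drive constraint-1 violation to zero
pop = evolve(pop, g2, 30, rng)    # stage 2: same for constraint 2
pop = evolve(pop, obj, 30, rng)   # final stage: objective on the sampled region
best = max(pop, key=obj)
```

Each stage reuses the population evolved by the previous one, which is the "memory" in Behavioural Memory: the final objective-only stage starts from points that already satisfy the constraints.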
Finite Markov Chain Analysis of Genetic Algorithms with Niching
 Proceedings of the Fifth International Conference on Genetic Algorithms
, 1993
Abstract

Cited by 30 (7 self)
Finite, discrete-time Markov chain models of genetic algorithms have been used successfully in the past to understand the complex dynamics of a simple GA. Markov chains can exactly model the GA by accounting for all of the stochasticity introduced by various GA operators, such as initialization, selection, crossover, and mutation. Although such models quickly become unwieldy with increasing population size or genome length, they provide initial insights that guide our development of approximate, scalable models. In this study, we use Markov chains to analyze the stochastic effects of the "niching operator" of a niched GA. Specifically, we model the effect of fitness sharing on a single-locus genome. Without niching, our model is an absorbing Markov chain. With niching, we are dealing with a "quasi-ergodic" Markov chain. Rather than calculating expected times to absorption, we are interested in steady-state probabilities for positive recurrent states. Established techniques for analyzin...
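The steady-state computation the abstract refers to can be illustrated on a toy ergodic chain. The 3-state transition matrix below is invented and merely stands in for the (much larger) population-state chain of a niched GA; the point is only how a stationary distribution is extracted from a row-stochastic matrix.

```python
import numpy as np

# Toy 3-state ergodic chain standing in for the niched-GA population chain;
# these transition probabilities are invented for illustration only.
P = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])   # row = current state, column = next state

def stationary(P, iters=5000):
    """Power-iterate a row-stochastic matrix to its steady-state distribution."""
    pi = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(iters):
        pi = pi @ P
    return pi

pi = stationary(P)
# For this symmetric chain the steady state is the uniform distribution.
```

For an absorbing chain (the no-niching case), one would instead compute expected times to absorption from the chain's fundamental matrix rather than a stationary distribution.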
Genetic Algorithm in Search and Optimization: The Technique and Applications
 Proc. of Int. Workshop on Soft Computing and Intelligent Systems
, 1997
Abstract

Cited by 5 (0 self)
A genetic algorithm (GA) is a search and optimization method developed by mimicking the evolutionary principles and chromosomal processing of natural genetics. A GA begins its search with a random set of solutions, usually coded as binary string structures. Every solution is assigned a fitness which is directly related to the objective function of the search and optimization problem. Thereafter, the population of solutions is modified into a new population by applying three operators similar to natural genetic operators: reproduction, crossover, and mutation. A GA works iteratively by successively applying these three operators in each generation until a termination criterion is satisfied. Over the past decade, GAs have been successfully applied to a wide variety of problems because of their simplicity, global perspective, and inherent parallel processing. In this paper, we outline the working principle of a GA by describing these three operators and by outlining an intuitive sketch ...
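The three-operator cycle described above can be sketched as a minimal generational GA on binary strings. Binary tournament selection stands in here for the reproduction operator, and all names and parameter values are illustrative rather than taken from the paper.

```python
import random

def run_ga(fitness, n_bits=20, pop_size=30, generations=100,
           p_cross=0.9, p_mut=0.01, seed=0):
    """Minimal generational GA: tournament selection (reproduction),
    one-point crossover, and bit-flip mutation, repeated until the
    generation budget (the termination criterion here) runs out."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = select()[:], select()[:]
            if rng.random() < p_cross:           # one-point crossover
                cut = rng.randrange(1, n_bits)
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            # bit-flip mutation on each child
            nxt += [[bit ^ (rng.random() < p_mut) for bit in child]
                    for child in (p1, p2)]
        pop = nxt[:pop_size]
    return max(pop, key=fitness)

# OneMax: the fitness of a string is simply its number of ones.
best = run_ga(sum)
```

On OneMax the loop quickly drives the population toward the all-ones string, which makes it a convenient smoke test for the operator cycle.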
On the Use of Genetic Algorithm with Elitism in Robust and Nonparametric Multivariate Analysis
Abstract

Cited by 2 (0 self)
In this paper, we provide a general formulation of the problems that arise in the computation of many robust and nonparametric estimates in terms of a combinatorial optimization problem. There is virtually no hope of solving such optimization problems exactly for high-dimensional data, and people usually resort to various approximate algorithms, many of which are based on heuristic search strategies. However, such algorithms are not guaranteed to converge to the global optimum as the number of iterations increases, and they can always get trapped in a local optimum. Here we propose a genetic algorithm with elitism as a probabilistic search method for solving this general problem. We establish convergence of our algorithm to the globally optimal solution and demonstrate its performance using some numerical examples.
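The key property elitism contributes to convergence arguments of this kind, namely that the best-so-far fitness can never decrease, is visible in a toy loop. All names and settings below are illustrative; this is not the paper's algorithm.

```python
import random

def elitist_step(pop, fitness, mutate, rng):
    """One generation with elitism: the current best survives unchanged,
    so the best-so-far fitness can never decrease."""
    elite = max(pop, key=fitness)
    children = [mutate(rng.choice(pop), rng) for _ in range(len(pop) - 1)]
    return [elite] + children

def flip_one_bit(x, rng):
    y = x[:]
    y[rng.randrange(len(y))] ^= 1
    return y

rng = random.Random(1)
pop = [[rng.randint(0, 1) for _ in range(8)] for _ in range(10)]
best_trace = []
for _ in range(50):
    pop = elitist_step(pop, sum, flip_one_bit, rng)
    best_trace.append(max(sum(x) for x in pop))
# best_trace is monotonically non-decreasing thanks to elitism.
```

Combined with a mutation operator that can reach any state with positive probability, this monotonicity is the standard ingredient in proofs that elitist GAs converge to a global optimum.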
The Effect of Multiple Optima on the Simple GA Runtime Complexity
Abstract

Cited by 1 (0 self)
Genetic algorithms are stochastic search algorithms that have been applied to optimization problems. In this paper we analyze the runtime complexity of a genetic algorithm when we are interested in one of a set of distinguished solutions, such as when multiple optima exist. We define the worst-case scenario and derive a probabilistic worst-case bound on the number of iterations required to find one of these multiple solutions of interest.
Quotient Evolutionary Space: Abstraction of the Evolutionary Process w.r.t. Macroscopic Properties
 Ambedkar Dukkipati, M. Narasimha Murty, and Shalabh Bhatnagar, Department of Computer Science and Automation
Abstract
Most theories of evolutionary algorithms stress particular aspects of evolutionary computation by means of specific mechanisms. Evolutionary algorithms can be modeled as Markov chains in a very natural way; for detailed analyses of evolutionary algorithms using Markov chains one can refer to [1, 2, 3]. This approach can be applied to evolutionary algorithms with finite populations based on any kind of selection mechanism and evolution operators. The most difficult issue, however, is that it is impossible, or at least impractical, to formulate the details of the related transition probability matrix, and therefore analysis of the properties of the matrix is difficult [4].
A Comparative Study of Adaptive Crossover Operators for Genetic Algorithms to Resolve the Traveling Salesman Problem
Abstract
A genetic algorithm includes several parameters that must be adjusted so that the algorithm can provide good results. Crossover operators play a very important role in constructing competitive genetic algorithms (GAs). In this paper, the basic conceptual features and specific characteristics of various crossover operators in the context of the Traveling Salesman Problem (TSP) are discussed. The results of an experimental comparison of more than six different crossover operators for the TSP are presented. The results show that the OX operator achieves better solutions than the other operators tested.
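For reference, a simplified variant of the order crossover (OX) operator mentioned above can be written as follows. This version fills the remaining positions from the start of the tour rather than after the copied slice, a common simplification; the function names and example tours are illustrative.

```python
import random

def order_crossover(p1, p2, rng):
    """Simplified OX: copy a random slice from parent 1, then fill the
    remaining positions with parent 2's cities in their relative order."""
    n = len(p1)
    a, b = sorted(rng.sample(range(n), 2))
    child = [None] * n
    child[a:b + 1] = p1[a:b + 1]              # inherit a slice from parent 1
    fill = [city for city in p2 if city not in child]
    for i in range(n):
        if child[i] is None:                   # fill gaps in parent 2's order
            child[i] = fill.pop(0)
    return child

rng = random.Random(0)
p1 = [0, 1, 2, 3, 4, 5, 6, 7]
p2 = [7, 6, 5, 4, 3, 2, 1, 0]
child = order_crossover(p1, p2, rng)
# The child is always a valid tour: a permutation of the parents' cities.
```

Unlike one-point crossover on bit strings, OX is permutation-preserving, which is why operators of this family are the ones compared for the TSP.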