Results 1–10 of 50
An Overview of Evolutionary Computation
1993
Abstract
Cited by 106 (5 self)
Evolutionary computation uses computational models of evolutionary processes as key elements in the design and implementation of computer-based problem-solving systems. In this paper we provide an overview of evolutionary computation and describe several evolutionary algorithms that are currently of interest. Important similarities and differences are noted, which lead to a discussion of important issues that need to be resolved, and of items for future research.
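The evolutionary algorithms surveyed in papers like this one share the same basic loop: maintain a population, select the fitter members, vary them, and repeat. A minimal sketch on a toy one-max objective (all names and parameter values below are illustrative, not taken from the paper):

```python
import random

random.seed(0)

def evolve(fitness, length=20, pop_size=30, generations=60, p_mut=0.05):
    """Minimal evolutionary loop: tournament selection plus bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: each offspring's parent is the fitter of two random picks.
        parents = [max(random.sample(pop, 2), key=fitness) for _ in range(pop_size)]
        # Variation: flip each bit independently with probability p_mut.
        pop = [[1 - g if random.random() < p_mut else g for g in ind]
               for ind in parents]
    return max(pop, key=fitness)

# Toy objective ("one-max"): maximize the number of 1-bits.
best = evolve(fitness=sum)
```

The particular selection and variation operators are what distinguish the algorithm families the paper compares.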
Crossover or Mutation?
Foundations of Genetic Algorithms 2, 1992
Abstract
Cited by 70 (3 self)
Genetic algorithms rely on two genetic operators: crossover and mutation. Although there exists a large body of conventional wisdom concerning the roles of crossover and mutation, these roles have not been captured in a theoretical fashion. For example, it has never been theoretically shown that mutation is in some sense "less powerful" than crossover, or vice versa. This paper provides some answers to these questions by theoretically demonstrating that there are some important characteristics of each operator that are not captured by the other.
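The two operators under discussion can be stated concretely for bitstring genomes. A minimal sketch (parameter values are illustrative); note the informal distinction that crossover only recombines material already present in the parents, while mutation can introduce alleles neither parent carries:

```python
import random

random.seed(1)

def one_point_crossover(a, b):
    """Swap the tails of two parent bitstrings at a random cut point."""
    cut = random.randint(1, len(a) - 1)
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

def mutate(ind, p=0.1):
    """Flip each bit independently with probability p."""
    return [1 - g if random.random() < p else g for g in ind]

p1, p2 = [0] * 8, [1] * 8
c1, c2 = one_point_crossover(p1, p2)
# Crossover conserves the parents' alleles; mutation can create new ones.
```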
Evolutionary Module Acquisition
Proceedings of the Second Annual Conference on Evolutionary Programming, 1993
Abstract
Cited by 53 (8 self)
Evolutionary programming and genetic algorithms share many features, not the least of which is a reliance on an analogy to natural selection over a population as a means of implementing search. With their commonalities come shared problems whose solutions can be investigated at a higher level and applied to both. One such problem is the manipulation of solution parameters whose values encode a desirable subsolution. In this paper, we define a superset of evolutionary programming and genetic algorithms, called evolutionary algorithms, and demonstrate a method of automatic modularization that protects promising partial solutions and speeds acquisition time.

1. Introduction
Evolutionary programming (EP) (Fogel 1992; Fogel et al. 1966) and genetic algorithms (GAs) (Holland 1966; Goldberg 1989) have borrowed little from each other. But there are many levels at which EP and GAs are similar. For instance, both employ an analogy to natural selection over a population to search through a sp...
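One simple way to picture the modularization idea, protecting a promising partial solution from further variation, is to mark a set of gene positions as frozen so the variation operator skips them. This is an illustrative sketch of the protection effect only, not the paper's actual mechanism:

```python
import random

random.seed(2)

def mutate_with_modules(ind, frozen, p=0.5):
    """Bit-flip mutation that never touches positions in an acquired module.

    `frozen` stands in for a module: a set of gene positions whose values
    encode a promising partial solution. Shielding them from variation is
    the effect the abstract describes; the mechanism here is illustrative.
    """
    return [g if i in frozen else (1 - g if random.random() < p else g)
            for i, g in enumerate(ind)]

ind = [1, 1, 1, 0, 0, 0]
child = mutate_with_modules(ind, frozen={0, 1, 2})
# The first three genes survive unchanged no matter how aggressive p is.
```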
Generality and Difficulty in Genetic Programming: Evolving a Sort
1993
Abstract
Cited by 39 (1 self)
Genetic Programming is applied to the task of evolving general iterative sorting algorithms. A connection between size and generality was discovered. Adding inverse size to the fitness measure along with correctness not only decreases the size of the resulting evolved algorithms, but also dramatically increases their generality and thus the effectiveness of the evolution process. In addition, a variety of differing problem formulations are investigated and the relative probability of success for each is reported. An example of an evolved sort from each problem formulation is presented, and an initial attempt is made to understand the variations in difficulty resulting from these differing problem formulations.

1 Introduction
In order to further the application of Genetic Programming to the evolution of complex algorithms, the work reported here explores the impact of differing problem formulations and fitness measures on the likelihood of evolving a general sorting algorithm on a given G...
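The fitness measure described, correctness plus an inverse-size (parsimony) term, can be sketched as follows; the 0.5 weighting and the size cap are assumptions for illustration, not the paper's values:

```python
def sort_fitness(program, size, cases, max_size=100):
    """Score a candidate sorter: correctness dominates, smaller programs win ties.

    `program` is any callable claiming to sort a list; `size` is its node
    count. The weight and cap are illustrative assumptions.
    """
    correct = sum(program(list(c)) == sorted(c) for c in cases)
    parsimony = (max_size - min(size, max_size)) / max_size  # inverse size in [0, 1]
    return correct + 0.5 * parsimony

cases = [[3, 1, 2], [5, 4], [], [2, 2, 1]]
big_score = sort_fitness(sorted, size=80, cases=cases)
small_score = sort_fitness(sorted, size=10, cases=cases)
```

Because the parsimony term is bounded below one, a smaller program can never outrank a more correct one; it only breaks ties, which is the pressure toward generality the abstract reports.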
Genetic Programming in C++: Implementation Issues
1994
Abstract
Cited by 33 (0 self)
Introduction
In this chapter we explore the lower-level implementation issues surrounding what we call the Genome Interpreter. Example code is provided from 5 test programs which were used to evaluate performance. Section 13.8 summarizes the results of these tests and discusses the tradeoffs involved with the various implementations. For the upcoming discussion, what we call an interpreter specifies the following lower-level aspects of the design:
- the raw node representation
- how a tree of nodes is represented
- the method for evaluating an individual node
- the method for evaluating the tree as a whole
- the methods for (or methods to assist) those genetic operators which are dependent on the node or tree representation.
A key point is that the interpreter specifies the node implementation, which is the particular part of the platform coding in which the overhead will be magnified. Therefore, the interpreter is the most crucial component in the overall design with respect t...
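The design aspects listed above can be made concrete with a toy interpreter: each node carries an opcode, a table maps opcodes to an arity and an evaluation function, and trees are evaluated recursively. The representation below (nested tuples, an `OPS` table) is a hypothetical sketch, not the chapter's C++ design:

```python
import operator

# Hypothetical opcode table: opcode -> (arity, evaluation function).
OPS = {
    "+": (2, operator.add),
    "*": (2, operator.mul),
    "x": (0, None),  # terminal: reads the input variable
}

def evaluate(tree, x):
    """Recursively evaluate a genome stored as nested tuples of opcodes."""
    op = tree[0]
    arity, fn = OPS[op]
    if arity == 0:
        return x
    args = [evaluate(child, x) for child in tree[1:arity + 1]]
    return fn(*args)

genome = ("*", ("+", ("x",), ("x",)), ("x",))  # encodes (x + x) * x
```

In a low-level implementation, the cost of this per-node dispatch is exactly the overhead the chapter says gets magnified, which is why the node representation dominates the design.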
Emergent Linguistic Rules from Inducing Decision Trees: Disambiguating Discourse Clue Words
Proc. Annual Meeting of the American Association for Artificial Intelligence, 1994
Abstract
Cited by 25 (1 self)
We apply decision tree induction to the problem of discourse clue word sense disambiguation. The automatic partitioning of the training set which is intrinsic to decision tree induction gives rise to linguistically viable rules.
Partial abductive inference in Bayesian belief networks using a genetic algorithm
Pattern Recognit. Lett., 1999
Abstract
Cited by 24 (2 self)
Abductive inference in Bayesian belief networks (BBNs) is the process of generating the most probable configurations given observed evidence. When we are interested only in a subset of the network's variables, this problem is called partial abductive inference. Both problems are NP-hard, and so exact computation is not always possible. In this paper, a genetic algorithm is used to perform partial abductive inference in BBNs. The main contribution is the introduction of new genetic operators designed specifically for this problem. By using these genetic operators, we try to take advantage of the calculations previously carried out when a new individual is evaluated. The algorithm is tested using a widely used Bayesian network and a randomly generated one, and then compared with a previous genetic algorithm based on classical genetic operators. From the experimental results, we conclude that the new genetic operators preserve the accuracy of the previous algorithm and also reduce the number of operations performed during the evaluation of individuals. The performance of the genetic algorithm is thus improved.

Index Terms—Abductive inference, Bayesian belief networks, evolutionary computation, genetic operators, most probable explanation, probabilistic reasoning.
Optimization by Genetic Annealing
Proc. of Second Australian Conf. on Neural Networks, 1991
Abstract
Cited by 15 (9 self)
Simulated Annealing (SA) is a general stochastic search algorithm. It is usually employed as an optimization method to find a near-optimal solution for hard combinatorial optimization problems, but it is very difficult to assess the accuracy of the solution found. In order to find a better solution, an often-used strategy is to run the algorithm many times and select the best solution as the final one. This paper gives an algorithm called Genetic Annealing (GA), which connects the successive runs of SA and gradually improves the solution. It introduces the concept of evolution into the annealing process. The basic idea is to use the genetic operations adopted in genetic algorithms to inherit the possible benefits of the solutions found in former runs. Experiments have shown that GA is better than classical SA. The parallelization of GA is also discussed in the paper.

Keywords: Simulated Annealing, Genetic Algorithms, Combinatorial Optimization.

1 Introduction
SA is a general stochastic search metho...
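The chaining idea, seeding each new SA run from a recombination of the best solutions found by earlier runs, can be sketched on a toy bitstring problem as follows; the cooling schedule, one-point recombination, and all parameter values are illustrative assumptions, not necessarily the paper's choices:

```python
import math
import random

random.seed(3)

def energy(s):
    """Toy energy to minimize: number of 0-bits (optimum is all ones)."""
    return s.count(0)

def flip(s):
    """Neighbor move: flip one randomly chosen bit."""
    i = random.randrange(len(s))
    return s[:i] + [1 - s[i]] + s[i + 1:]

def anneal(start, t0=2.0, cooling=0.95, steps=200):
    """One run of simulated annealing from a given starting state."""
    state, t = start, t0
    for _ in range(steps):
        cand = flip(state)
        d = energy(cand) - energy(state)
        if d < 0 or random.random() < math.exp(-d / t):
            state = cand
        t *= cooling
    return state

def crossover(a, b):
    """One-point recombination of two bitstrings."""
    cut = random.randint(1, len(a) - 1)
    return a[:cut] + b[cut:]

def genetic_annealing(length=12, runs=6):
    """Chain SA runs, seeding later runs by recombining the best earlier results."""
    elite = []
    for _ in range(runs):
        if len(elite) >= 2:
            start = crossover(elite[0], elite[1])  # inherit from former runs
        else:
            start = [random.randint(0, 1) for _ in range(length)]
        elite.append(anneal(start))
        elite.sort(key=energy)  # keep the best-so-far at the front
    return elite[0]

best = genetic_annealing()
```

Compared with independent restarts, later runs start from material assembled out of earlier successes, which is the "inheritance" the abstract describes.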
Clonal selection algorithms: A comparative case study using effective mutation potentials
4th International Conference on Artificial Immune Systems (ICARIS), LNCS 4163, 2005
Abstract
Cited by 15 (7 self)
This paper presents a comparative study of two important Clonal Selection Algorithms (CSAs): CLONALG and opt-IA. To deeply understand the performance of both algorithms, we deal with four different classes of problems: toy problems (one-counting and trap functions), pattern recognition, numerical optimization problems, and an NP-complete problem (the 2D HP model for the protein structure prediction problem). Two possible versions of CLONALG have been implemented and tested. The experimental results show a globally better performance of opt-IA with respect to CLONALG. Considering the results obtained, we can claim that CSAs represent a new class of Evolutionary Algorithms for effectively performing searching, learning and optimization tasks.
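A clonal selection generation in the CLONALG style, clone each antibody, hypermutate the clones at a rate that decreases with fitness, and keep the best of each lineage, can be sketched as follows; the mutation-rate formula, clone count, and the bitstring encoding are illustrative, not either paper's exact scheme:

```python
import random

random.seed(4)

def clonal_step(pop, fitness, n_clones=5):
    """One CLONALG-style generation on bitstrings (parameters illustrative).

    Each antibody is cloned; clones are hypermutated at a rate that falls
    as fitness rises; the best of each parent-plus-clones lineage survives.
    """
    fmax = max(fitness(ab) for ab in pop) or 1
    survivors = []
    for ab in pop:
        rate = 0.5 * (1 - fitness(ab) / fmax) + 0.02  # fitter cells mutate less
        clones = [[1 - g if random.random() < rate else g for g in ab]
                  for _ in range(n_clones)]
        survivors.append(max(clones + [ab], key=fitness))
    return survivors

# Toy task: match the all-ones pattern (fitness = number of 1-bits).
pop = [[random.randint(0, 1) for _ in range(16)] for _ in range(10)]
for _ in range(30):
    pop = clonal_step(pop, fitness=sum)
best = max(pop, key=sum)
```

The inverse coupling between fitness and mutation rate is the "effective mutation potential" lever the title refers to: poor cells explore broadly while good cells fine-tune.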
Differential Evolution Using a Neighborhood-Based Mutation Operator
2009
Abstract
Cited by 15 (7 self)
Differential evolution (DE) is well known as a simple and efficient scheme for global optimization over continuous spaces. It has reportedly outperformed a few evolutionary algorithms (EAs) and other search heuristics, such as particle swarm optimization (PSO), when tested on both benchmark and real-world problems. DE, however, is not completely free from the problems of slow and/or premature convergence. This paper describes a family of improved variants of the DE/target-to-best/1/bin scheme, which utilizes the concept of the neighborhood of each population member. The idea of small neighborhoods, defined over the index-graph of parameter vectors, draws inspiration from the community of PSO algorithms. The proposed schemes balance the exploration and exploitation abilities of DE without imposing serious additional burdens in terms of function evaluations. They are shown to be statistically significantly better than, or at least comparable to, several existing DE variants as well as a few other significant evolutionary computing techniques over a test suite of 24 benchmark functions. The paper also investigates the application of the new DE variants to two real-life problems concerning parameter estimation for frequency-modulated sound waves and spread-spectrum radar polyphase code design.
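The neighborhood-based mutation can be sketched for the DE/target-to-best/1 case: the "best" vector in the mutation formula is taken from a small ring neighborhood on the index graph rather than from the whole population. The sketch below is illustrative; the neighborhood size, scale factor, and neighbor-selection details are assumptions, not the paper's exact specification:

```python
import random

random.seed(5)

def sphere(v):
    """Toy objective to minimize: sum of squares."""
    return sum(x * x for x in v)

def local_mutant(pop, i, k=2, F=0.8):
    """DE/target-to-best/1 mutation restricted to a ring neighborhood.

    The 'best' vector is the best of the k index-neighbors on each side of
    member i (a ring topology over the index graph, as in lbest PSO); the
    difference vector is also drawn from that neighborhood.
    """
    n = len(pop)
    hood = [(i + d) % n for d in range(-k, k + 1)]
    best = min(hood, key=lambda j: sphere(pop[j]))
    r1, r2 = random.sample([j for j in hood if j != i], 2)
    return [x + F * (pop[best][d] - x) + F * (pop[r1][d] - pop[r2][d])
            for d, x in enumerate(pop[i])]

pop = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(10)]
v = local_mutant(pop, 0)
```

Restricting attraction to a local best slows the population's collapse toward a single point, which is how the scheme trades exploitation for the extra exploration the abstract claims.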