Results 11–20 of 532
A Comparison of Linear Genetic Programming and Neural Networks in Medical Data Mining
IEEE Transactions on Evolutionary Computation, 2000
Cited by 97 (13 self)
Abstract:
We apply linear genetic programming to several diagnosis problems in medicine. An efficient algorithm is presented that eliminates intron code in linear genetic programs. This results in a significant speedup, which is especially interesting when operating with complex datasets such as those occurring in real-world applications like medicine. We compare our results to those obtained with neural networks and argue that genetic programming is able to show similar performance in classification and generalization, even within a relatively small number of generations.
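The backward dead-code sweep that makes this kind of intron elimination cheap can be sketched as follows; the instruction encoding, the register names, and the single-output-register assumption are illustrative, not the paper's actual algorithm:

```python
def remove_introns(program, output_reg="r0"):
    """Return only the effective instructions of a linear GP program.

    One backward pass: keep an instruction only if it writes a register
    that still influences the program output.
    """
    effective = {output_reg}                  # registers still influencing the output
    kept = []
    for dest, op, src1, src2 in reversed(program):
        if dest in effective:                 # this write reaches the output
            kept.append((dest, op, src1, src2))
            effective.discard(dest)           # dest is (re)defined here...
            effective.update((src1, src2))    # ...so its sources matter instead
    kept.reverse()
    return kept

prog = [
    ("r1", "add", "r2", "r3"),
    ("r4", "mul", "r1", "r1"),   # structural intron: r4 never reaches r0
    ("r0", "sub", "r1", "r2"),
]
print(remove_introns(prog))      # the r4 instruction is dropped
```

Because each instruction is visited once, the sweep is linear in program length, which is what makes it cheap enough to run before every fitness evaluation.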
Reevaluating Genetic Algorithm Performance under Coordinate Rotation of Benchmark Functions: A survey of some theoretical and practical aspects of genetic algorithms
BioSystems, 1995
Cited by 89 (18 self)
Abstract:
This work analyzes some concepts of genetic algorithms and explains why they may be applied with success to some problems in function optimization. In addition to other performance properties, it has been shown that genetic algorithms are able to overcome local minima in highly multimodal functions (e.g., Rastrigin, Schwefel). The performance of genetic algorithms is supported by an extensive theory, which is based on the assumption of additive gene effects. But the current work shows that the assumption of additive gene effects is far from weak, and that the dependence on specific parameter settings is much stronger than often believed. Furthermore, the assumptions regarding the fitness function are so restrictive that slight modifications of the standard test functions cause a failure of the optimization procedure even though the function's structure is preserved. The current experiments focus on a few widely-used scalable test functions. The results indicate that a standard g...
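The separability point can be illustrated directly: composing a standard test function with a rotation leaves its structure and global optimum intact while coupling the variables, which is exactly the modification that hurts algorithms biased toward additive gene effects. The rotation angle and the 2-D restriction below are arbitrary choices for illustration, not values from the paper:

```python
import math

def rastrigin(x):
    """Separable Rastrigin function; global minimum 0 at the origin."""
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def rotate(x, theta=math.pi / 6):
    """Rotate a 2-D point; composing rastrigin with this couples the variables."""
    c, s = math.cos(theta), math.sin(theta)
    return [c * x[0] - s * x[1], s * x[0] + c * x[1]]

# The global optimum at the origin is unchanged by rotation...
print(rastrigin(rotate([0.0, 0.0])))   # 0.0
# ...but an axis-aligned point maps to a point where both variables interact:
print(rastrigin([1.0, 0.0]), rastrigin(rotate([1.0, 0.0])))
```

A coordinate-wise search that optimizes one variable at a time handles the first form well and the rotated form much less so, despite the identical landscape shape.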
Adaptive and self-adaptive evolutionary computation
In Computational Intelligence: A Dynamic System Perspective, 1995
Cited by 86 (3 self)
Abstract:
This paper reviews the various studies that have introduced adaptive and self-adaptive parameters into evolutionary computations. A formal definition of an adaptive evolutionary computation is provided, together with an analysis of the types of adaptive and self-adaptive parameter update rules currently in use. Previous studies are reviewed and placed into a categorization that helps to illustrate their similarities and differences.
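One classic instance of the adaptive update rules such a survey categorizes is Rechenberg's 1/5 success rule, which adjusts a mutation step size from the observed success frequency; the adjustment factor 0.85 is a commonly quoted default, used here as an assumption rather than a value from this paper:

```python
def one_fifth_rule(sigma, success_rate, factor=0.85):
    """Shrink sigma when fewer than 1/5 of mutations improve fitness,
    grow it when more than 1/5 do, and leave it alone at exactly 1/5."""
    if success_rate < 0.2:
        return sigma * factor      # too few successes: search steps are too large
    if success_rate > 0.2:
        return sigma / factor      # many successes: steps can afford to grow
    return sigma

sigma = 1.0
print(one_fifth_rule(sigma, 0.1))   # fewer successes -> smaller steps (0.85)
print(one_fifth_rule(sigma, 0.4))   # more successes -> larger steps
```

In the survey's terms this is an *adaptive* rule (driven by observed feedback), as opposed to *self-adaptive* schemes where the parameter is encoded in the genome and evolved along with the solution.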
A Connectionist Central Pattern Generator for the Aquatic and Terrestrial Gaits of a Simulated Salamander
2001
Cited by 85 (24 self)
Abstract:
This article investigates the neural mechanisms underlying salamander locomotion and develops a biologically plausible connectionist model of a central pattern generator capable of producing the typical aquatic and terrestrial gaits of the salamander. It investigates, in particular, what type of neural circuitry can produce and modulate the two locomotor programs identified within the salamander's spinal cord, namely a traveling wave of neural activity for swimming and a standing wave for trotting. A two-dimensional biomechanical simulation of the salamander's body is developed, whose muscle contraction is determined by the locomotion controller, simulated as a leaky-integrator neural network. While the connectivity of the neural circuitry underlying locomotion in the salamander has not yet been decoded, this article presents the design of a neural circuit whose general organization corresponds to that hypothesized by neurobiologists. In particular, the locomotion c...
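A leaky-integrator neuron of the kind used for such controllers can be sketched with a simple Euler integration; the time constant, weights, and inputs below are illustrative values, not taken from the article:

```python
import math

def leaky_integrator_step(m, inputs, weights, tau=0.1, dt=0.01):
    """One Euler step of tau * dm/dt = -m + sum(w_i * x_i).

    Returns the updated membrane state m and the sigmoid firing rate."""
    drive = sum(w * x for w, x in zip(weights, inputs))
    m = m + dt * (-m + drive) / tau          # membrane leaks toward the drive
    output = 1.0 / (1.0 + math.exp(-m))      # saturating firing rate
    return m, output

m = 0.0
for _ in range(100):
    m, y = leaky_integrator_step(m, inputs=[1.0], weights=[2.0])
print(round(m, 3), round(y, 3))   # membrane state converges toward the drive (2.0)
```

Coupling many such neurons with mutually inhibitory and phase-shifted connections is what lets a network of this type sustain the traveling and standing waves described above.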
Evolutionary neurocontrollers for autonomous mobile robots
Neural Networks, 1998
Cited by 83 (10 self)
Abstract:
In this article we describe a methodology for evolving neurocontrollers of autonomous mobile robots without human intervention. The presentation, which spans from technological and methodological issues to several experimental results on the evolution of physical mobile robots, covers both previous and recent work in the attempt to provide a unified picture within which the reader can compare the effects of systematic variations in the experimental settings. After describing some key principles for building mobile robots and tools suitable for experiments in adaptive robotics, we give an overview of different approaches to evolutionary robotics and present our methodology. We start by reviewing two basic experiments showing that different environments can shape very different behaviors and neural mechanisms under very similar selection criteria. We then address the issue of incremental evolution in two different experiments from the perspective of changing environments and robot morphologies. Finally, we investigate the possibility of evolving plastic neurocontrollers and analyze an evolved neurocontroller that relies on fast and continuously changing synapses characterized by dynamic stability. We conclude by reviewing the implications of this methodology for engineering, biology, cognitive science, and artificial life, and point to future directions of research.
Drift analysis and average time complexity of evolutionary algorithms
Artificial Intelligence, 2001
Cited by 75 (27 self)
Abstract:
Computational time complexity is an important topic in the theory of evolutionary algorithms (EAs). This paper reports some new results on the average time complexity of EAs. Based on drift analysis, some useful drift conditions for deriving the time complexity of EAs are studied, including conditions under which an EA will take no more than polynomial time (in the problem size) to solve a problem and conditions under which an EA will take at least exponential time to solve a problem. The paper first presents the general results, and then uses several problems as examples to illustrate how these general results can be applied in analyzing the average time complexity of concrete EAs. While previous work considered only (1 + 1) EAs without any crossover, the EAs considered in this paper are fairly general, using a finite population, crossover, mutation, and selection.
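The flavor of such drift arguments can be seen on a toy case: an elitist (1+1) EA with bitwise mutation on OneMax always has a positive expected one-step gain toward the optimum, which is the kind of condition used to bound its expected running time by a polynomial. The problem size, mutation rate, and random seed below are illustrative choices:

```python
import random

def one_plus_one_ea(n=50, seed=0):
    """Run a (1+1) EA on OneMax (maximize the number of ones); return step count."""
    rng = random.Random(seed)
    x = [0] * n
    steps = 0
    while sum(x) < n:
        y = [b ^ (rng.random() < 1.0 / n) for b in x]   # flip each bit w.p. 1/n
        if sum(y) >= sum(x):                            # elitist selection: never lose fitness
            x = y
        steps += 1
    return steps

print(one_plus_one_ea())   # finishes in a number of steps on the order of n log n
```

Elitism is what makes the drift argument clean: fitness never decreases, so a positive drift toward the optimum translates directly into a bound on the expected hitting time.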
Toward a Theory of Evolution Strategies: Self-Adaptation
1995
Cited by 70 (21 self)
Abstract:
This paper analyzes the self-adaptation (SA) algorithm widely used to adapt strategy parameters of the evolution strategy (ES) in order to obtain maximal ES performance. The investigations concentrate on the adaptation of one general mutation strength σ (called σSA) in (1, λ) ESs. The hypersphere serves as the fitness model. Starting from an introduction to the basic concept of self-adaptation, a framework for the analysis of σSA is developed on two levels: a microscopic level concerning the description of the stochastic changes from one generation to the next, and a macroscopic level describing the evolutionary dynamics of σSA over time (generations). The σSA requires the fixing of a new strategy parameter, the so-called learning parameter. The influence of this parameter on ES performance is investigated, and rules for its tuning are presented and discussed. The results of the theoretical analysis are compared with ES experiments, and it will be shown that apply...
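A bare-bones version of the σ self-adaptation scheme analyzed here is a (1, λ) ES on the sphere in which each offspring mutates its own step size by a log-normal factor before mutating the object variables. The learning parameter τ = 1/√(2n) is a standard choice, assumed here rather than taken from this paper:

```python
import math
import random

def sphere(x):
    """Hypersphere fitness model: distance squared from the optimum at 0."""
    return sum(xi * xi for xi in x)

def one_comma_lambda_es(n=10, lam=10, generations=200, seed=1):
    rng = random.Random(seed)
    x, sigma = [1.0] * n, 1.0
    tau = 1.0 / math.sqrt(2 * n)        # learning parameter (standard setting)
    f = sphere(x)
    for _ in range(generations):
        offspring = []
        for _ in range(lam):
            s = sigma * math.exp(tau * rng.gauss(0, 1))   # log-normal step-size mutation
            y = [xi + s * rng.gauss(0, 1) for xi in x]    # object-variable mutation
            offspring.append((sphere(y), y, s))
        f, x, sigma = min(offspring)    # comma selection: best offspring replaces parent
    return f

print(one_comma_lambda_es())   # final fitness far below the starting value of n
```

Because σ rides along with the selected offspring, a well-tuned step size is inherited implicitly: offspring whose mutated σ fits the current distance to the optimum tend to win the comma selection.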
Adaptive Stochastic Approximation by the Simultaneous Perturbation Method
2000
Cited by 69 (4 self)
Abstract:
Stochastic approximation (SA) has long been applied to problems of minimizing loss functions or root finding with noisy input information. As with all stochastic search algorithms, there are adjustable algorithm coefficients that must be specified and that can have a profound effect on algorithm performance. It is known that choosing these coefficients according to an SA analog of the deterministic Newton-Raphson algorithm provides an optimal or near-optimal form of the algorithm. However, directly determining the required Hessian matrix (or Jacobian matrix for root finding) to achieve this algorithm form has often been difficult or impossible in practice. This paper presents a general adaptive SA algorithm that is based on a simple method for estimating the Hessian matrix while concurrently estimating the primary parameters of interest. The approach applies in both the gradient-free optimization (Kiefer-Wolfowitz) and root-finding/stochastic gradient-based (Robbins-Monro) settings, and is based on the "simultaneous perturbation (SP)" idea introduced previously. The algorithm requires only a small number of loss function or gradient measurements per iteration, independent of the problem dimension, to adaptively estimate the Hessian and the parameters of primary interest. Aside from introducing the adaptive SP approach, this paper presents practical implementation guidance, asymptotic theory, and a nontrivial numerical evaluation. Also included is a discussion and numerical analysis comparing the adaptive SP approach with the iterate-averaging approach to accelerated SA.
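The simultaneous perturbation estimate underlying this adaptive scheme can be shown in its basic first-order (non-adaptive) form: two loss evaluations per iteration estimate the entire gradient regardless of dimension. The gain sequences and the test function below are illustrative defaults, not the paper's settings:

```python
import random

def spsa_step(theta, loss, a, c, rng):
    """One SPSA iteration: a two-sided finite difference along one random
    Bernoulli +/-1 direction yields an estimate of every gradient component."""
    delta = [rng.choice((-1.0, 1.0)) for _ in theta]
    plus = [t + c * d for t, d in zip(theta, delta)]
    minus = [t - c * d for t, d in zip(theta, delta)]
    diff = (loss(plus) - loss(minus)) / (2 * c)
    return [t - a * diff / d for t, d in zip(theta, delta)]

def minimize(loss, theta, iters=1000, seed=0):
    rng = random.Random(seed)
    for k in range(1, iters + 1):
        # Decaying gains with the standard SPSA exponents 0.602 and 0.101.
        a, c = 0.1 / k ** 0.602, 0.1 / k ** 0.101
        theta = spsa_step(theta, loss, a, c, rng)
    return theta

quadratic = lambda x: sum((xi - 1.0) ** 2 for xi in x)
theta = minimize(quadratic, [0.0, 0.0, 0.0])
print([round(t, 2) for t in theta])   # each coordinate approaches 1.0
```

The per-iteration cost of two loss evaluations (independent of dimension) is the property the paper's adaptive variant extends, using additional perturbations to estimate the Hessian concurrently.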
Contemporary Evolution Strategies
1995
Cited by 65 (2 self)
Abstract:
After an outline of the history of evolutionary algorithms, a new (μ, κ, λ, ρ) variant of the evolution strategies is introduced formally. Though not comprising all degrees of freedom, it is richer in the number of features than the older (μ, λ) and (μ + λ) versions. Finally, all important theoretically proven facts about evolution strategies are briefly summarized, and some of the many open questions concerning evolutionary algorithms in general are pointed out.