## The Role of Development in Genetic Algorithms (1994)

Venue: Foundations of Genetic Algorithms 3

Citations: 23 (6 self)

### BibTeX

```bibtex
@INPROCEEDINGS{Hart94therole,
  author    = {William E. Hart and Thomas E. Kammeyer and Richard K. Belew},
  title     = {The Role of Development in Genetic Algorithms},
  booktitle = {Foundations of Genetic Algorithms 3},
  year      = {1994},
  pages     = {315--332},
  publisher = {Morgan Kaufmann}
}
```

### Abstract

The developmental mechanisms transforming genotypic to phenotypic forms are typically omitted in formulations of genetic algorithms (GAs), in which these two representational spaces are identical. We argue that a careful analysis of developmental mechanisms is useful for understanding the success of several standard GA techniques, and can clarify the relationships between more recently proposed enhancements. We provide a framework which distinguishes between two developmental mechanisms --- learning and maturation --- while also showing that they have several common effects on GA search. This framework is used to analyze how maturation and local search can change the dynamics of the GA. We observe that in some contexts maturation and local search can be incorporated into the fitness evaluation, but illustrate reasons for considering them separately. Further, we identify contexts in which maturation and local search can be distinguished from the fitness evaluation.

### Citations

338 | How learning can guide evolution
- Hinton, Nowlan
- 1987
Citation Context: ...cape by associating the fitness of a genotype with the fitness of the phenotype generated by a local search algorithm. This type of transformation tends to broaden the "shoulders" of the local minima [9, 6]. Hinton and Nowlan [9], Nolfi, Elman and Parisi [18], Keesing and Stork [12] have shown how this type of fitness transformation can improve the rate at which the GA generates good solutions. Although...
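The excerpt above describes non-Lamarckian (Baldwinian) evaluation: a genotype is scored by the fitness its phenotype reaches after local search, but the search result is not written back into the genome. A minimal sketch of that idea, using an illustrative hill climber and sphere objective (neither is from the paper):

```python
import random

def sphere(x):
    """Illustrative objective: minimize the sum of squares."""
    return sum(xi * xi for xi in x)

def hill_climb(x, f, steps=100, scale=0.1, rng=random):
    """Stochastic hill climbing: keep a random perturbation only if it improves f."""
    best, best_f = list(x), f(x)
    for _ in range(steps):
        cand = [xi + rng.gauss(0.0, scale) for xi in best]
        cand_f = f(cand)
        if cand_f < best_f:
            best, best_f = cand, cand_f
    return best, best_f

def baldwinian_fitness(genotype, f=sphere):
    """Non-Lamarckian evaluation: fitness is measured after local search,
    but the genotype itself is returned to the population unchanged."""
    _, improved_f = hill_climb(genotype, f)
    return improved_f

random.seed(0)
g = [random.uniform(-5.0, 5.0) for _ in range(3)]
raw = sphere(g)
developed = baldwinian_fitness(g)
assert developed <= raw  # development can only improve (or preserve) fitness
```

Because only improving steps are accepted, the developed fitness is never worse than the raw fitness, which is the "broadened shoulders" effect on the landscape the excerpt refers to.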

310 | Designing neural networks using genetic algorithms with graph generation system
- Kitano
- 1990
Citation Context: ...veral contexts. Learning has been studied under the guise of "hill climbing" [16, 17] and local search [8]. These authors use learning to refine solutions and speed up the GA. Gruau [4, 5] and Kitano [13] have used maturation in applications of GAs to neural networks. These authors used maturation to construct neural networks from a grammar representation. Similarly, Belew and Kammeyer [2] used matura...

195 | The parallel genetic algorithm as function optimizer
- Mühlenbein, Schomisch, et al.
- 1991
Citation Context: ...hin-lifetime change are notoriously confounded in biological systems. Maturation and learning have been used with GAs in several contexts. Learning has been studied under the guise of "hill climbing" [16, 17] and local search [8]. These authors use learning to refine solutions and speed up the GA. Gruau [4, 5] and Kitano [13] have used maturation in applications of GAs to neural networks. These authors us...

179 | Evolving networks: Using the genetic algorithm with connectionist learning
- Belew, McInerny, et al.
- 1991
Citation Context: ...marckian local search with approximations to δ⁻¹ (e.g., see Hart [7, pages 105-110]). GAs using Lamarckian local search are typically more efficient than GAs using non-Lamarckian local search [3, 8, 11] when δ is invertible. Consequently, GAs using Lamarckian local search based on an approximation to δ⁻¹ are likely to be of practical interest. Figure 2 shows pseudo-code for a simple GA that u...
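The excerpt refers to pseudo-code (the paper's Figure 2) for a GA with Lamarckian local search. A minimal sketch of the Lamarckian write-back step, assuming for illustration that genotype and phenotype are both real vectors, so the development map δ and its inverse are trivial; with a nontrivial map, `delta_inv` would only approximate δ⁻¹:

```python
import random

def sphere(x):
    return sum(xi * xi for xi in x)

def local_search(x, f, steps=50, scale=0.1):
    """Accept-if-better random perturbations around the current point."""
    best, best_f = list(x), f(x)
    for _ in range(steps):
        cand = [xi + random.gauss(0.0, scale) for xi in best]
        cand_f = f(cand)
        if cand_f < best_f:
            best, best_f = cand, cand_f
    return best, best_f

# Illustrative development map (delta) and its inverse; both are the
# identity here only because genotype and phenotype coincide.
def delta(g):
    return list(g)

def delta_inv(p):
    return list(p)

def lamarckian_evaluate(population, f=sphere):
    """Develop each genotype, improve the phenotype by local search, and
    write the improvement back into the genome (the Lamarckian step)."""
    result = []
    for g in population:
        p, fit = local_search(delta(g), f)
        result.append((delta_inv(p), fit))
    return result

random.seed(1)
pop = [[random.uniform(-5.0, 5.0) for _ in range(3)] for _ in range(4)]
for (g_new, fit), g_old in zip(lamarckian_evaluate(pop), pop):
    assert fit <= sphere(g_old)              # never worse than the raw genome
    assert abs(fit - sphere(g_new)) < 1e-12  # genome now encodes the improvement
```

The second assertion is what distinguishes this from the non-Lamarckian case: after evaluation, the genome itself carries the locally improved solution.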

143 | Learning and evolution in neural networks
- Elman, Parisi
- 1990
Citation Context: ...fitness of the phenotype generated by a local search algorithm. This type of transformation tends to broaden the "shoulders" of the local minima [9, 6]. Hinton and Nowlan [9], Nolfi, Elman and Parisi [18], Keesing and Stork [12] have shown how this type of fitness transformation can improve the rate at which the GA generates good solutions. Although non-Lamarckian local search really only offers one t...

115 | Minimization by random search techniques
- Solis, Wets
- 1981
Citation Context: ...henotypic space for both GAs. A GA with floating point representation was used to search these genotypic spaces [7]. Local search was performed in the coordinate space, using the method of Solis-Wets [16, 21]. The performance of the GAs was measured as the best solution found after 150,000 function evaluations. Results were averaged over 20 trials. Table 1 shows the performance for the GAs using different...
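The Solis-Wets method cited above is a randomized local search. A simplified sketch, with the bias-update and step-size-adaptation constants taken from common descriptions of the algorithm (illustrative, not from this paper):

```python
import random

def solis_wets(f, x, rho=1.0, max_iters=2000, seed=0):
    """Simplified Solis-Wets random search: Gaussian steps drawn around a
    drifting bias vector; the step size rho grows after runs of successes
    and shrinks after runs of failures. The constants (0.2, 0.4, 0.5 and
    the success/failure thresholds) are illustrative."""
    rng = random.Random(seed)
    n = len(x)
    bias = [0.0] * n
    best, best_f = list(x), f(x)
    successes = failures = 0
    for _ in range(max_iters):
        step = [rng.gauss(bias[i], rho) for i in range(n)]
        forward = [best[i] + step[i] for i in range(n)]
        backward = [best[i] - step[i] for i in range(n)]
        if f(forward) < best_f:                      # try the sampled step
            best, best_f = forward, f(forward)
            bias = [0.2 * bias[i] + 0.4 * step[i] for i in range(n)]
            successes, failures = successes + 1, 0
        elif f(backward) < best_f:                   # try the opposite step
            best, best_f = backward, f(backward)
            bias = [bias[i] - 0.4 * step[i] for i in range(n)]
            successes, failures = successes + 1, 0
        else:                                        # neither direction helped
            bias = [0.5 * bias[i] for i in range(n)]
            successes, failures = 0, failures + 1
        if successes > 5:
            rho *= 2.0
        elif failures > 3:
            rho *= 0.5
    return best, best_f

sphere = lambda v: sum(t * t for t in v)
best, best_f = solis_wets(sphere, [3.0, -4.0])
assert best_f <= sphere([3.0, -4.0])  # only improving moves are accepted
```

The bias vector gives the search momentum in recently successful directions, which is the method's main difference from plain accept-if-better random perturbation.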

110 | Evolution in Time and Space – The Parallel Genetic Algorithm
- Mühlenbein
- 1991
Citation Context: ...hin-lifetime change are notoriously confounded in biological systems. Maturation and learning have been used with GAs in several contexts. Learning has been studied under the guise of "hill climbing" [16, 17] and local search [8]. These authors use learning to refine solutions and speed up the GA. Gruau [4, 5] and Kitano [13] have used maturation in applications of GAs to neural networks. These authors us...

89 | Adaptive global optimization with local search
- Hart
- 1994
Citation Context: ...finitions of local search and mutation which use information about a GA's current or previous populations. For example, the definition of local search does not encompass the methods described in Hart [7] that use statistics from the population to selectively apply local search. Our notion of mutation does not include the dynamically adjusted mutation operator used in evolutionary strategies, in which...

81 | Adding Learning to the Cellular Development of Neural Networks: Evolution and the Baldwin Effect. Evolutionary Computation
- Gruau, Whitley
- 1993
Citation Context: ...cape by associating the fitness of a genotype with the fitness of the phenotype generated by a local search algorithm. This type of transformation tends to broaden the "shoulders" of the local minima [9, 6]. Hinton and Nowlan [9], Nolfi, Elman and Parisi [18], Keesing and Stork [12] have shown how this type of fitness transformation can improve the rate at which the GA generates good solutions. Although...

58 | Shall We Repair? Genetic Algorithms, Combinatorial Optimization, and Feasibility Constraints
- Orvosh, Davis
- 1993
Citation Context: ... [Table 1: Conformation experiments using a GA and varying local search frequency.] ... by a number of authors [15, 19]. For example, consider constrained optimization problems. In general, a constrained optimization problem solves for x* such that f(x*) = min_{x ∈ D} f(x), subject to c_i(x) ≤ 0, i = 1, …, m, and g_j(x) = 0, ...
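The excerpt introduces constrained optimization, where decoders or repair operators keep the GA inside the feasible region. A minimal decoder sketch for a hypothetical feasible set, the probability simplex {x : x_i ≥ 0, Σx_i = 1} (an assumption for illustration, not a problem from the paper):

```python
def decode_to_simplex(genotype):
    """Illustrative decoder: map an arbitrary real-valued genotype onto a
    hypothetical feasible set, the probability simplex
    {x : x_i >= 0, sum(x) = 1}, so every evaluated phenotype satisfies the
    constraints by construction (no repair or penalty needed)."""
    magnitudes = [abs(g) for g in genotype]
    total = sum(magnitudes)
    if total == 0.0:
        # Degenerate all-zero genome: decode to the simplex center.
        return [1.0 / len(genotype)] * len(genotype)
    return [m / total for m in magnitudes]

x = decode_to_simplex([2.0, -1.0, 1.0])
assert all(xi >= 0.0 for xi in x)
assert abs(sum(x) - 1.0) < 1e-12
```

A decoder like this is one of the two strategies the cited work contrasts; the alternative is to let infeasible individuals arise and repair them after the fact.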

51 | Genetic synthesis of boolean neural networks with a cell rewriting developmental process
- Gruau
- 1992
Citation Context: ...sed with GAs in several contexts. Learning has been studied under the guise of "hill climbing" [16, 17] and local search [8]. These authors use learning to refine solutions and speed up the GA. Gruau [4, 5] and Kitano [13] have used maturation in applications of GAs to neural networks. These authors used maturation to construct neural networks from a grammar representation. Similarly, Belew and Kammeyer...

51 | Genetic synthesis of modular neural networks
- Gruau
- 1993
Citation Context: ...sed with GAs in several contexts. Learning has been studied under the guise of "hill climbing" [16, 17] and local search [8]. These authors use learning to refine solutions and speed up the GA. Gruau [4, 5] and Kitano [13] have used maturation in applications of GAs to neural networks. These authors used maturation to construct neural networks from a grammar representation. Similarly, Belew and Kammeyer...

31 | Optimization with genetic algorithm hybrids that use local search
- Hart, Belew
- 1994
Citation Context: ...toriously confounded in biological systems. Maturation and learning have been used with GAs in several contexts. Learning has been studied under the guise of "hill climbing" [16, 17] and local search [8]. These authors use learning to refine solutions and speed up the GA. Gruau [4, 5] and Kitano [13] have used maturation in applications of GAs to neural networks. These authors used maturation to cons...

22 | Interposing an ontogenic model between genetic algorithms and neural networks
- Belew
- 1993
Citation Context: ...n, however, does not capture the structure of a network. For example, a sorting network can be built by building two networks of half as many inputs, renumbering their inputs so that one operates on A[1], …, A[N/2 + 1] and the other on A[N/2], …, A[N], and adding some CMPXs to the result. For example, we would have to deal with or eliminate operations like [5 : 5] or [7 : 2] or any operatio...

16 | Evolution and learning in neural networks: The number and distribution of learning trials affect the rate of evolution
- Keesing, Stork
- 1991
Citation Context: ...on of a minimum that is flattened by the fitness transformation. For example, a local search algorithm is more likely to find a solution near a minimum when run for many iterations. Keesing and Stork [12] apply local search at several different lengths to alleviate this problem when using long searches. 4.2 Comparing Maturation and Local Search Although maturation and non-Lamarckian local search perfo...
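Keesing and Stork's remedy, as summarized above, is to apply local search for several different lengths. An illustrative sketch of that evaluation scheme (the lengths, step size, and averaging are assumptions, not taken from the cited paper):

```python
import random

def sphere(x):
    return sum(xi * xi for xi in x)

def search(x, f, steps, rng):
    """Accept-if-better random local search for a fixed number of steps."""
    best, best_f = list(x), f(x)
    for _ in range(steps):
        cand = [xi + rng.gauss(0.0, 0.1) for xi in best]
        cand_f = f(cand)
        if cand_f < best_f:
            best, best_f = cand, cand_f
    return best_f

def varied_length_fitness(genotype, f=sphere, lengths=(5, 25, 100), seed=0):
    """Score a genotype by local searches of several lengths and average
    the results, so selection does not reward only those minima that need
    a long (and costly) search to reach."""
    rng = random.Random(seed)
    fits = [search(genotype, f, steps, rng) for steps in lengths]
    return sum(fits) / len(fits)

g = [1.0, 2.0]
assert varied_length_fitness(g) <= sphere(g)
```

Because each component search only accepts improvements, the averaged score is never worse than the undeveloped fitness, while short searches keep genomes that are good "out of the box" competitive with genomes that rely on long development.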

15 | Do intelligent configuration search techniques outperform random search for large molecules
- Judson, Colvin, et al.
- 1992
Citation Context: ...marckian local search with approximations to δ⁻¹ (e.g., see Hart [7, pages 105-110]). GAs using Lamarckian local search are typically more efficient than GAs using non-Lamarckian local search [3, 8, 11] when δ is invertible. Consequently, GAs using Lamarckian local search based on an approximation to δ⁻¹ are likely to be of practical interest. Figure 2 shows pseudo-code for a simple GA that u...

13 | Evolving aesthetic sorting networks using developmental grammars
- Belew, Kammeyer
- 1993
Citation Context: ...and Kitano [13] have used maturation in applications of GAs to neural networks. These authors used maturation to construct neural networks from a grammar representation. Similarly, Belew and Kammeyer [2] used maturation in an application of the GA to sorting networks. In GAs, we have complete knowledge of all of the algorithm's mechanisms, and distinctions between "learning" and "maturation" may be c...

8 | Teaching polymers to fold
- Judson
- 1992
Citation Context: ...fitness evaluation is expensive, because the network must be tested on 2^width strings after each modification. 3.3 Molecular Conformation A simple molecular conformation problem is taken from Judson [10]. Consider a molecule composed of a chain of 19 identical atoms which are connected by stiff springs. A simple equation for the potential energy of this molecule is E = 100 ∑_{i=1}^{n−1} (r_{i,i+1} − ...

2 | Fitting spline functions to noisy data using a genetic algorithm
- Manela, Thornhill, et al.
- 1993
Citation Context: ... → R and g_j : D → R. Let D̄ = {x | c_i(x) ≤ 0} ∩ {x | g_j(x) = 0}. Solutions in D̄ are known as feasible solutions, and solutions in D − D̄ are infeasible solutions. Manela, Thornhill and Campbell [14] describe a GA that performs constrained optimization using a representational mapping (decoder) that maps from G to D̄. Michalewicz and Janikow [15] observe that this type of GA can use a representat...