## Adaptive Global Optimization with Local Search (1994)

Citations: 88 (6 self)

### BibTeX

```bibtex
@TECHREPORT{Hart94adaptiveglobal,
  author      = {William Eugene Hart},
  title       = {Adaptive Global Optimization with Local Search},
  institution = {},
  year        = {1994}
}
```


### Citations

10982 | Computers and Intractability: A Guide to the Theory of NP-Completeness - Garey, Johnson - 1978

Citation Context: ...Consider F, the class of all deterministic pseudo-Boolean functions f such that f : B^l -> Z, where B = {0, 1}. We can formalize the problem that the GA attempts to solve as a combinatorial optimization prob... (Footnote: The reader is referred to Garey and Johnson [24] for an excellent discussion of the complexity differences between P and NP, and to Gill [25] for an exposition of probabilistic computation.)

7415 | Genetic Algorithms - Goldberg - 1989

Citation Context: ...evolutionary mechanisms and are often used to perform global optimization. The exemplars of evolutionary search algorithms are genetic algorithms, evolution strategies, and evolutionary programming [5, 22, 31]. The design and motivation for these algorithms are different, but they incorporate the same basic adaptive components [4, 41]. These methods use a collection of solutions (a population of individuals)...

3940 | Pattern Classification and Scene Analysis - Duda, Hart - 1973

Citation Context: ...used to minimize continuous functions using gradient information. Finally, stochastic approximation is used in pattern recognition methods to find the optimal weights for parametric models of data [19]. II.A.1 Random Local Search: Solis and Wets [84] propose several random local search methods for performing local search on smooth functions without derivative information. Their "Algorithm 1" uses no...

3849 | Introduction to Automata Theory, Languages, and Computation - Hopcroft, Ullman - 1979

Citation Context: ...rs the GA's computational complexity for a very broad class of functions. I assume that the reader is familiar with formal language theory and follow the notational conventions of Hopcroft and Ullman [43]. Recall that P refers to the class of formal languages that can be recognized by a deterministic Turing machine (TM) in polynomial time. Additionally, both NP and RP refer to the classes of formal la...

3589 | Optimization by simulated annealing - Kirkpatrick, Gelatt, et al. - 1983

Citation Context: ...a method of optimization inspired by an analogy between a physical annealing process for obtaining low-energy states and the process of solving for minimal solutions to discrete optimization problems [11, 51]. SA sequentially generates random deviates of the current solution that are accepted if a probabilistic test is passed. Suppose x' is the current solution and let x'' be the new deviate. If f(x''...
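The probabilistic acceptance test that the excerpt truncates is, in standard simulated annealing, the Metropolis criterion. A minimal sketch follows; the objective f(x) = x^2, step size, and cooling schedule are all illustrative choices, not taken from the dissertation:

```python
import math
import random

def metropolis_accept(f_old, f_new, temperature):
    """Standard Metropolis test: always accept an improving deviate and
    accept a worse one with probability exp(-(f_new - f_old) / T)."""
    if f_new <= f_old:
        return True
    return random.random() < math.exp(-(f_new - f_old) / temperature)

# Minimize f(x) = x^2 with a simple geometric cooling schedule.
random.seed(0)
x, temperature = 5.0, 10.0
for _ in range(2000):
    x_new = x + random.gauss(0.0, 0.5)   # random deviate of the current solution
    if metropolis_accept(x * x, x_new * x_new, temperature):
        x = x_new
    temperature *= 0.995                 # cool toward purely greedy acceptance
```

At high temperature most deviates pass the test, so the search can escape local basins; as the temperature drops the rule degenerates to accepting only improvements.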

2840 | Adaptation in natural and artificial systems - Holland - 1992

Citation Context: ...search with this algorithm. II.C.1 Genetic Algorithms: The GA was initially described using populations of binary strings in {0, 1}^n, which are evaluated by the objective function (fitness function) [42, 31, 57]. When searching spaces other than {0, 1}^n, the objective function decodes the binary string and performs the function evaluation. Holland [42] proposed a selection mechanism that stochastically sel...
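Holland's selection mechanism, truncated in the excerpt above, is commonly implemented as fitness-proportional ("roulette wheel") selection. A hedged sketch, with a toy count-the-ones fitness function standing in for the decoded objective:

```python
import random

def proportional_select(population, fitnesses):
    """Fitness-proportional selection: each individual is chosen with
    probability fitness_i / sum(fitnesses); fitnesses must be >= 0."""
    total = sum(fitnesses)
    r = random.uniform(0.0, total)
    acc = 0.0
    for individual, fit in zip(population, fitnesses):
        acc += fit
        if r < acc:
            return individual
    return population[-1]   # guard against floating-point edge cases

# Binary strings scored by an illustrative fitness (the count of ones).
random.seed(0)
pop = ["0000", "1100", "1111"]
fits = [s.count("1") for s in pop]
picks = [proportional_select(pop, fits) for _ in range(6000)]
# "1111" holds 4 of the 6 total fitness units, so it wins about 2/3 of draws.
```

Individuals with zero fitness are never selected, and selection pressure scales directly with relative fitness, which is the behavior schema-theorem analyses of the GA assume.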

2746 | Learning internal representations by error propagation - Rumelhart, Hinton, et al. - 1986

Citation Context: ...value and the actual y value, y_i. A common error function is the squared error E(a, b) = ||a - b||^2. Examples of parametric models are linear models [19], logit models [13], and neural networks [81]. Both random local search and conjugate gradient methods can be used to minimize J(w), since gradient information is typically available for this function. An alternative method of minimizing J(w) is...
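The squared-error function quoted above can be written directly, and the gradient-based minimization of J(w) it mentions can be sketched with an illustrative one-parameter linear model (the model and data here are assumptions for demonstration, not Hart's):

```python
def squared_error(a, b):
    """E(a, b) = ||a - b||^2, the common error function quoted above."""
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

# Gradient descent on J(w) = sum_i (w * x_i - y_i)^2 for a one-weight
# linear model; gradient information is available in closed form.
xs, ys = [1.0, 2.0, 3.0], [2.0, 4.0, 6.0]    # data generated by y = 2x
w, eta = 0.0, 0.02
for _ in range(200):
    grad = sum(2.0 * (w * x - y) * x for x, y in zip(xs, ys))
    w -= eta * grad                           # steepest-descent step on J(w)
```

After 200 steps w converges to the generating slope 2.0; the same loop structure underlies the conjugate-gradient and stochastic-approximation variants the context lists.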

1981 | Genetic Algorithms + Data Structures = Evolution Programs - Michalewicz - 1994

1789 | Introduction to the theory of neural computation - Hertz, Krogh, et al. - 1991

Citation Context: ...∇_w J(w). Batch BP can be viewed as a simple gradient descent procedure. Unlike BP, the gradient calculation in batch BP is a reliable estimate of the current descent direction. Hertz, Krogh and Palmer [39] note that the relative performance of BP and batch BP is problem dependent, though BP seems superior in many cases. Table VII.1 compares the performance of MC, MS, GA and GA-LS hybrids for the three...
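The BP versus batch BP distinction above is the per-pattern versus full-gradient update. A sketch with an illustrative one-weight linear unit (not Hart's network) makes the difference concrete:

```python
def grad_j(w, x, y):
    """Gradient of the per-pattern squared error (w*x - y)^2 for an
    illustrative one-weight linear unit."""
    return 2.0 * (w * x - y) * x

def online_bp_epoch(w, data, eta):
    """Plain BP: update after every pattern (a noisy descent direction)."""
    for x, y in data:
        w -= eta * grad_j(w, x, y)
    return w

def batch_bp_epoch(w, data, eta):
    """Batch BP: accumulate the full gradient of J(w), then take one
    step -- a reliable estimate of the descent direction, as noted above."""
    total = sum(grad_j(w, x, y) for x, y in data)
    return w - eta * total

data = [(1.0, 3.0), (2.0, 6.0)]        # generated by y = 3x
w_online = w_batch = 0.0
for _ in range(300):
    w_online = online_bp_epoch(w_online, data, 0.05)
    w_batch = batch_bp_epoch(w_batch, data, 0.05)
```

On this noiseless problem both variants converge to w = 3; on noisier problems the per-pattern updates add stochasticity, which is why the cited comparison is problem dependent.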

1363 | Practical Optimization - Gill, Murray, et al. - 1984

Citation Context: ...l computer science and numerical optimization. An important distinction among local search methods concerns whether they minimize in the presence of constraints that restrict the domain of the search [26]. This dissertation examines methods for unconstrained optimization. Theoretical computer science is primarily interested in local search methods over discrete spaces. Johnson, Papadimitriou and Y...

1281 | Combinatorial Optimization: Algorithms and Complexity - Papadimitriou, Steiglitz - 1982

Citation Context: ...n P and NP, and to Gill [25] for an exposition of probabilistic computation. ...attempts to solve as a combinatorial optimization problem DGA-MAX (following the format of Papadimitriou and Steiglitz [72]): Definition 1 (DGA-MAX). The Genetic Algorithm combinatorial maximization problem that (1) uses a deterministic fitness function f and (2) assigns the fitness of the maximally fit individual in a popul...

1179 | Handbook of Genetic Algorithms - Davis, L - 1991

Citation Context: ...For example, if the binary string is decoded into a vector of integers or floating point values, then crossover is often applied only between the integer or floating point values on the binary string [15]. II.C.2 Panmictic and Geographically Structured Genetic Algorithms: GAs can be distinguished by the manner in which the selection mechanism and genetic operators are applied to the population. Panmict...
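The restricted crossover described above, with cut points allowed only between encoded values, can be sketched as follows. The 8-bit-per-field encoding is an illustrative assumption:

```python
import random

BITS_PER_FIELD = 8  # illustrative assumption: each integer uses 8 bits

def field_boundary_crossover(parent_a, parent_b):
    """One-point crossover whose cut point is restricted to boundaries
    between encoded integer fields, as the context above describes."""
    assert len(parent_a) == len(parent_b)
    assert len(parent_a) % BITS_PER_FIELD == 0
    n_fields = len(parent_a) // BITS_PER_FIELD
    cut = random.randrange(1, n_fields) * BITS_PER_FIELD
    return parent_a[:cut] + parent_b[cut:], parent_b[:cut] + parent_a[cut:]

random.seed(0)
a, b = "0" * 24, "1" * 24                 # three 8-bit fields each
c1, c2 = field_boundary_crossover(a, b)
print(c1.index("1") % BITS_PER_FIELD)     # -> 0: the cut lies on a boundary
```

Restricting cuts this way keeps each decoded value intact across crossover, so recombination exchanges whole parameters rather than corrupting their bit patterns.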

870 | An analysis of the behavior of a class of genetic adaptive systems (doctoral dissertation) - De Jong - 1975

Citation Context: ...Optimization Methods. IV.C.1 Floating Point GA: I use a GA with a floating point encoding in the experiments. GAs have traditionally used binary encodings of real numbers to perform optimization on R^n [16]. While binary encodings have been used to successfully solve optimization problems, special manipulation of this encoding is often necessary to increase the efficiency of the algorithm [83, 100]. The...

530 | Handbook of Genetic Algorithms - Syswerda - 1991

Citation Context: ...how and why certain bit patterns (schemata) will be propagated from one generation to the next. This can be used to analyze the effectiveness of different genetic operators (see, for example, Syswerda [89]). Related analysis with Walsh functions has also proven very rewarding. Walsh functions can be used to analyze the effectiveness of genetic operators, as well as analyze the difficulty of the f...

499 | Genetic algorithms with sharing for multimodal function optimization - Goldberg, Richardson - 1987

Citation Context: ...rage number of individuals that are identical to the i-th individual. Second, note that the complete method is closely related to the method of fitness sharing proposed by Goldberg and Richardson [29]. Fitness sharing is a method of inducing niche behavior in GAs that enables the GA to converge to a population that is distributed over several local optima. This method modifies the fitness measure...
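Fitness sharing as described divides each raw fitness by a niche count. A minimal sketch with the common triangular sharing function, where the niche radius sigma_share and the one-dimensional positions are illustrative assumptions:

```python
def shared_fitness(raw, positions, sigma_share=1.0):
    """Goldberg-Richardson-style fitness sharing: divide each raw fitness
    by a niche count built from a triangular sharing function sh(d)."""
    def sh(d):
        return 1.0 - d / sigma_share if d < sigma_share else 0.0

    out = []
    for f_i, x_i in zip(raw, positions):
        niche = sum(sh(abs(x_i - x_j)) for x_j in positions)
        out.append(f_i / niche)   # crowded individuals are penalized
    return out

# Two identical individuals split one niche; the third sits alone.
print(shared_fitness([4.0, 4.0, 4.0], [0.0, 0.0, 5.0]))   # [2.0, 2.0, 4.0]
```

Because clustered individuals see their fitness deflated, selection pressure spreads the population across several optima instead of collapsing onto one, which is exactly the niche behavior the excerpt describes.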

331 | How learning can guide evolution - Hinton, Nowlan - 1987

Citation Context: ...A. This type of GA-LS hybrid is particularly interesting because the global and local search methods can influence each other's behavior. An important example of this phenomenon is the Baldwin effect [6, 40], in which learning in natural systems speeds up the rate of evolutionary change. Similar effects have been observed by a number of authors using GA-LS hybrids [7, 40, 50]. The research in this dissert...

244 | A Connectionist Machine for Genetic Hill-climbing - Ackley - 1987

Citation Context: ...a number of very good heuristics for the local improvement of a solution. Other applications include the mapping problem [64] and molecular conformation problems [49]. Muhlenbein [60, 61, 62], Ackley [1], and McInerney [56] have developed application-independent versions of the GA for optimization with local search. In most of these applications, the performance of the GA is substantially improved wh...

221 | Computational complexity of probabilistic Turing machines - Gill - 1977

Citation Context: ...where B = {0, 1}. We can formalize the problem that the GA attempts to solve as a combinatorial optimization problem DGA-MAX (following the format of Papadimitriou and Steiglitz [72]): Definition 1 (DGA-MAX). T... (Footnote: The reader is referred to Garey and Johnson [24] for an excellent discussion of the complexity differences between P and NP, and to Gill [25] for an exposition of probabilistic computation.)

201 | The royal road for genetic algorithms: Fitness landscapes and GA performance (in: Towards a Practice of Autonomous Systems) - Mitchell, Forrest, et al. - 1991

Citation Context: ...lyses have examined the performance of GAs on classes of functions that are motivated by an analysis of the role of the crossover operator. Forrest and Mitchell [23] and Mitchell, Holland and Forrest [58] have examined the performance of the GA on a subclass of Walsh polynomials. These analyses have yet to make definite predictions of the performance of GAs, but have provided much insight into the way...

199 | Training feedforward neural networks using genetic algorithms - Montana, Davis - 1989

Citation Context: ...the methods using the L_2 metric have this property. Finally, we note that our analysis of GA-LS hybrids may explain the performance that other researchers have observed in their GA-LS hybrids. Davis [59] and Muhlenbein [61] have observed that local search is not needed in the initial stages of the optimization. Our analysis suggests that local search is probably useful for their problems, but is best...

189 | The Parallel Genetic Algorithm as Function Optimizer - Mühlenbein, Schomisch, et al. - 1991

Citation Context: ...type of GA-LS hybrid. I.C Genetic Algorithms with Local Search: Previous experimental results confirm that GA-LS hybrids not only find better solutions than the GA, but also optimize more efficiently [7, 61]. It is noteworthy that these results examine a limited number of algorithmic combinations of the GA with local search. I believe that important algorithmic combinations have been overlooked and that...

183 | An introduction to simulated evolutionary optimization - Fogel - 1994

Citation Context: ...evolutionary mechanisms and are often used to perform global optimization. The exemplars of evolutionary search algorithms are genetic algorithms, evolution strategies, and evolutionary programming [5, 22, 31]. The design and motivation for these algorithms are different, but they incorporate the same basic adaptive components [4, 41]. These methods use a collection of solutions (a population of individuals)...

183 | Very fast simulated re-annealing - Ingber - 1989

Citation Context: ...emperatures, the search is often localized to a single basin of attraction for which there is a low probability of escaping in the near term. For this reason, simulated re-annealing has been proposed [44, 45]. This variant treats simulated annealing more like a local search technique, using multiple starts to perform the global search. II.C Evolutionary Search: Evolutionary search algorithms, called compet...

175 | Evolving Networks: Using the Genetic Algorithm with Connectionist Learning - Belew, McInerney, et al. - 1990

Citation Context: ...phenomenon is the Baldwin effect [6, 40], in which learning in natural systems speeds up the rate of evolutionary change. Similar effects have been observed by a number of authors using GA-LS hybrids [7, 40, 50]. The research in this dissertation examines the second type of GA-LS hybrid. I.C Genetic Algorithms with Local Search: Previous experimental results confirm that GA-LS hybrids not only find better sol...

174 | Convergence analysis of canonical genetic algorithms - Rudolph - 1994

Citation Context: ...e solutions in their local neighborhood. Using these genetic operators, evolutionary search algorithms perform a global search. Global convergence is not guaranteed for all evolutionary algorithms [79], but experiments with these algorithms indicate that they often converge to regions of the search space that contain near-optimal solutions. Global convergence is guaranteed for the type of GAs used...

153 | Genetic algorithms and Walsh functions: Part 1, A gentle introduction - Goldberg - 1989

Citation Context: ...earch dynamics. In particular, much research has been done examining how crossover composes and disrupts patterns in binary strings, based on their contribution to the total fitness of the individual [30, 85, 86, 97]. This research has motivated the use of modified crossover operators that restrict the distribution of crossover points. For example, if the binary string is decoded into a vector of integers or floa...

150 | Local optimization and the traveling salesman problem - Johnson - 1990

Citation Context: ...timization problems that has met with empirical success is local (or neighborhood) search." For example, local search methods have proven very successful for the celebrated Traveling Salesman problem [47]. A number of authors have performed general analyses of local search methods over discrete spaces. Tovey [92, 93] models the expected performance of local search algorithms that optimize real-valued...

147 | A general framework for parallel distributed processing - Rumelhart, Hinton, et al. - 1986

Citation Context: ...ted annealing, the optimization method used by Goodsell and Olson [34]. VII.A Neural Networks: Neural networks are simple parametric models that are thought to loosely model biological nervous systems [80]. While there are a variety of types of neural networks, I have examined feedforward neural networks, which perform a deterministic mapping from a set of inputs to a set of outputs [81]. Figure VII.1...

142 | Learning and evolution in neural networks - Nolfi, Elman, et al. - 1994

Citation Context: ...itness associated with g. II.D Related Work: GAs have been combined with local search methods for a number of different applications. The problem of finding the optimal parameters for a neural network [7, 50, 68] comes closest to the models of learning and evolution. GA-LS hybrids have been applied to combinatorial graph problems like the traveling salesman problem [9, 63, 95] and the graph partitioning probl...

128 | Selection in massively parallel genetic algorithms - Collins, Jefferson - 1991

Citation Context: ...e genetic operators are applied to individuals selected from these subsets. The most common way of structuring the selection mechanism uses a toroidal two-dimensional grid like the one in Figure II.4 [2, 12, 56, 87]. Every element of the population is assigned to a location on the grid. (Figure II.4: The two-dimensional grid used by GSGAs to define population subsets.) The grid locations are not necessarily rel...
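The toroidal two-dimensional grid that localizes selection in a GSGA can be sketched with wrap-around neighborhoods. The four-cell von Neumann neighborhood below is an illustrative assumption; GSGA variants differ in neighborhood shape and size:

```python
def toroidal_neighbors(row, col, n_rows, n_cols):
    """Grid locations adjacent to (row, col) on a toroidal 2-D grid, so
    that a GSGA's selection is restricted to a local population subset."""
    return [((row - 1) % n_rows, col),
            ((row + 1) % n_rows, col),
            (row, (col - 1) % n_cols),
            (row, (col + 1) % n_cols)]

# A corner cell wraps around to the opposite edges of the grid.
print(toroidal_neighbors(0, 0, 4, 4))   # [(3, 0), (1, 0), (0, 3), (0, 1)]
```

Because every cell has the same number of neighbors and no boundary, selection and mating can proceed identically at every grid location, which suits the SIMD implementations the context mentions.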

125 | Serial and parallel genetic algorithms as function optimizers - Gordon, Whitley - 1993

Citation Context: ...ally structured genetic algorithms (GSGAs). The SIMD GAs examined by McInerney and others are called GSGAs because they spatially structure the adaptive search performed by the GA. Gordon and Whitley [35] have recently argued that the algorithmic nature of GSGAs may be of interest independent of their implementation on a particular architecture and observe that their performance is competitive with ot...

122 | A Collection of Test Problems for Constrained Global Optimization Algorithms - Floudas, Pardalos - 1990

Citation Context: ...nables us to make comparisons with algorithms developed in the global optimization literature. Most problems in the testbeds used to evaluate GAs and global optimization algorithms are defined on R^n [1, 21, 31, 91]. Thus, I evaluate GA-LS hybrids on problems for which we can directly compare my results to other global optimization and evolutionary methods. The experiments in Chapters V and VI perform optimiza...

121 | Allocating independent subtasks on parallel processors - Kruskal, Weiss - 1985

Citation Context: ...a normal random variable, Y', with mean mu and standard deviation sigma, where mu = P(T_f + T_gen + T_ls) and sigma = T_ls * sqrt((1 - lambda) P). Applying the approximation to Y'_{n:n} used in Kruskal and Weiss [53], we have E(T^1_p(k)) <= O( k T_flop P (T_f + T_gen + T_ls) + k T_flop T_ls sqrt(2 (1 - lambda) P log p) + k T^1_comm ) (VI.5). To analyze A_3, we compare the maximum length of each process independently. Sin...
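The Kruskal-Weiss-style approximation invoked above bounds the expected maximum of n i.i.d. normal variables by roughly mu + sigma * sqrt(2 ln n). A quick Monte Carlo check with illustrative parameters (not Hart's T_f, T_gen, T_ls values):

```python
import math
import random

def approx_max_normal(mu, sigma, n):
    """Upper estimate for E[max of n i.i.d. N(mu, sigma^2) draws]:
    mu + sigma * sqrt(2 ln n), the order-statistic approximation style
    used for per-processor running times."""
    return mu + sigma * math.sqrt(2.0 * math.log(n))

random.seed(0)
mu, sigma, n, trials = 10.0, 2.0, 64, 2000
empirical = sum(max(random.gauss(mu, sigma) for _ in range(n))
                for _ in range(trials)) / trials
estimate = approx_max_normal(mu, sigma, n)
print(mu < empirical < estimate)   # the estimate upper-bounds the true mean
```

This is how the sqrt(2 (1 - .) P log p) term arises in the running-time bound: the slowest of p processes behaves like the maximum of p near-normal completion times.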

115 | Minimization by Random Search Techniques - Solis, Wets - 1981

Citation Context: ...utions fast enough to compensate for the additional expense. Three methods of local search will be used throughout this dissertation. The first is the non-derivative method proposed by Solis and Wets [84]. Next, conjugate gradient methods [26, 74] are used to minimize continuous functions using gradient information. Finally, stochastic approximation is used in pattern recognition methods to find th...
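The non-derivative Solis-Wets method cited above samples random deviates around the current point and keeps improvements, also testing the reflected step. This is a simplified sketch; the published algorithm additionally adapts a bias vector and the step size, which are omitted here:

```python
import random

def random_local_search(f, x, sigma=0.5, iters=500, seed=0):
    """Simplified Solis-Wets-style random local search: sample a Gaussian
    deviate, try it and its reflection, keep whichever improves f.
    (The adaptive bias and step-size rules of the full algorithm are
    omitted; this is a sketch, not the published method.)"""
    rng = random.Random(seed)
    fx = f(x)
    for _ in range(iters):
        step = [rng.gauss(0.0, sigma) for _ in x]
        for cand in ([xi + si for xi, si in zip(x, step)],
                     [xi - si for xi, si in zip(x, step)]):
            fc = f(cand)
            if fc < fx:              # accept only improving moves
                x, fx = cand, fc
                break
    return x, fx

sphere = lambda v: sum(t * t for t in v)    # illustrative smooth objective
x_best, f_best = random_local_search(sphere, [3.0, -4.0])
```

Because it needs only function evaluations, this kind of local search plugs directly into a GA-LS hybrid where gradients are unavailable.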

109 | Evolution in time and space - the parallel genetic algorithm (in: Foundations of Genetic Algorithms) - Muhlenbein - 1991

Citation Context: ...n of GSGAs includes GAs which structure the selection at a fine granularity. A number of GAs have been proposed whose competitive selection is intermediate between GSGAs and panmictic GAs. Muhlenbein [63] makes a similar distinction and describes a GA which uses a set of independent subpopulations and structures the inter-population communication with a ladder structure. These subpopulations are typic...

99 | Genetic algorithms and very fast simulated reannealing: A comparison (Mathematical and Computer Modelling 16) - Ingber, Rosen - 1992

Citation Context: ...emperatures, the search is often localized to a single basin of attraction for which there is a low probability of escaping in the near term. For this reason, simulated re-annealing has been proposed [44, 45]. This variant treats simulated annealing more like a local search technique, using multiple starts to perform the global search. II.C Evolutionary Search: Evolutionary search algorithms, called compet...

86 | Evolution algorithms in combinatorial optimization - Muhlenbein, Gorges-Schleuter, et al. - 1988

Citation Context: ...rs because there are a number of very good heuristics for the local improvement of a solution. Other applications include the mapping problem [64] and molecular conformation problems [49]. Muhlenbein [60, 61, 62], Ackley [1], and McInerney [56] have developed application-independent versions of the GA for optimization with local search. In most of these applications, the performance of the GA is substantially...

86 | Dynamic Parameter Encoding for Genetic Algorithms - Schraudolph, Belew - 1992

Citation Context: ...on on R^n [16]. While binary encodings have been used to successfully solve optimization problems, special manipulation of this encoding is often necessary to increase the efficiency of the algorithm [83, 100]. There is evidence that optimization on R^n can and should be performed with real parameters. Goldberg [27] provides formal arguments that floating point GAs manipulate virtual alphabets, a type of s...

59 | A massively parallel genetic algorithm: Implementation and first analysis - Spiessens, Manderick - 1991

Citation Context: ...e genetic operators are applied to individuals selected from these subsets. The most common way of structuring the selection mechanism uses a toroidal two-dimensional grid like the one in Figure II.4 [2, 12, 56, 87]. Every element of the population is assigned to a location on the grid. (Figure II.4: The two-dimensional grid used by GSGAs to define population subsets.) The grid locations are not necessarily rel...

55 | Evolution, learning and culture: Computational metaphors for adaptive algorithms - Belew - 1990

Citation Context: ...A. This type of GA-LS hybrid is particularly interesting because the global and local search methods can influence each other's behavior. An important example of this phenomenon is the Baldwin effect [6, 40], in which learning in natural systems speeds up the rate of evolutionary change. Similar effects have been observed by a number of authors using GA-LS hybrids [7, 40, 50]. The research in this dissert...

52 | On solving traveling salesman problems by genetic algorithms - Braun - 1991

Citation Context: ...parameters for a neural network [7, 50, 68] comes closest to the models of learning and evolution. GA-LS hybrids have been applied to combinatorial graph problems like the traveling salesman problem [9, 63, 95] and the graph partitioning problem [96]. These problems lend themselves to the use of local search operators because there are a number of very good heuristics for the local improvement of a solution...

49 | A computational procedure for determining energetically favorable binding sites on biologically important macromolecules - Goodford - 1985

Citation Context: ...ocking conformations were performed using the Autodock software developed by Olson et al. [70]. The conformation energy was evaluated using molecular affinity potentials, as described by Goodford [33]. The macromolecule is embedded in a three-dimensional grid, and the energy of interaction is calculated for different atom types at every location of the grid. These energies are stored in tables that...

44 | Extended selection mechanisms in genetic algorithms - Back, Hoffmeister - 1991

Citation Context: ...etic algorithms, evolution strategies, and evolutionary programming [5, 22, 31]. The design and motivation for these algorithms are different, but they incorporate the same basic adaptive components [4, 41]. These methods use a collection of solutions (a population of individuals) that are updated iteratively using selection mechanisms and genetic operators. The general process of each iteration (generati...

42 | A Robust Parallel Programming Model for Dynamic, Non-Uniform Scientific Computation - Kohn, Baden - 1994

Citation Context: ...I implemented a GSGA and evaluated its performance on the Intel Paragon at the San Diego Supercomputer Center. The GSGA was implemented using the MP++ and LPARX routines described in Kohn and Baden [52]. The LPARX routines were used to implement the inter-process communication in the globally synchronous GSGA, and were modified to perform inter-process communication in the locally synchronous and as...

37 | How Easy is Local Search? - Johnson, Papadimitriou, Yannakakis - 1988

Citation Context: ...sertation examines methods for unconstrained optimization. Theoretical computer science is primarily interested in local search methods over discrete spaces. Johnson, Papadimitriou and Yannakakis [48] observe that "One of the few general approaches to difficult combinatorial optimization problems that has met with empirical success is local (or neighborhood) search." For example, local search meth...

35 | Stochastic global optimization methods, part I: Clustering methods - Rinnooy Kan, Timmer - 1987

Citation Context: ...hms that have been used with these methods, including standard hierarchical methods. Clustering methods are amenable to analysis because they use uniformly distributed samples. Rinnooy Kan and Timmer [77, 78] describe a clustering method and describe conditions under which any local minimum will be found within a finite number of iterations with probability one. One drawback of cluster methods is that the...

34 | Principles of Population Genetics (Sinauer Associates) - Hartl - 1980

Citation Context: ...even when only a fraction of the population is applying learning methods. The distribution-based methods of adapting the local search frequency are reminiscent of the effects of inbreeding depression [38], and may be useful for studying the effects of inbreeding on learning in natural systems. Inbreeding depression refers to the detrimental effects of inbreeding, which is indicated by a high F statist...

31 | Optimization with Genetic Algorithm Hybrids that Use Local Search (in: Adaptive Individuals in Evolving Populations: Models and Algorithms) - Hart, Belew - 1996

Citation Context: ...and V.5b compare the performance of MC, MS-SW and the three GA-SW hybrids with different frequencies of local search. (Footnote 1: The results reported here are an extension of those reported in Hart and Belew [37].) [Plot residue omitted; the legends list MC, MS-CG, GA, and GA-CG hybrids with local search frequencies 0.0625, 0.25, and 1.0.]

30 | Simulated annealing - an annotated bibliography - Collins, Eglese, et al. - 1988

Citation Context: ...a method of optimization inspired by an analogy between a physical annealing process for obtaining low-energy states and the process of solving for minimal solutions to discrete optimization problems [11, 51]. SA sequentially generates random deviates of the current solution that are accepted if a probabilistic test is passed. Suppose x' is the current solution and let x'' be the new deviate. If f(x''...

25 | Optimizing an Arbitrary Function is Hard for the Genetic Algorithm - Hart, Belew - 1991