Results 1–10 of 60
Evolutionary computation: Comments on the history and current state
IEEE Transactions on Evolutionary Computation, 1997
Cited by 207 (0 self)
Abstract — Evolutionary computation has started to receive significant attention during the last decade, although the origins can be traced back to the late 1950s. This article surveys the history as well as the current state of this rapidly growing field. We describe the purpose, the general structure, and the working principles of different approaches, including genetic algorithms (GA) [with links to genetic programming (GP) and classifier systems (CS)], evolution strategies (ES), and evolutionary programming (EP) by analysis and comparison of their most important constituents (i.e., representations, variation operators, reproduction, and selection mechanism). Finally, we give a brief overview of the manifold of application domains, although this necessarily must remain incomplete. Index Terms — Classifier systems, evolution strategies, evolutionary computation, evolutionary programming, genetic algorithms,
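The constituents this abstract enumerates (representation, selection, variation operators, reproduction) can be illustrated with a minimal sketch. The OneMax toy problem and every parameter value below are illustrative assumptions, not taken from the paper:

```python
import random

def onemax_ga(n_bits=30, pop_size=40, generations=60,
              p_crossover=0.9, p_mutation=0.02, seed=0):
    """Tiny generational GA on OneMax (maximize the number of 1 bits).
    Representation: bitstrings; selection: tournament; variation:
    one-point crossover + bit-flip mutation; reproduction: generational."""
    rng = random.Random(seed)
    fitness = sum  # fitness of a bitstring = number of 1 bits
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def tournament(k=3):
        # pick the fittest of k randomly sampled individuals
        return max(rng.sample(pop, k), key=fitness)

    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            a, b = tournament(), tournament()
            if rng.random() < p_crossover:       # one-point crossover
                cut = rng.randrange(1, n_bits)
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            for child in (a, b):
                # independent bit-flip mutation on each gene
                child = [bit ^ (rng.random() < p_mutation) for bit in child]
                nxt.append(child)
        pop = nxt[:pop_size]
    return max(pop, key=fitness)
```

With these settings the population reliably discovers near-optimal bitstrings; any of the named mechanisms (e.g. tournament size, crossover operator) can be swapped to mimic the GA/ES/EP variants the survey compares.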
On Evolution, Search, Optimization, Genetic Algorithms and Martial Arts: Towards Memetic Algorithms
1989
Cited by 186 (10 self)
Short abstract, isn't it? P.A.C.S. numbers 05.20, 02.50, 87.10. 1 Introduction: Large Numbers. "...the optimal tour displayed (see Figure 6) is the possible unique tour having one arc fixed from among 10^655 tours that are possible among 318 points and have one arc fixed. Assuming that one could possibly enumerate 10^9 tours per second on a computer it would thus take roughly 10^639 years of computing to establish the optimality of this tour by exhaustive enumeration." This quote shows the real difficulty of a combinatorial optimization problem. The huge number of configurations is the primary difficulty when dealing with one of these problems. The quote belongs to M. W. Padberg and M. Grötschel, Chap. 9, "Polyhedral computations," from the book The Traveling Salesman Problem: A Guided Tour of Combinatorial Optimization [124]. It is interesting to compare the number of configurations of real-world problems in combinatorial optimization with those large numbers arising in Cosmol...
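A quick sanity check of the quoted figure, assuming the stated rate of 10^9 tours per second and roughly 3.16 × 10^7 seconds per year:

```python
import math

# ~10^655 tours with one arc fixed, enumerated at ~10^9 tours per second.
log10_tours = 655
log10_rate = 9
seconds_per_year = 365.25 * 24 * 3600            # ~3.156e7 seconds
log10_years = log10_tours - log10_rate - math.log10(seconds_per_year)
# 655 - 9 - 7.5 ≈ 638.5, i.e. "roughly 10^639 years", matching the quote
```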
On the Convergence of Pattern Search Algorithms
Cited by 149 (14 self)
We introduce an abstract definition of pattern search methods for solving nonlinear unconstrained optimization problems. Our definition unifies an important collection of optimization methods that neither compute nor explicitly approximate derivatives. We exploit our characterization of pattern search methods to establish a global convergence theory that does not enforce a notion of sufficient decrease. Our analysis is possible because the iterates of a pattern search method lie on a scaled, translated integer lattice. This allows us to relax the classical requirements on the acceptance of the step, at the expense of stronger conditions on the form of the step, and still guarantee global convergence. Key words: unconstrained optimization, convergence analysis, direct search methods, globalization strategies, alternating variable search, axial relaxation, local variation, coordinate search, evolutionary operation, pattern search, multidirectional search, downhill simplex search. AMS(M...
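As a rough illustration of the class of methods the abstract describes, here is a minimal compass (coordinate) search whose iterates stay on a scaled, translated integer lattice and which accepts any simple decrease. This is a generic sketch of the idea, not the paper's exact formulation:

```python
def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=10_000):
    """Minimal compass search: poll +/- each coordinate direction,
    accept any simple (not sufficient) decrease, otherwise halve the step.
    Every iterate lies on the lattice {x0 + step * z : z integer vector},
    rescaled each time the step is halved; this lattice structure is what
    the convergence analysis exploits."""
    x, fx = list(x0), f(x0)
    n = len(x)
    it = 0
    while step > tol and it < max_iter:
        it += 1
        improved = False
        for i in range(n):
            for s in (+1, -1):
                y = list(x)
                y[i] += s * step            # trial point on the current mesh
                fy = f(y)
                if fy < fx:                 # simple decrease is enough
                    x, fx, improved = y, fy, True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5                     # refine the mesh and re-poll
    return x, fx
```

Note that no derivative is ever computed or approximated; only function values at pattern points are compared.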
Optimization by direct search: New perspectives on some classical and modern methods
SIAM Review, 2003
Cited by 126 (14 self)
Abstract. Direct search methods are best known as unconstrained optimization techniques that do not explicitly use derivatives. Direct search methods were formally proposed and widely applied in the 1960s but fell out of favor with the mathematical optimization community by the early 1970s because they lacked coherent mathematical analysis. Nonetheless, users remained loyal to these methods, most of which were easy to program, some of which were reliable. In the past fifteen years, these methods have seen a revival due, in part, to the appearance of mathematical analysis, as well as to interest in parallel and distributed computing. This review begins by briefly summarizing the history of direct search methods and considering the special properties of problems for which they are well suited. Our focus then turns to a broad class of methods for which we provide a unifying framework that lends itself to a variety of convergence results. The underlying principles allow generalization to handle bound constraints and linear constraints. We also discuss extensions to problems with nonlinear constraints.
Direct Search Methods On Parallel Machines
SIAM Journal on Optimization, 1991
Cited by 111 (22 self)
This paper describes an approach to constructing derivative-free algorithms for unconstrained optimization that are easy to implement on parallel machines. A special feature of this approach is the ease with which algorithms can be generated to take advantage of any number of processors and to adapt to any cost ratio of communication to function evaluation. Numerical tests show speedups on two fronts. The cost of synchronization being minimal, the speedup is almost linear with the addition of more processors, i.e., given a problem and a search strategy, the decrease in execution time is proportional to the number of processors added. Even more encouraging, however, is that different search strategies, devised to take advantage of additional (or more powerful) processors, may actually lead to dramatic improvements in the performance of the basic algorithm. Thus search strategies intended for many processors actually may generate algorithms that are better even when implemented seque...
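One way the poll step of such a direct search parallelizes is to farm out all trial-point evaluations at once and then take the best improvement. The sketch below uses Python threads and a coordinate pattern purely as a schematic illustration; it is not the paper's algorithm:

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_poll_step(f, x, step, n_workers=4):
    """One poll step of a compass-style direct search, with all 2n trial
    points evaluated concurrently. With p processors, the wall-clock cost
    of a poll is roughly ceil(2n / p) function-evaluation times instead
    of 2n, which is the source of the near-linear speedup."""
    n = len(x)
    trials = []
    for i in range(n):
        for s in (+1.0, -1.0):
            y = list(x)
            y[i] += s * step
            trials.append(y)
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        values = list(pool.map(f, trials))      # evaluate poll points in parallel
    best_i = min(range(len(trials)), key=values.__getitem__)
    if values[best_i] < f(x):
        return trials[best_i], values[best_i]   # move to the improving point
    return x, f(x)                              # no improvement: caller refines step
```

Threads only pay off when `f` releases the GIL (e.g. an external simulation); on CPU-bound pure-Python objectives a process pool would be the natural substitute.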
An Overview of Evolutionary Computation
1993
Cited by 106 (5 self)
Evolutionary computation uses computational models of evolutionary processes as key elements in the design and implementation of computer-based problem solving systems. In this paper we provide an overview of evolutionary computation, and describe several evolutionary algorithms that are currently of interest. Important similarities and differences are noted, which lead to a discussion of important issues that need to be resolved, and items for future research.
Direct Search Methods: Once Scorned, Now Respectable
In Numerical Analysis 1995 (Proceedings of the 1995 Dundee Biennial Conference in Numerical Analysis), 1995
Cited by 83 (3 self)
The need to optimize a function whose derivatives are unknown or nonexistent arises in many contexts, particularly in real-world applications. Various direct search methods, most notably the Nelder-Mead 'simplex' method, were proposed in the early 1960s for such problems, and have been enormously popular with practitioners ever since. Nonetheless, for more than twenty years these methods were typically dismissed or ignored in the mainstream optimization literature, primarily because of the lack of rigorous convergence results. Since 1989, however, direct search methods have been rejuvenated and made respectable. This paper summarizes the history of direct search methods, with special emphasis on the Nelder-Mead method, and describes recent work in this area. This paper is based on a plenary talk given at the Biennial Dundee Conference on Numerical Analysis, Dundee, Scotland, 1995. 1. Introduction. Unconstrained optimization, the problem of minimizing a nonlinear function f(x) for x ∈ ...
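For illustration, here is a simplified Nelder-Mead simplex iteration (reflection, expansion, inside contraction, shrink). The coefficient values are the conventional defaults, and this sketch omits the outside-contraction case that fuller presentations include:

```python
def nelder_mead(f, simplex, iters=300, alpha=1.0, gamma=2.0, rho=0.5, sigma=0.5):
    """Simplified Nelder-Mead for n variables.
    simplex: list of n+1 points (lists of floats); copied, not mutated."""
    simplex = [list(p) for p in simplex]
    n = len(simplex) - 1
    for _ in range(iters):
        simplex.sort(key=f)                       # best first, worst last
        centroid = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        worst = simplex[-1]
        # Reflection: mirror the worst vertex through the centroid
        xr = [c + alpha * (c - w) for c, w in zip(centroid, worst)]
        if f(simplex[0]) <= f(xr) < f(simplex[-2]):
            simplex[-1] = xr
        elif f(xr) < f(simplex[0]):
            # Expansion: push further along the reflection direction
            xe = [c + gamma * (r - c) for c, r in zip(centroid, xr)]
            simplex[-1] = xe if f(xe) < f(xr) else xr
        else:
            # Inside contraction: pull the worst vertex toward the centroid
            xc = [c + rho * (w - c) for c, w in zip(centroid, worst)]
            if f(xc) < f(worst):
                simplex[-1] = xc
            else:
                # Shrink the whole simplex toward the best vertex
                best = simplex[0]
                simplex = [best] + [[b + sigma * (p - b) for b, p in zip(best, pt)]
                                    for pt in simplex[1:]]
    return min(simplex, key=f)
```

As with the other direct search methods above, only function-value comparisons drive the iteration; no derivatives appear anywhere.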
Real-coded Genetic Algorithms, Virtual Alphabets, and Blocking
Complex Systems, 1990
Cited by 78 (7 self)
This paper presents a theory of convergence for real-coded genetic algorithms (GAs) that use floating-point or other high-cardinality codings in their chromosomes. The theory is consistent with the theory of schemata and postulates that selection dominates early GA performance and restricts subsequent search to intervals with above-average function value, dimension by dimension. These intervals may be further subdivided on the basis of their attraction under genetic hill climbing. Each of these subintervals is called a virtual character, and the collection of characters along a given dimension is called a virtual alphabet. It is the virtual alphabet that is searched during the recombinative phase of the genetic algorithm, and in many problems this is sufficient to ensure that good solutions are found. Although the theory helps suggest why many problems have been solved using real-coded GAs, it also suggests that real-coded GAs can be blocked from further progress in those situations whe...
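A hypothetical real-coded GA along these lines, with floating-point chromosomes searched interval by interval rather than via a binary coding. The blend-style crossover, the Gaussian mutation scheme, and the test problem are illustrative choices, not the paper's:

```python
import random

def real_ga(f, bounds, pop_size=60, generations=80, seed=1):
    """Sketch of a real-coded GA: floating-point chromosomes, tournament
    selection, blend crossover (child sampled inside the parents'
    per-dimension interval), Gaussian mutation, and elitism.
    Minimizes f over the box given by bounds = [(lo, hi), ...]."""
    rng = random.Random(seed)
    lo = [b[0] for b in bounds]
    hi = [b[1] for b in bounds]
    n = len(bounds)
    pop = [[rng.uniform(lo[i], hi[i]) for i in range(n)] for _ in range(pop_size)]

    def clip(x):  # keep chromosomes inside the box
        return [min(max(x[i], lo[i]), hi[i]) for i in range(n)]

    def tournament(k=3):
        return min(rng.sample(pop, k), key=f)

    for _ in range(generations):
        nxt = [min(pop, key=f)]                   # elitism: carry the best over
        while len(nxt) < pop_size:
            a, b = tournament(), tournament()
            # blend crossover: sample each gene within the parents' interval,
            # so recombination searches intervals, dimension by dimension
            child = [rng.uniform(min(ai, bi), max(ai, bi)) for ai, bi in zip(a, b)]
            # occasional Gaussian mutation, scaled to the box width
            child = [g + rng.gauss(0, 0.05 * (hi[i] - lo[i]))
                     if rng.random() < 0.1 else g
                     for i, g in enumerate(child)]
            nxt.append(clip(child))
        pop = nxt
    return min(pop, key=f)
```

Because crossover here only samples inside the parents' hull, selection rapidly narrows search to above-average intervals, which is exactly the dynamic the virtual-alphabet theory analyzes (and the source of the blocking it warns about).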
Pattern Search Algorithms for Bound Constrained Minimization
ICASE, NASA Langley Research Center, 1996
Cited by 70 (16 self)
We present a convergence theory for pattern search methods for solving bound constrained nonlinear programs. The analysis relies on the abstract structure of pattern search methods and an understanding of how the pattern interacts with the bound constraints. This analysis makes it possible to develop pattern search methods for bound constrained problems while only slightly restricting the flexibility present in pattern search methods for unconstrained problems. We prove global convergence despite the fact that pattern search methods do not have explicit information concerning the gradient and its projection onto the feasible region and consequently are unable to enforce explicitly a notion of sufficient feasible decrease.
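A minimal way to adapt a compass-style pattern search to bound constraints, consistent with the idea that no gradient (or projected-gradient) information is available, is to project infeasible poll points back onto the box. This is a generic sketch of that idea, not the paper's method:

```python
def bounded_compass_search(f, x0, lo, hi, step=1.0, tol=1e-6, max_iter=10_000):
    """Compass search under bound constraints: any poll point that leaves
    the box [lo, hi] is projected back onto it, so every iterate stays
    feasible; the method still uses only function-value comparisons."""
    def project(x):
        return [min(max(xi, li), ui) for xi, li, ui in zip(x, lo, hi)]

    x = project(list(x0))
    fx = f(x)
    it = 0
    while step > tol and it < max_iter:
        it += 1
        improved = False
        for i in range(len(x)):
            for s in (+1, -1):
                y = list(x)
                y[i] += s * step
                y = project(y)              # keep the trial point feasible
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5                     # refine the mesh near the boundary
    return x, fx
```

When the unconstrained minimizer lies outside the box, the iterates settle on the boundary, as in the test below where the minimum of (x-3)^2 over [0, 2] is attained at x = 2.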
Direct search methods: then and now
2000
Cited by 66 (4 self)
We discuss direct search methods for unconstrained optimization. We give a modern perspective on this classical family of derivative-free algorithms, focusing on the development of direct search methods during their golden age from 1960 to 1971. We discuss how direct search methods are characterized by the absence of the construction of a model of the objective. We then consider a number of the classical direct search methods and discuss what research in the intervening years has uncovered about these algorithms. In particular, while the original direct search methods were consciously based on straightforward heuristics, more recent analysis has shown that in most, but not all, cases these heuristics actually ...