Results 1–10 of 40
On Evolution, Search, Optimization, Genetic Algorithms and Martial Arts: Towards Memetic Algorithms
1989
Abstract

Cited by 186 (10 self)
Short abstract, isn't it? P.A.C.S. numbers 05.20, 02.50, 87.10. 1 Introduction: Large Numbers. "...the optimal tour displayed (see Figure 6) is the unique optimal tour from among the 10^655 tours that are possible among 318 points and have one arc fixed. Assuming that one could possibly enumerate 10^9 tours per second on a computer it would thus take roughly 10^639 years of computing to establish the optimality of this tour by exhaustive enumeration." This quote shows the real difficulty of a combinatorial optimization problem: the huge number of configurations is the primary difficulty when dealing with one of these problems. The quote belongs to M. W. Padberg and M. Grötschel, Chap. 9, "Polyhedral Computations", from the book The Traveling Salesman Problem: A Guided Tour of Combinatorial Optimization [124]. It is interesting to compare the number of configurations of real-world problems in combinatorial optimization with those large numbers arising in Cosmol...
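The enumeration arithmetic quoted above is easy to sanity-check. The short script below is an illustrative order-of-magnitude check, not taken from the paper; it counts (n-1)!/2 distinct tours on n points, so its figures land within a power or two of ten of the quoted ones depending on how the fixed arc is conventioned.

```python
import math

# Order-of-magnitude check of the quoted figures (illustrative only).
n = 318
# log10 of the number of distinct tours, (n-1)!/2, via the log-gamma function
log10_tours = math.lgamma(n) / math.log(10) - math.log10(2)

rate = 1e9                          # tours enumerated per second
seconds_per_year = 3600 * 24 * 365
log10_years = log10_tours - math.log10(rate) - math.log10(seconds_per_year)

print(f"~10^{log10_tours:.0f} tours, ~10^{log10_years:.0f} years of enumeration")
```

Either way, exhaustive enumeration is hopeless by hundreds of orders of magnitude, which is the point of the quote.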
Global Optimization of Statistical Functions with Simulated Annealing
Journal of Econometrics, 1994
Abstract

Cited by 126 (1 self)
Many statistical methods rely on numerical optimization to estimate a model’s parameters. Unfortunately, conventional algorithms sometimes fail. Even when they do converge, there is no assurance that they have found the global, rather than a local, optimum. We test a new optimization algorithm, simulated annealing, on four econometric problems and compare it to three common conventional algorithms. Not only can simulated annealing find the global optimum, it is also less likely to fail on difficult functions because it is a very robust algorithm. The promise of simulated annealing is demonstrated on the four econometric problems.
Filter Pattern Search Algorithms for Mixed Variable Constrained Optimization Problems
SIAM Journal on Optimization, 2004
Abstract

Cited by 37 (8 self)
A new class of algorithms for solving nonlinearly constrained mixed variable optimization problems is presented. This class combines and extends the Audet-Dennis Generalized Pattern Search (GPS) algorithms for bound constrained mixed variable optimization, and their GPS-filter algorithms for general nonlinear constraints. In generalizing existing algorithms, new theoretical convergence results are presented that reduce seamlessly to existing results for more specific classes of problems. While no local continuity or smoothness assumptions are required to apply the algorithm, a hierarchy of theoretical convergence results based on the Clarke calculus is given, in which local smoothness dictates what can be proved about certain limit points generated by the algorithm. To demonstrate its usefulness, the algorithm is applied to the design of a load-bearing thermal insulation system. We believe this is the first algorithm with provable convergence results to directly target this class of problems.
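Pattern search methods of the GPS family poll a fixed set of directions around the current iterate and shrink the mesh when no poll point improves. The sketch below is a toy compass search illustrating that idea, not the Audet-Dennis algorithm or its filter extension for nonlinear constraints:

```python
def compass_search(f, x, step=1.0, tol=1e-6):
    """Derivative-free pattern search: poll the 2n compass directions
    +/- step * e_i, move to the first improving point, and halve the
    step size when the whole poll fails."""
    n = len(x)
    fx = f(x)
    while step > tol:
        improved = False
        for i in range(n):
            for s in (step, -step):
                y = list(x)
                y[i] += s
                fy = f(y)
                if fy < fx:          # improving poll point: accept and re-poll
                    x, fx = y, fy
                    improved = True
                    break
            if improved:
                break
        if not improved:             # unsuccessful poll: refine the mesh
            step *= 0.5
    return x, fx

# minimize a smooth convex test function from the origin
x_min, f_min = compass_search(lambda v: (v[0] - 1) ** 2 + (v[1] - 1) ** 2,
                              [0.0, 0.0])
```

No derivatives are evaluated anywhere, which is what makes this family attractive for the mixed variable, black-box problems the paper targets.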
Simulated Annealing Algorithms for Continuous Global Optimization
2000
Abstract

Cited by 30 (1 self)
INTRODUCTION In this paper we consider Simulated Annealing algorithms (SA in what follows) applied to continuous global optimization problems, i.e. problems of the form f* = min_{x ∈ X} f(x), (1.1) where X ⊆ R^n is a continuous domain, often assumed to be compact, which, combined with the continuity or lower semicontinuity of f, guarantees the existence of the minimum value f*. SA algorithms are based on an analogy with a physical phenomenon: while at high temperatures the molecules in a liquid move freely, if the temperature is slowly decreased the thermal mobility of the molecules is lost and they form a pure crystal, which also corresponds to a state of minimum energy. If the temperature is decreased too quickly (the so-called quenching), the liquid metal instead ends up in a polycrystalline or amorphous state with
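The annealing analogy translates into a Metropolis acceptance rule with a slowly decreasing temperature. A minimal continuous-SA sketch follows; the geometric cooling schedule, Gaussian proposal, and all constants are arbitrary illustrative choices, not the schedules analyzed in the paper:

```python
import math
import random

def simulated_annealing(f, x0, lo, hi, t0=1.0, cooling=0.995, n_iter=20000):
    """Minimize f on the interval [lo, hi] with geometric cooling."""
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(n_iter):
        # propose a random neighbor, with step size shrinking as t cools
        y = min(hi, max(lo, x + random.gauss(0.0, 1.0) * t))
        fy = f(y)
        # accept downhill moves always, uphill moves with Metropolis probability
        if fy < fx or random.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling  # slow cooling mimics careful crystallization
    return best_x, best_f

# a multimodal 1-D test function with several local minima
random.seed(0)
g = lambda x: x * x + 2.0 * math.sin(5.0 * x) + 2.0
x_best, f_best = simulated_annealing(g, x0=4.0, lo=-5.0, hi=5.0)
```

The uphill acceptances at high temperature are what let the chain escape the local minima that would trap a pure descent method; quenching (cooling too fast) freezes it in one of them, exactly as in the physical analogy above.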
Functional Stability Analysis of Numerical Algorithms
1990
Abstract

Cited by 28 (0 self)
Contents
Table of Contents (v); List of Tables (x); List of Figures (xi)
1. Introduction (1)
1.1 Detecting Instability in Numerical Algorithms (1)
1.2 Overview of Functional Stability Analysis (2)
1.3 Results (4)
1.4 Organization (5)
2. Theoretical Background (7)
2.1 Problems and Conditioning (8)
2.1.1 Definitions (8)
2.1.2 Problems and Conditioning (9)
2.1.3 Alternative Treatments and Descriptions (12)
2.2 Approximations and Stability (12)
2.2.1 Definitions ...
Massively Parallel Simulated Annealing and its Relation to Evolutionary Algorithms
Evolutionary Computation, 1994
Abstract

Cited by 22 (2 self)
Simulated annealing and single-trial versions of evolution strategies possess a close relationship when they are designed for optimization over continuous variables. Analytical investigations of their differences and similarities lead to a cross-fertilization of both approaches, resulting in new theoretical results, new parallel population-based algorithms, and a better understanding of the interrelationships.
Global optimization by continuous GRASP
Optimization Letters
Abstract

Cited by 22 (9 self)
We introduce a novel global optimization method called Continuous GRASP (C-GRASP) which extends Feo and Resende's greedy randomized adaptive search procedure (GRASP) from the domain of discrete optimization to that of continuous global optimization. This stochastic local search method is simple to implement, is widely applicable, and does not make use of derivative information, thus making it a well-suited approach for solving global optimization problems. We illustrate the effectiveness of the procedure on a set of standard test problems as well as two hard global optimization problems.
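GRASP's two-phase structure, randomized construction followed by local improvement, repeated from many starts, carries over to the continuous case. The sketch below keeps only that multi-start, derivative-free skeleton; the published C-GRASP uses a shrinking grid with randomized line searches, which this toy deliberately omits:

```python
import random

def multistart_sketch(f, lo, hi, n_starts=30, n_trials=50, seed=0):
    """Heavily simplified sketch in the GRASP spirit: each start builds
    a random point in the box (construction phase), then keeps random
    coordinate perturbations that improve f (local phase)."""
    rng = random.Random(seed)
    n = len(lo)
    best_x, best_f = None, float("inf")
    for _ in range(n_starts):
        # construction: a random point inside the box
        x = [rng.uniform(lo[i], hi[i]) for i in range(n)]
        fx = f(x)
        # local phase: improving-only random coordinate moves, no derivatives
        for _ in range(n_trials):
            i = rng.randrange(n)
            y = list(x)
            y[i] = min(hi[i], max(lo[i],
                       y[i] + rng.gauss(0.0, 0.1 * (hi[i] - lo[i]))))
            fy = f(y)
            if fy < fx:
                x, fx = y, fy
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# minimize the sphere function on [-1, 1]^2
x_star, f_star = multistart_sketch(lambda v: v[0] ** 2 + v[1] ** 2,
                                   lo=[-1.0, -1.0], hi=[1.0, 1.0])
```

As in the abstract's description, no derivative information is used anywhere; only function comparisons drive the search.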
A Parallel Build-Up Algorithm for Global Energy Minimizations of Molecular Clusters Using Effective Energy Simulated Annealing
Journal of Microscopy, 1993
Abstract

Cited by 19 (1 self)
This work studies the build-up method for the global minimization problem for molecular conformation, especially protein folding. The problem is hard to solve for large molecules using general minimization approaches because of the enormous amount of required computation. We therefore propose a build-up process to systematically "construct" the optimal molecular structures. A prototype algorithm is designed using the anisotropic effective energy simulated annealing method at each build-up stage. The algorithm has been implemented on the Intel iPSC/860 parallel computer, and tested with the Lennard-Jones microcluster conformation problem. The experiments showed that the algorithm was effective for relatively large test problems, and also very suitable for massively parallel computation. In particular, for the 72-atom Lennard-Jones microcluster, the algorithm found a structure whose energy is lower than any found in previous studies.
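The objective in Lennard-Jones microcluster conformation is the total pairwise potential energy of the atom positions. A minimal evaluator of the standard Lennard-Jones form in reduced units is shown below; it is the plain pair potential, not the anisotropic effective-energy variant used in the paper:

```python
def lj_energy(coords, eps=1.0, sigma=1.0):
    """Total Lennard-Jones energy of a cluster of 3-D points:
    sum over pairs of 4*eps*((sigma/r)^12 - (sigma/r)^6)."""
    n = len(coords)
    energy = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            r2 = sum((coords[i][k] - coords[j][k]) ** 2 for k in range(3))
            sr6 = (sigma * sigma / r2) ** 3       # (sigma/r)^6
            energy += 4.0 * eps * (sr6 * sr6 - sr6)
    return energy

# sanity check: two atoms at the pair-equilibrium distance 2^(1/6)*sigma
# sit at the potential minimum, energy -eps
pair = lj_energy([(0.0, 0.0, 0.0), (2.0 ** (1.0 / 6.0), 0.0, 0.0)])
```

The difficulty the abstract refers to comes from the number of local minima of this sum growing very rapidly with cluster size, which is why global methods such as simulated annealing are needed.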
Global Search Methods for Solving Nonlinear Optimization Problems
1997
Abstract

Cited by 15 (1 self)
... these new methods, we develop a prototype, called Novel (Nonlinear Optimization Via External Lead), that solves nonlinear constrained and unconstrained problems in a unified framework. We show experimental results in applying Novel to solve nonlinear optimization problems, including (a) the learning of feedforward neural networks, (b) the design of quadrature-mirror-filter digital filter banks, (c) the satisfiability problem, (d) the maximum satisfiability problem, and (e) the design of multiplierless quadrature-mirror-filter digital filter banks. Our method achieves better solutions than existing methods, or achieves solutions of the same quality but at a lower cost.
Global Optimization for Constrained Nonlinear Programming
2001
Abstract

Cited by 12 (2 self)
In this thesis, we develop constrained simulated annealing (CSA), a global optimization algorithm that asymptotically converges to constrained global minima (CGM_dn) with probability one, for solving discrete constrained nonlinear programming problems (NLPs). The algorithm is based on the necessary and sufficient condition for constrained local minima (CLM_dn) in the theory of discrete constrained optimization using Lagrange multipliers developed in our group. The theory proves the equivalence between the set of discrete saddle points and the set of CLM_dn, leading to the first-order necessary and sufficient condition for CLM_dn. To find