Results 1–10 of 58
Global Optimization of Statistical Functions with Simulated Annealing
 Journal of Econometrics
, 1994
Abstract
Cited by 198 (2 self)
Many statistical methods rely on numerical optimization to estimate a model’s parameters. Unfortunately, conventional algorithms sometimes fail. Even when they do converge, there is no assurance that they have found the global, rather than a local, optimum. We test a new optimization algorithm, simulated annealing, on four econometric problems and compare it to three common conventional algorithms. Not only can simulated annealing find the global optimum, it is also less likely to fail on difficult functions because it is a very robust algorithm. The promise of simulated annealing is demonstrated on the four econometric problems.
On Evolution, Search, Optimization, Genetic Algorithms and Martial Arts: Towards Memetic Algorithms
, 1989
Abstract
Cited by 190 (10 self)
Short abstract, isn't it? P.A.C.S. numbers 05.20, 02.50, 87.10. 1 Introduction: Large Numbers. "...the optimal tour displayed (see Figure 6) is the possible unique tour having one arc fixed from among 10^655 tours that are possible among 318 points and have one arc fixed. Assuming that one could possibly enumerate 10^9 tours per second on a computer it would thus take roughly 10^639 years of computing to establish the optimality of this tour by exhaustive enumeration." This quote shows the real difficulty of a combinatorial optimization problem. The huge number of configurations is the primary difficulty when dealing with one of these problems. The quote belongs to M. W. Padberg and M. Grötschel, Chap. 9, "Polyhedral computations", from the book The Traveling Salesman Problem: A Guided Tour of Combinatorial Optimization [124]. It is interesting to compare the number of configurations of real-world problems in combinatorial optimization with those large numbers arising in Cosmol...
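The 10^639-years figure in the quote can be sanity-checked with log10 arithmetic: 10^655 tours at 10^9 tours per second is 10^646 seconds, and a year is roughly 3.16 × 10^7 seconds, so the exponent drops by about 7.5. A quick sketch (the seconds-per-year constant is an approximation):

```python
import math

# Check the quoted estimate: enumerate 10^655 tours at 10^9 tours/second.
log_tours = 655             # log10 of the number of tours
log_rate = 9                # log10 of tours enumerated per second
seconds_per_year = 3.156e7  # approximate seconds in a year

log_seconds = log_tours - log_rate                      # 10^646 seconds
log_years = log_seconds - math.log10(seconds_per_year)  # about 638.5
# So exhaustive enumeration takes on the order of 10^639 years, as quoted.
```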
Synthesis of high-performance analog circuits in ASTRX/OBLX
 IEEE Trans. CAD
, 1996
Abstract
Cited by 58 (4 self)
We present a new synthesis strategy that can fully automate the path from an analog circuit topology and performance specifications to a sized circuit schematic. This strategy relies on asymptotic waveform evaluation to predict circuit performance and simulated annealing to solve a novel unconstrained optimization formulation of the circuit synthesis problem. We have implemented this strategy in a pair of tools called ASTRX and OBLX. To show the generality of our new approach, we have used this system to resynthesize essentially all the analog synthesis benchmarks published in the past decade; ASTRX/OBLX has resynthesized circuits in an afternoon that, for some prior approaches, had required months. To show the viability of the approach on difficult circuits, we have resynthesized a recently published (and patented) high-performance operational amplifier; ASTRX/OBLX achieved performance comparable to the expert manual design. And finally, to test the limits of the approach on industrial-sized problems, we have synthesized the component cells of a pipelined A/D converter; ASTRX/OBLX successfully generated cells 2-3x more complex than those published previously.
Filter Pattern Search Algorithms for Mixed Variable Constrained Optimization Problems
 SIAM Journal on Optimization
, 2004
Abstract
Cited by 35 (7 self)
A new class of algorithms for solving nonlinearly constrained mixed variable optimization problems is presented. This class combines and extends the Audet-Dennis Generalized Pattern Search (GPS) algorithms for bound constrained mixed variable optimization, and their GPS-filter algorithms for general nonlinear constraints. In generalizing existing algorithms, new theoretical convergence results are presented that reduce seamlessly to existing results for more specific classes of problems. While no local continuity or smoothness assumptions are required to apply the algorithm, a hierarchy of theoretical convergence results based on the Clarke calculus is given, in which local smoothness dictates what can be proved about certain limit points generated by the algorithm. To demonstrate the usefulness of the algorithm, it is applied to the design of a load-bearing thermal insulation system. We believe this is the first algorithm with provable convergence results to directly target this class of problems.
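The core pattern-search move is easy to state: poll a fixed set of directions around the incumbent and shrink the step only when no poll point improves. A minimal sketch in that spirit (illustrative only; the Audet-Dennis GPS-filter method adds meshes, filters, and mixed discrete variables not reproduced here):

```python
# Minimal pattern-search sketch: poll the 2n coordinate directions around
# the incumbent; refine the step only after an unsuccessful poll.
def pattern_search(f, x0, step=1.0, tol=1e-6, max_iter=10_000):
    x, fx = list(x0), f(x0)
    n = len(x0)
    for _ in range(max_iter):
        if step < tol:
            break
        improved = False
        for i in range(n):
            for sign in (1.0, -1.0):
                y = list(x)
                y[i] += sign * step
                fy = f(y)
                if fy < fx:       # accept any simple decrease
                    x, fx, improved = y, fy, True
        if not improved:
            step *= 0.5           # unsuccessful poll: refine the step
    return x, fx

# Usage: a smooth function with minimizer (1, -2); no derivatives needed.
xs, fxs = pattern_search(lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2, [5.0, 5.0])
```

Note the method uses only function comparisons, which is why such algorithms tolerate nonsmooth objectives and admit Clarke-calculus convergence analyses.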
Simulated Annealing Algorithms For Continuous Global Optimization
, 2000
Abstract
Cited by 33 (1 self)
In this paper we consider Simulated Annealing algorithms (SA in what follows) applied to continuous global optimization problems, i.e. problems of the form f* = min_{x in X} f(x), where X ⊆ R^n is a continuous domain, often assumed to be compact, which, combined with the continuity or lower semicontinuity of f, guarantees the existence of the minimum value f*. SA algorithms are based on an analogy with a physical phenomenon: while at high temperatures the molecules in a liquid move freely, if the temperature is slowly decreased the thermal mobility of the molecules is lost and they form a pure crystal, which also corresponds to a state of minimum energy. If the temperature is decreased too quickly (the so-called quenching), a liquid metal instead ends up in a polycrystalline or amorphous state with ...
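The analogy translates directly into the Metropolis acceptance rule with a decreasing temperature. A minimal SA sketch for a box-constrained problem (the Gaussian proposal and geometric cooling schedule here are illustrative assumptions, not the schedules analyzed in the paper):

```python
import math
import random

# Minimal simulated-annealing sketch for f* = min_{x in X} f(x) on a box
# X = [lo, hi]^n.
def simulated_annealing(f, x0, lo, hi, t0=1.0, cooling=0.999, steps=20_000, seed=0):
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    best, fbest = list(x), fx
    t = t0
    for _ in range(steps):
        # Propose a random neighbor, clipped to the box so it stays feasible.
        y = [min(hi, max(lo, xi + rng.gauss(0.0, 0.5))) for xi in x]
        fy = f(y)
        # Metropolis rule: always accept improvements; accept uphill moves
        # with probability exp(-(fy - fx) / t), which shrinks as t cools.
        if fy <= fx or rng.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = list(x), fx
        t *= cooling  # slow geometric cooling; fast cooling is "quenching"
    return best, fbest

# Usage on a standard multimodal test function (2-D Rastrigin, global
# minimum 0 at the origin, many local minima).
rastrigin = lambda v: sum(vi * vi - 10.0 * math.cos(2.0 * math.pi * vi) + 10.0 for vi in v)
xb, fb = simulated_annealing(rastrigin, [4.0, -3.0], -5.12, 5.12)
```

The uphill acceptances early on are what let the chain escape local minima; as t approaches zero the rule degenerates into pure descent.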
Functional Stability Analysis Of Numerical Algorithms
, 1990
Abstract
Cited by 28 (0 self)
Contents: Table of Contents (v); List of Tables (x); List of Figures (xi)
1. Introduction (1)
  1.1 Detecting Instability in Numerical Algorithms (1)
  1.2 Overview of Functional Stability Analysis (2)
  1.3 Results (4)
  1.4 Organization (5)
2. Theoretical Background (7)
  2.1 Problems and Conditioning (8)
    2.1.1 Definitions (8)
    2.1.2 Problems and Conditioning (9)
    2.1.3 Alternative Treatments and Descriptions (12)
  2.2 Approximations and Stability (12)
    2.2.1 Definitions ...
Global optimization by continuous GRASP
 Optimization Letters
Abstract
Cited by 24 (9 self)
We introduce a novel global optimization method called Continuous GRASP (C-GRASP), which extends Feo and Resende's greedy randomized adaptive search procedure (GRASP) from the domain of discrete optimization to that of continuous global optimization. This stochastic local search method is simple to implement, is widely applicable, and does not make use of derivative information, thus making it a well-suited approach for solving global optimization problems. We illustrate the effectiveness of the procedure on a set of standard test problems as well as two hard global optimization problems.
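GRASP alternates a greedy-randomized construction with local improvement; the continuous extension does both over a coordinate grid that is shrunk to refine the search. A heavily simplified sketch of that idea (an assumption-laden reading, not the authors' exact procedure):

```python
import random

# Simplified C-GRASP-style sketch: greedy-randomized construction via
# coordinate line searches on a grid, a derivative-free local phase over
# grid neighbors, and grid refinement, restarted several times.
def c_grasp(f, lo, hi, n, h=0.5, h_min=1e-3, restarts=5, seed=0):
    rng = random.Random(seed)
    best, fbest = None, float("inf")
    for _ in range(restarts):
        x = [rng.uniform(lo, hi) for _ in range(n)]
        grid = h
        while grid >= h_min:
            # Construction: line search each coordinate over the grid, then
            # pick uniformly from the better half of the candidates (the
            # restricted candidate list) rather than always the greedy best.
            for i in rng.sample(range(n), n):
                cands = []
                v = lo
                while v <= hi:
                    y = list(x)
                    y[i] = v
                    cands.append((f(y), v))
                    v += grid
                cands.sort()
                x[i] = rng.choice(cands[: max(1, len(cands) // 2)])[1]
            # Local phase: walk to improving grid neighbors until none exist.
            improved = True
            while improved:
                improved = False
                for i in range(n):
                    for s in (-grid, grid):
                        y = list(x)
                        y[i] = min(hi, max(lo, y[i] + s))
                        if f(y) < f(x):
                            x, improved = y, True
            if f(x) < fbest:
                best, fbest = list(x), f(x)
            grid /= 2.0  # refine the grid
    return best, fbest

# Usage: a smooth bowl with minimizer (0.3, 0.3) inside the box [-1, 1]^2.
xg, fg = c_grasp(lambda v: sum((vi - 0.3) ** 2 for vi in v), -1.0, 1.0, 2)
```

Everything here is function comparisons on grid points, which is what makes the approach derivative-free.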
Massively Parallel Simulated Annealing and its Relation to Evolutionary Algorithms
 Evolutionary Computation
, 1994
Abstract
Cited by 22 (2 self)
Simulated annealing and single-trial versions of evolution strategies possess a close relationship when they are designed for optimization over continuous variables. Analytical investigations of their differences and similarities lead to a cross-fertilization of both approaches, resulting in new theoretical results, new parallel population-based algorithms, and a better understanding of the interrelationships.
A Parallel Build-Up Algorithm for Global Energy Minimizations of Molecular Clusters Using Effective Energy Simulated Annealing
 Journal of Microscopy
, 1993
Abstract
Cited by 21 (1 self)
This work studies the build-up method for the global minimization problem for molecular conformation, especially protein folding. The problem is hard to solve for large molecules using general minimization approaches because of the enormous amount of required computation. We therefore propose a build-up process to systematically "construct" the optimal molecular structures. A prototype algorithm is designed using the anisotropic effective energy simulated annealing method at each build-up stage. The algorithm has been implemented on the Intel iPSC/860 parallel computer, and tested with the Lennard-Jones microcluster conformation problem. The experiments showed that the algorithm was effective for relatively large test problems, and also very suitable for massively parallel computation. In particular, for the 72-atom Lennard-Jones microcluster, the algorithm found a structure whose energy is lower than any others found in previous studies.
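The objective in the Lennard-Jones microcluster problem is the total pairwise potential energy of the atom positions; in reduced units (epsilon = sigma = 1) the pair term is 4[(1/r)^12 - (1/r)^6]. A sketch of the energy function alone (the paper's build-up/annealing minimizer is not reproduced here):

```python
def lj_energy(coords):
    """Total Lennard-Jones energy of a cluster, coords = [(x, y, z), ...],
    in reduced units where the pair term is 4[(1/r)^12 - (1/r)^6]."""
    e = 0.0
    for i in range(len(coords)):
        for j in range(i + 1, len(coords)):
            r2 = sum((a - b) ** 2 for a, b in zip(coords[i], coords[j]))
            inv6 = 1.0 / r2 ** 3             # (1/r)^6
            e += 4.0 * (inv6 * inv6 - inv6)  # 4[(1/r)^12 - (1/r)^6]
    return e

# Sanity check: two atoms at the pair-equilibrium distance r = 2^(1/6)
# have energy -1 in these units.
pair = [(0.0, 0.0, 0.0), (2.0 ** (1.0 / 6.0), 0.0, 0.0)]
```

The O(n^2) pair sum, evaluated at every annealing step, is what makes the computation so heavy for large clusters and motivates both the build-up decomposition and the parallel implementation.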
Global Search Methods For Solving Nonlinear Optimization Problems
, 1997
Abstract
Cited by 16 (1 self)
... these new methods, we develop a prototype, called Novel (Nonlinear Optimization Via External Lead), that solves nonlinear constrained and unconstrained problems in a unified framework. We show experimental results in applying Novel to solve nonlinear optimization problems, including (a) the learning of feed-forward neural networks, (b) the design of quadrature-mirror-filter digital filter banks, (c) the satisfiability problem, (d) the maximum satisfiability problem, and (e) the design of multiplierless quadrature-mirror-filter digital filter banks. Our method achieves better solutions than existing methods, or achieves solutions of the same quality but at a lower cost.