Results 1–10 of 242
Differential Evolution – A simple and efficient adaptive scheme for global optimization over continuous spaces
, 1995
Abstract

Cited by 411 (5 self)
A new heuristic approach for minimizing possibly nonlinear and non-differentiable continuous space functions is presented. By means of an extensive testbed, which includes the De Jong functions, it will be demonstrated that the new method converges faster and with more certainty than Adaptive Simulated Annealing as well as the Annealed Nelder-Mead approach, both of which have a reputation for being very powerful. The new method requires few control variables, is robust, easy to use and lends itself very well to parallel computation. ________________________________________ 1) International Computer Science Institute, 1947 Center Street, Suite 600, Berkeley, CA 94704-1198, Fax: 510-643-7684. Email: storn@icsi.berkeley.edu. On leave from Siemens AG, ZFE T SN 2, Otto-Hahn-Ring 6, D-81739 Muenchen, Germany. Fax: 0114963644577, Email: rainer.storn@zfe.siemens.de. 2) 836 Owl Circle, Vacaville, CA 95687, kprice@solano.community.net. Introduction Problems which involve global optimiz...
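The scheme this abstract describes can be sketched compactly. The Python below follows the classic DE/rand/1/bin mutation-crossover-selection loop; the control settings (population size, differential weight F, crossover rate CR) are common illustrative defaults, not values prescribed by the paper.

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9, generations=200):
    """Minimize f over the box 'bounds' (list of (low, high) pairs) with
    the DE/rand/1/bin scheme: for each target vector, build a trial vector
    from three other population members, then keep whichever is better."""
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # three distinct individuals, all different from the target i
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = random.randrange(dim)  # ensure at least one mutated coordinate
            trial = []
            for j in range(dim):
                if random.random() < CR or j == j_rand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])  # differential mutation
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)                      # clip to the box
                else:
                    v = pop[i][j]                                # inherit from target
                trial.append(v)
            f_trial = f(trial)
            if f_trial <= cost[i]:                               # greedy selection
                pop[i], cost[i] = trial, f_trial
    best = min(range(pop_size), key=lambda k: cost[k])
    return pop[best], cost[best]

# Usage: the 2-D sphere function, one of the De Jong test functions
x, fx = differential_evolution(lambda v: sum(t * t for t in v), [(-5.12, 5.12)] * 2)
```

The greedy one-to-one selection is what keeps the method simple: no ranking or fitness scaling is needed, and each trial vector competes only with its own target.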
Learning Long-Term Dependencies with Gradient Descent is Difficult
 IEEE Transactions on Neural Networks (to appear, special issue on recurrent networks)
Abstract

Cited by 379 (35 self)
Recurrent neural networks can be used to map input sequences to output sequences, such as for recognition, production or prediction problems. However, practical difficulties have been reported in training recurrent neural networks to perform tasks in which the temporal contingencies present in the input/output sequences span long intervals. We show why gradient-based learning algorithms face an increasingly difficult problem as the duration of the dependencies to be captured increases. These results expose a trade-off between efficient learning by gradient descent and latching on information for long periods. Based on an understanding of this problem, alternatives to standard gradient descent are considered.
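The difficulty the abstract describes can be seen in a toy scalar recurrence. For h_t = w * h_{t-1} + x_t, the gradient of h_T with respect to x_1 is w**(T-1): it vanishes for |w| < 1 and explodes for |w| > 1 as T grows. This is only an illustrative linear special case; the paper's analysis covers general nonlinear recurrent networks.

```python
# Gradient of h_T w.r.t. x_1 in the scalar linear recurrence h_t = w*h_{t-1} + x_t.
# The chain rule multiplies one factor of w per time step, giving w**(T-1).
def gradient_through_time(w, T):
    return w ** (T - 1)

# The same dependency length T yields a vanishing gradient for w=0.9
# and an exploding one for w=1.1:
for T in (5, 20, 100):
    print(T, gradient_through_time(0.9, T), gradient_through_time(1.1, T))
```

Either regime makes credit assignment over long intervals unreliable, which is the trade-off the paper formalizes.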
Global Optimization of Statistical Functions with Simulated Annealing
 Journal of Econometrics
, 1994
Abstract

Cited by 281 (2 self)
Many statistical methods rely on numerical optimization to estimate a model’s parameters. Unfortunately, conventional algorithms sometimes fail. Even when they do converge, there is no assurance that they have found the global, rather than a local, optimum. We test a new optimization algorithm, simulated annealing, on four econometric problems and compare it to three common conventional algorithms. Not only can simulated annealing find the global optimum, it is also less likely to fail on difficult functions because it is a very robust algorithm. The promise of simulated annealing is demonstrated on the four econometric problems.
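The core of the algorithm tested here is the Metropolis acceptance rule with a slowly decreasing temperature. The sketch below is a minimal version with a fixed step length and geometric cooling; the variant the paper actually tests (in the style of Corana et al.) additionally adapts step lengths per parameter.

```python
import math
import random

def simulated_annealing(f, x0, step=1.0, t0=1.0, cooling=0.995, iters=5000):
    """Minimize f from starting point x0 (a list of floats).

    Minimal sketch: uniform random proposals of fixed width 'step',
    Metropolis acceptance, geometric cooling t <- cooling * t."""
    x, fx = list(x0), f(x0)
    best, f_best = list(x), fx
    t = t0
    for _ in range(iters):
        y = [xi + random.uniform(-step, step) for xi in x]
        fy = f(y)
        # Always accept downhill moves; accept uphill moves with
        # Boltzmann probability exp(-(fy - fx) / t), which shrinks as t falls.
        if fy <= fx or random.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < f_best:
                best, f_best = list(x), fx
        t *= cooling
    return best, f_best

x, fx = simulated_annealing(lambda v: sum(t * t for t in v), [3.0, -4.0])
```

The occasional acceptance of uphill moves at high temperature is what lets the method escape local optima, which is the robustness property the abstract emphasizes.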
Simulated annealing: Practice versus theory
 Mathl. Comput. Modelling
, 1993
Abstract

Cited by 218 (18 self)
In this paper, "ergodic" is used in a very weak sense, as it is not proposed, theoretically or practically, that all states of the system are actually to be visited.
A Genetic Algorithm for Function Optimization: A Matlab Implementation
, 1996
"... A genetic algorithm implemented in Matlab is ..."
Genetic Algorithms And Very Fast Simulated Reannealing: A Comparison
, 1992
Abstract

Cited by 120 (16 self)
We compare Genetic Algorithms (GA) with a functional search method, Very Fast Simulated Reannealing (VFSR), that not only is efficient in its search strategy, but also is statistically guaranteed to find the function optima. GA has previously been demonstrated to be competitive with other standard Boltzmann-type simulated annealing techniques. Presenting a suite of six standard test functions to GA and VFSR codes from previous studies, without any additional fine-tuning, strongly suggests that VFSR can be expected to be orders of magnitude more efficient than GA.
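The efficiency gap the abstract reports stems largely from the annealing schedules. The schedule usually associated with VFSR anneals each of the D parameters as T_i(k) = T_i(0) * exp(-c_i * k**(1/D)), which retains the statistical convergence guarantee while cooling exponentially faster than the Boltzmann schedule T(k) = T(0) / ln(k). The comparison below uses c_i = 1.0 as an arbitrary illustrative constant.

```python
import math

def vfsr_temp(k, D, t0=1.0, c=1.0):
    """VFSR-style schedule for one parameter in D dimensions."""
    return t0 * math.exp(-c * k ** (1.0 / D))

def boltzmann_temp(k, t0=1.0):
    """Classical Boltzmann annealing schedule (valid for k >= 2)."""
    return t0 / math.log(k)

# After the same number of annealing steps, the VFSR schedule is far colder:
for k in (10, 1000, 100000):
    print(k, vfsr_temp(k, D=2), boltzmann_temp(k))
```

A colder temperature after the same budget means the search commits to promising regions sooner, which is consistent with the orders-of-magnitude speedup claimed here.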
Adaptive simulated annealing (ASA): Lessons learned
 Control and Cybernetics
, 1996
Abstract

Cited by 93 (13 self)
Adaptive simulated annealing (ASA) is a global optimization algorithm based on an associated proof that the parameter space can be sampled much more efficiently than by previous simulated annealing algorithms. The author's ASA code has been publicly available for over two years. During this time the author has volunteered to help people via email, and the feedback obtained has been used to further develop the code.
Global Optimization for Neural Network Training
 IEEE Computer
, 1996
Abstract

Cited by 53 (12 self)
In this paper, we study various supervised learning methods for training feedforward neural networks. In general, such learning can be considered as a nonlinear global optimization problem in which the goal is to minimize a nonlinear error function that spans the space of weights, using heuristic strategies that look for global optima (in contrast to local optima). We survey various global optimization methods suitable for neural-network learning, and propose NOVEL, a novel global optimization method for nonlinear optimization and neural network learning. By combining global and local searches, we show how NOVEL can be used to find a good local minimum in the error space. Our key idea is to use a user-defined trace that pulls a search out of a local minimum without having to restart it from a new starting point. Using five benchmark problems, we compare NOVEL against some of the best global optimization algorithms and demonstrate its superior performance.
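The global/local division of labor described here can be illustrated with a much simpler hybrid than NOVEL itself. The sketch below is NOT the trace-based NOVEL method: it merely pairs a global phase (uniform random start points) with a local phase (coordinate descent with shrinking steps), to show how the two kinds of search complement each other. All names and parameter choices are illustrative.

```python
import random

def hybrid_search(f, bounds, n_starts=20, local_iters=200, seed=None):
    """Global phase: sample random start points in the box 'bounds'.
    Local phase: greedy coordinate descent, halving step sizes whenever
    no single-coordinate move improves. Returns the best point found."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(n_starts):
        # global exploration: a fresh random starting point
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        fx = f(x)
        step = [0.25 * (hi - lo) for lo, hi in bounds]
        # local refinement around that start
        for _ in range(local_iters):
            improved = False
            for j, (lo, hi) in enumerate(bounds):
                for d in (-step[j], step[j]):
                    y = list(x)
                    y[j] = min(max(y[j] + d, lo), hi)
                    fy = f(y)
                    if fy < fx:
                        x, fx, improved = y, fy, True
            if not improved:
                step = [s * 0.5 for s in step]  # refine the search scale
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

x, fx = hybrid_search(lambda v: sum(t * t for t in v), [(-5.0, 5.0)] * 2, seed=0)
```

NOVEL's contribution, per the abstract, is replacing the independent restarts above with a continuous user-defined trace, so a search escapes a local minimum without discarding the information it has accumulated.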
Adaptive Simulated Annealing for Optimization in Signal Processing Applications
, 1999
Abstract

Cited by 50 (30 self)
Many signal processing applications pose optimization problems with multimodal and nonsmooth cost functions. Gradient methods are ineffective in these situations. Adaptive simulated annealing (ASA) offers a viable optimization tool for tackling these difficult nonlinear optimization problems. Three applications, maximum likelihood (ML) joint channel and data estimation, infinite-impulse-response (IIR) filter design, and evaluation of the minimum symbol-error-rate (MSER) decision feedback equalizer (DFE), are used to demonstrate the effectiveness of the ASA. Keywords: simulated annealing, global optimization, blind equalization, IIR filter, decision feedback equalizer. 1 Introduction Optimization problems with multimodal and/or nonsmooth cost functions are commonly encountered in signal processing applications. Conventional gradient-based algorithms are ineffective in these applications due to the problem of local minima or the difficulty in calculating gradients. Optimization method...
Simulated Annealing Algorithms For Continuous Global Optimization
, 2000
Abstract

Cited by 47 (1 self)
INTRODUCTION In this paper we consider Simulated Annealing algorithms (SA in what follows) applied to continuous global optimization problems, i.e. problems of the form f* = min_{x ∈ X} f(x) (1.1), where X ⊆ R^n is a continuous domain, often assumed to be compact, which, combined with the continuity or lower semicontinuity of f, guarantees the existence of the minimum value f*. SA algorithms are based on an analogy with a physical phenomenon: while at high temperatures the molecules in a liquid move freely, if the temperature is slowly decreased the thermal mobility of the molecules is lost and they form a pure crystal, which also corresponds to a state of minimum energy. If the temperature is decreased too quickly (the so-called quenching), a liquid metal instead ends up in a polycrystalline or amorphous state with