Results 1 – 10 of 35
On Evolution, Search, Optimization, Genetic Algorithms and Martial Arts: Towards Memetic Algorithms
, 1989
Abstract

Cited by 185 (10 self)
Short abstract, isn't it? P.A.C.S. numbers 05.20, 02.50, 87.10 1 Introduction Large Numbers "...the optimal tour displayed (see Figure 6) is the unique optimal tour from among the 10^655 tours that are possible among 318 points and have one arc fixed. Assuming that one could possibly enumerate 10^9 tours per second on a computer it would thus take roughly 10^639 years of computing to establish the optimality of this tour by exhaustive enumeration." This quote shows the real difficulty of a combinatorial optimization problem: the huge number of configurations is the primary difficulty when dealing with one of these problems. The quote belongs to M. W. Padberg and M. Grötschel, Chap. 9, "Polyhedral Computations", from the book The Traveling Salesman Problem: A Guided Tour of Combinatorial Optimization [124]. It is interesting to compare the number of configurations of real-world problems in combinatorial optimization with those large numbers arising in Cosmol...
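The arithmetic behind the quoted figure is easy to check in log space: 10^655 tours at 10^9 tours per second gives 10^646 seconds, and dividing by roughly 3.2 × 10^7 seconds per year lands near 10^639 years. A quick sketch:

```python
import math

log10_tours = 655        # log10 of the number of candidate tours
log10_rate = 9           # log10 of tours enumerated per second
seconds_per_year = 365.25 * 24 * 3600   # ~3.16e7

log10_seconds = log10_tours - log10_rate                  # 646
log10_years = log10_seconds - math.log10(seconds_per_year)
# log10_years is about 638.5, i.e. on the order of 10^639 years
```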
An Overview of Genetic Algorithms: Part 1, Fundamentals
, 1993
Abstract

Cited by 79 (1 self)
this article may be reproduced for commercial purposes.
Modeling Organizational Adaptation as a Simulated Annealing Process
 Sociological Methods and Research
, 1996
Abstract

Cited by 54 (21 self)
Organizations can be characterized as complex systems composed of adaptive and intelligent agents. Organizational adaptation occurs through restructuring and learning. Organizations can be modeled using a dual-level model in which restructuring is modeled as a simulated annealing process and individual learning is modeled using a stochastic learning model and boundedly rational agents. Such a model is presented, and its behavior is illustrated using a virtual experiment where the type of organizational adaptation is varied. Results suggest that the organizational design and performance relationship may be chaotic, despite the simple rules of change. Simple restructuring rules lead to a wide range of emergent organizational structures that increases with individual adaptation. Organizations locate good designs (through chance and slow change) regardless of the agents' intelligence; however, emergent designs depend on adaptability. Design features are not systematically related to performance; rather, small initial differences in design and environment can affect the emergent behavior
Organizational Adaptation
 Annals of Operations Research
, 1998
Abstract

Cited by 35 (20 self)
this document are those of the authors and should not be interpreted as representing the official
Greedy, Prohibition, and Reactive Heuristics for Graph Partitioning
 IEEE Transactions on Computers
, 1998
Abstract

Cited by 29 (6 self)
New heuristic algorithms are proposed for the Graph Partitioning problem. A greedy construction scheme with an appropriate tie-breaking rule (MINMAXGREEDY) produces initial assignments very quickly. For some classes of graphs, independent repetitions of MINMAXGREEDY are sufficient to reproduce solutions found by more complex techniques. When the method is not competitive, the initial assignments are used as starting points for a prohibition-based scheme, where the prohibition is chosen in a randomized and reactive way, with a bias towards the more successful choices in the previous part of the run. The relationship between prohibition-based diversification (Tabu Search) and the variable-depth Kernighan–Lin algorithm is discussed. Detailed experimental results are presented on benchmark suites used in the previous literature, consisting of graphs derived from parametric models (random graphs, geometric graphs, etc.) and of "real-world" graphs of large size. On the first series ...
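For flavor, a generic greedy graph-bisection heuristic in this spirit can be sketched as follows; the tie-breaking and naming here are illustrative assumptions, not the paper's actual MINMAXGREEDY procedure:

```python
import random

def greedy_bisection(adj, seed=0):
    """Greedy balanced bisection: seed both parts, then repeatedly place
    the cheapest (vertex, side) choice, i.e. the one that adds the fewest
    cut edges. Illustrative only -- not the paper's MINMAXGREEDY rule."""
    rng = random.Random(seed)
    nodes = list(adj)
    cap = (len(nodes) + 1) // 2      # balance: each side holds <= ceil(n/2)
    a, b = rng.sample(nodes, 2)
    part = {a: 0, b: 1}
    sizes = [1, 1]
    unassigned = set(nodes) - {a, b}
    while unassigned:
        best = None
        for v in unassigned:
            for side in (0, 1):
                if sizes[side] >= cap:
                    continue
                # neighbors already on the opposite side become cut edges
                cut = sum(1 for u in adj[v] if part.get(u) == 1 - side)
                if best is None or cut < best[0]:
                    best = (cut, v, side)
        _, v, side = best
        part[v] = side
        sizes[side] += 1
        unassigned.discard(v)
    # count each cut edge once (adjacency is assumed symmetric)
    cut = sum(1 for v in adj for u in adj[v] if part[v] < part[u])
    return part, cut
```

On graphs with an obvious bottleneck edge, even this naive scheme tends to find the natural split; the point of the paper's tie-breaking rule is to make such behavior reliable across graph classes.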
SALSA: A New Approach to Scheduling with Timing Constraints
, 1993
Abstract

Cited by 25 (3 self)
This paper describes a new approach to the scheduling problem in high-level synthesis that meets timing constraints while attempting to minimize hardware resource costs. The approach is based on a modified control/data flow graph (CDFG) representation called SALSA. SALSA provides a simple move set that allows alternative schedules to be quickly explored while maintaining timing constraints. It is shown that this move set is complete in that any legal schedule can be reached using some sequence of move applications. In addition, SALSA provides support for scheduling with conditionals, loops, and subroutines. Scheduling with SALSA is performed in two steps. First, an initial schedule that meets timing constraints is generated using a constraint solution algorithm adapted from layout compaction. Second, the schedule is improved using the SALSA move set under control of a simulated annealing algorithm. Results show the scheduler's ability to find good schedules which meet timing constraint...
Differential Evolution for the Optimal Design of Heat Exchangers
Abstract

Cited by 13 (10 self)
This paper presents the application of Differential Evolution (DE) for the optimal design of shell-and-tube heat exchangers. A primary objective in heat exchanger (HE) design is the estimation of the minimum heat transfer area required for a given heat duty, as it governs the overall cost of the heat exchanger. However, a very large number of discrete combinations of the design variables is possible. Hence the design engineer needs an efficient strategy for searching for the global minimum heat exchanger cost. In the present study, for the first time, DE, an improved version of Genetic Algorithms (GA), has been successfully applied, with 161,280 design configurations obtained by varying the design variables: tube outer diameter, tube pitch, tube length, number of tube passes, baffle spacing and baffle cut. Bell's method is used to find the heat transfer area for a given design configuration. For the case study taken up, it is observed that DE, an exceptionally simple evolution strategy, is significantly faster compared to GA and is also much more likely to find a function's true global optimum.
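The DE strategy itself is compact; a generic DE/rand/1/bin sketch on a continuous box-constrained problem is shown below. The actual study uses discrete design variables and Bell's method as the cost evaluator, which this toy example does not attempt; all parameter values are illustrative:

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=1):
    """Classic DE/rand/1/bin: for each target vector, build a mutant from
    three distinct population members, crossover, and keep the trial if it
    is no worse. A generic sketch, not the paper's exact configuration."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)        # force at least one mutated gene
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == jrand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)   # clip to the box constraints
                else:
                    v = pop[i][j]
                trial.append(v)
            fc = f(trial)
            if fc <= cost[i]:                 # greedy selection
                pop[i], cost[i] = trial, fc
    best = min(range(pop_size), key=cost.__getitem__)
    return pop[best], cost[best]
```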
An Evaluation of Parallel Simulated Annealing Strategies with Application to Standard Cell Placement
 IEEE Trans. on Comp. Aid. Design of Int. Cir. and Sys
, 1997
Abstract

Cited by 13 (1 self)
Simulated annealing, a methodology for solving combinatorial optimization problems, is a very computationally expensive algorithm, and as such, numerous researchers have undertaken efforts to parallelize it. In this paper, we investigate three of these parallel simulated annealing strategies when applied to standard cell placement, specifically the TimberWolfSC placement tool. We have examined a parallel moves strategy, as well as two new approaches to parallel cell placement, multiple Markov chains and speculative computation. These algorithms have been implemented in ProperPLACE, our parallel cell placement application, as part of the ProperCAD II project. We have constructed ProperPLACE so that it is portable across a wide range of parallel architectures. Our parallel moves algorithm uses novel approaches to dynamic message sizing, message prioritization, and error control. We show that parallel moves and multiple Markov chains are effective approaches to parallel simulated annealin...
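The multiple-Markov-chain idea is easy to illustrate: run several independent annealing chains and keep the best result. A minimal sequential sketch, with all names and parameters chosen for illustration (in the paper, each chain would run on its own processor, possibly exchanging solutions, which is not modeled here):

```python
import math
import random

def anneal(f, x0, step, T0=10.0, Tmin=1e-3, alpha=0.95, sweeps=50, seed=0):
    """One Markov chain of simulated annealing with Metropolis acceptance."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    T = T0
    while T > Tmin:
        for _ in range(sweeps):
            y = x + rng.gauss(0.0, step)      # random neighbor move
            fy = f(y)
            # accept downhill always, uphill with Boltzmann probability
            if fy < fx or rng.random() < math.exp(-(fy - fx) / T):
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = x, fx
        T *= alpha                            # geometric cooling
    return best, fbest

def multi_chain_anneal(f, x0, step=1.0, chains=4):
    """Multiple-Markov-chain strategy, simulated sequentially here:
    independent chains with different seeds, best result wins."""
    runs = [anneal(f, x0, step, seed=s) for s in range(chains)]
    return min(runs, key=lambda r: r[1])
```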
Simulated Annealing based Wireless Sensor Network Localization
Abstract

Cited by 11 (0 self)
In this paper, we describe a novel localization algorithm for ad hoc wireless sensor networks. Accurate self-organization and localization capability is a highly desirable characteristic of wireless sensor networks. Many researchers have approached the localization problem from different perspectives. A major problem in wireless sensor network localization is the flip ambiguity, which introduces large errors in the location estimates. In this paper, we propose a two-phase localization method based on the simulated annealing technique to address the issue. Simulated annealing is a technique for combinatorial optimization problems and, unlike the gradient search method, it is robust against being trapped in local minima. In this paper we show that our simulated annealing based localization method can be used in ad hoc wireless sensor networks to estimate the location of nodes accurately. In the first phase of our algorithm, simulated annealing is used to obtain an accurate estimate of location. Then a second phase of optimization is performed only on those nodes that are likely to have the flip ambiguity problem. Based on the neighborhood information of nodes, those nodes likely to have been affected by flip ambiguity are identified and moved to the correct position. The proposed scheme is tested using simulation on a sensor network of 200 nodes whose distance measurements are corrupted by Gaussian noise. Simulation results show that the proposed novel scheme gives accurate and consistent location estimates of the nodes, and mitigates errors due to flip ambiguity. The performance of the proposed algorithm is better than that of some well-known schemes such as the DV-hop method and the convex optimization based semidefinite programming method. Index Terms — wireless sensor network, localization, flip ambiguity, simulated annealing
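A single-node version of the first phase can be sketched as follows; the cooling schedule, parameters, and function names are illustrative assumptions (the paper jointly optimizes all unknown nodes and adds a second, flip-ambiguity phase):

```python
import math
import random

def sa_localize(anchors, dists, T0=1.0, Tmin=1e-3, alpha=0.95,
                sweeps=100, seed=2):
    """Simulated annealing fit of one node's (x, y) position to noisy
    distance measurements to known anchors. Illustrative sketch only."""
    rng = random.Random(seed)

    def cost(p):
        # sum of squared residuals between measured and implied distances
        return sum((math.dist(p, a) - d) ** 2 for a, d in zip(anchors, dists))

    pos = (rng.uniform(0.0, 10.0), rng.uniform(0.0, 10.0))
    c = cost(pos)
    best, cbest = pos, c
    T = T0
    while T > Tmin:
        for _ in range(sweeps):
            # proposal step shrinks with temperature: coarse, then fine
            cand = (pos[0] + rng.gauss(0.0, T), pos[1] + rng.gauss(0.0, T))
            cc = cost(cand)
            # Metropolis acceptance: downhill always, uphill with e^(-dE/T)
            if cc < c or rng.random() < math.exp(-(cc - c) / T):
                pos, c = cand, cc
                if c < cbest:
                    best, cbest = pos, c
        T *= alpha
    return best, cbest
```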
On Random Minimization of Functions
 Biological Cybernetics
, 1991
Abstract

Cited by 10 (5 self)
A nondeterministic minimization algorithm recently proposed is analyzed. Some characteristics are analytically derived from the analysis of positive definite quadratic forms. An improvement is proposed and compared with the basic algorithm. Different variants of the basic algorithm are finally compared to a standard Conjugate Gradient minimization algorithm in the computation of the Rayleigh coefficient of a positive definite symmetric matrix. 1. Introduction Function minimization is a widespread need in the scientific community. The major shortcoming of the most widely used minimization algorithms is their sensitivity to local minima. Deterministic methods, which are guaranteed to find the global minimum (like those based on interval analysis, see [7]), have a complexity which is exponential in the dimension of the domain, making them often unusable. Methods like simulated annealing promise a reduced sensitivity (see [2], [3]) to local minima, but their success depends on the choice of an appro...
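As a concrete instance of that benchmark setting, the sketch below minimizes the Rayleigh quotient of a small positive definite symmetric matrix with a simple (1+1)-style random scheme; the step-adaptation rule is an assumption for illustration, not the algorithm analyzed in the paper:

```python
import random

# A positive definite symmetric matrix; its smallest eigenvalue is the
# minimum of the Rayleigh quotient x^T A x / x^T x.
A = [[2.0, 0.5], [0.5, 1.0]]

def rayleigh(x):
    num = sum(x[i] * A[i][j] * x[j] for i in range(2) for j in range(2))
    den = sum(xi * xi for xi in x)
    return num / den if den else float("inf")

def random_minimize(f, x0, sigma=0.5, shrink=0.999, iters=5000, seed=3):
    """(1+1)-style random minimization: perturb the current point with
    Gaussian noise, keep improvements, shrink the step on failures.
    A generic variant of a nondeterministic scheme, chosen for illustration."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(iters):
        cand = [xi + rng.gauss(0.0, sigma) for xi in x]
        fc = f(cand)
        if fc < fx:
            x, fx = cand, fc
        else:
            sigma *= shrink   # adaptive step-size reduction (an assumption)
    return x, fx
```

For this matrix the smallest eigenvalue is (3 − √2)/2 ≈ 0.793, so the returned quotient can be checked against a known target.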