### Table 3: Local Search Algorithms

2000

"... In PAGE 34: ...Table 3: Local Search Algorithms The average number of labels placed by our local search algorithms and the corresponding running times are reported in Table3 . The SA and DS columns refer to simulated annealing and diversified neighbourhood search, respectively, applied to a map labelling formulation.... ..."

Cited by 13

### Table 1. Several different strategies

"... In PAGE 3: ...55 algorithm. Table1 indicates several different strategies for our proposed PLGA approach. Notably, MGA_1 uses random initialization and elitist strategy [26]; MGA_2 uses Population Initialization, elitist strategy [26]; MGA_3 uses random initialization, elitist strategy [26], and Local Search; while MGA_4 uses 50% random initialization+50% Population Initialization, Local Search, and elitist strategy [26].... ..."

Cited by 1

### Table 1: Experimental Results for the Local Search Algorithm.

2004

"... In PAGE 12: ... We discuss the impact of this parameter in the next section. Table1 depicts the experimental results on the standard OR Library benchmarks for un- capacitated warehouse location, as well as the M* instances from [21].1 Recall that the M* instances, which capture classes of real UWLPs [21], are very challenging for mathemat- ical programming approaches because they have a large number of suboptimal solutions.... In PAGE 12: ...o note that the algorithm has no prior knowledge of the optimal solution, i.e., it cannot terminate early when the optimum solution is found. As can be seen from Table1 , the algorithm is very robust. It finds optimal solutions with very high frequencies on all benchmarks.... ..."

Cited by 17

### Table 1: Experimental Results for the Local Search Algorithm.

2004

"... In PAGE 10: ...57 NA NA Table 2: Experimental Results of the Genetic Algorithm in [12]. 4 Experimental Results Table1 depicts the experimental results on the standard OR Library benchmarks for unca- pacitated warehouse location, as well as the M* instances generated according to the scheme specified in [12]. Recall that the M* instances, which capture classes of real UWLPs [12], are very challenging for mathematical programming approaches because they have a large number of suboptimal solutions.... ..."

Cited by 17

### Table 2: Initialization algorithm for local search.

1992

Cited by 4

### Table 1. Classification of global optimization methods based on the degree of history dependence.

"... In PAGE 2: ... Finally, the small energy difference between the correct and incorrect minima and the exponential growth of the density of the non-native states with energy impose strict requirements on the accuracy of energy evaluation (less than about 1 kcal/mol)5. Numerous approaches have been used to attack the global optimization problem in protein structure prediction, with some success1-8 ( Table1 ). These methods are initially classified according to whether they are deterministic or not; stochastic methods are further subdivided according to the degree of similarity between conformations generated in consecutive iterations of the search algorithm.... In PAGE 3: ... Most of the MC-like stochastic global optimization strategies employ a three-step iteration: (i) modify the current conformation by means of a random move; (ii) evaluate its energy; (iii) accept or reject the new conformation according to an acceptance criterion. The random moves can be ranked by magnitude of change with respect to the current conformation ( Table1 ). The first group contains algorithms in which the generated conformations do not depend on the previous ones.... ..."

### Table 1: Results for Local Search

2005

"... In PAGE 13: ... The local search algorithm starts at a random binary vector and reaches a local maximum in the binary neighborhood by successively moving to the flrst improving neighbor found. Table1 presents the results that were obtained, where average and the maximum objective function value obtained starting from ten random binary vectors are shown for formulations (2) and (6). Matlabr function fmincon uses a sequential quadratic programming ap- proach for solving medium-scale constrained optimization problems.... ..."