### Table 2. The search efficiency in the neighbor overlay.

2004

"... In PAGE 10: ...it can be resolved in CWBDB7CY hops and reach the destination peer in CWBDB7CYB7BD hops with a high probability. Table 2 shows the number of peers touched, the number of peers foreseen, and the number of messages produced at each hop along the neighbor links. As long as the shortest distance between the query source peer and the pre-destination peer that has a successful local matching is not longer than CWBD hops in the friend overlay plus CWBE hops in the neighbor overlay, this query is satisfied by our algorithm.... ..."

Cited by 4
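The hop-by-hop query propagation this excerpt describes (counting peers touched and messages produced at each hop along neighbor links) can be sketched as a hop-bounded breadth-first query. This is an illustrative reconstruction, not the paper's algorithm; the graph, the `matches` predicate, and the per-hop bookkeeping are assumptions:

```python
def hop_bounded_search(neighbors, source, matches, max_hops):
    """Flood a query over neighbor links, stopping at the first peer with a
    successful local match.  Returns (hops, messages_per_hop) on success,
    or None if no match is found within max_hops.  `neighbors` maps a peer
    to its list of neighbor peers; `matches` tests for a local match."""
    visited = {source}
    frontier = [source]
    messages_per_hop = []
    if matches(source):
        return 0, messages_per_hop
    for hop in range(1, max_hops + 1):
        next_frontier = []
        msgs = 0
        for peer in frontier:
            for nb in neighbors.get(peer, []):
                msgs += 1                      # one query message per link
                if nb not in visited:
                    visited.add(nb)            # this peer is now "touched"
                    if matches(nb):
                        messages_per_hop.append(msgs)
                        return hop, messages_per_hop
                    next_frontier.append(nb)
        messages_per_hop.append(msgs)
        frontier = next_frontier
    return None
```

On the small example graph `{0: [1, 2], 1: [3], 2: [3]}`, a query from peer 0 for peer 3 succeeds at hop 2 after sending 2 then 1 messages.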

### Table 2. Sample profiles of nearest-neighbors search algorithm.

1999

"... In PAGE 4: ... Figure 5 illustrates the graceful computation degradation capability of the nearest-neighbors search algorithm for the sequence FOREMAN. The figure shows the rate-distortion performance of the full search algorithm and the nearest-neighbors search algorithm, for the basic, low-computation, and high-quality profiles outlined in Table 2. Clearly, even for difficult sequences, the high-quality profile can achieve a performance level that is close to that of the full search.... ..."

Cited by 12

### Table 1: Results for Local Search

2005

"... In PAGE 13: ... The local search algorithm starts at a random binary vector and reaches a local maximum in the binary neighborhood by successively moving to the first improving neighbor found. Table 1 presents the results that were obtained, where the average and maximum objective function values obtained starting from ten random binary vectors are shown for formulations (2) and (6). Matlab function fmincon uses a sequential quadratic programming approach for solving medium-scale constrained optimization problems.... ..."
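The first-improvement local search described in this excerpt (random binary start, move to the first improving single-bit-flip neighbor, repeat until a local maximum) can be sketched as follows. This is an illustrative reconstruction, not the paper's code; the objective `f` and the multi-start averaging are assumptions modeled on the ten random starts mentioned above:

```python
import random

def local_search(f, n, rng=random):
    """First-improvement local search over {0,1}^n: start at a random
    binary vector and repeatedly move to the first neighbor (single bit
    flip) that improves the objective f, until no flip improves."""
    x = [rng.randrange(2) for _ in range(n)]
    best = f(x)
    improved = True
    while improved:
        improved = False
        for i in range(n):
            x[i] ^= 1                      # flip bit i
            val = f(x)
            if val > best:                 # first improving neighbor: keep it
                best = val
                improved = True
                break
            x[i] ^= 1                      # otherwise revert the flip
    return x, best

def multi_start(f, n, restarts=10, seed=0):
    """Run the search from `restarts` random starts, as in the excerpt,
    and report the average and maximum objective value reached."""
    rng = random.Random(seed)
    vals = [local_search(f, n, rng)[1] for _ in range(restarts)]
    return sum(vals) / len(vals), max(vals)
```

For a separable objective such as the number of ones, every run climbs to the all-ones vector, so average and maximum coincide; on rugged objectives the two statistics diverge, which is what the paper's Table 1 reports.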

### Table 1. Classification of global optimization methods based on the degree of history dependence.

"... In PAGE 2: ... Finally, the small energy difference between the correct and incorrect minima and the exponential growth of the density of the non-native states with energy impose strict requirements on the accuracy of energy evaluation (less than about 1 kcal/mol)5. Numerous approaches have been used to attack the global optimization problem in protein structure prediction, with some success1-8 (Table 1). These methods are initially classified according to whether they are deterministic or not; stochastic methods are further subdivided according to the degree of similarity between conformations generated in consecutive iterations of the search algorithm.... In PAGE 3: ... Most of the MC-like stochastic global optimization strategies employ a three-step iteration: (i) modify the current conformation by means of a random move; (ii) evaluate its energy; (iii) accept or reject the new conformation according to an acceptance criterion. The random moves can be ranked by magnitude of change with respect to the current conformation (Table 1). The first group contains algorithms in which the generated conformations do not depend on the previous ones.... ..."
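The three-step iteration in this excerpt — (i) random move, (ii) energy evaluation, (iii) accept/reject — can be sketched as a minimal Monte Carlo loop. The standard Metropolis rule is used here as the acceptance criterion, which is an assumption; the surveyed methods differ precisely in this step and in the move set:

```python
import math
import random

def mc_search(energy, move, x0, steps, temperature, seed=0):
    """Minimal MC-like stochastic search: each iteration proposes a random
    move, evaluates its energy, and accepts or rejects it (here with the
    Metropolis criterion).  Tracks and returns the lowest-energy state seen."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    for _ in range(steps):
        y = move(x, rng)                   # (i) random move
        ey = energy(y)                     # (ii) evaluate its energy
        # (iii) accept downhill moves always, uphill moves with
        # Boltzmann probability exp(-dE/T)
        if ey <= e or rng.random() < math.exp(-(ey - e) / temperature):
            x, e = y, ey
            if e < best_e:
                best_x, best_e = x, e
    return best_x, best_e
```

On a toy one-dimensional energy with a single minimum, the loop quickly settles at the minimum; the hard part in protein structure prediction is that the real landscape has exponentially many non-native minima, as the excerpt notes.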

### Table 1: Motion vector prediction is based on the coding modes of the neighboring macroblocks.

"... In PAGE 3: ... For example, if all three neighboring macroblocks have been intra coded, the target macroblock likely belongs to an area with non-motion changes. Therefore, our nearest-neighbors search algorithm employs the prediction method illustrated in Table 1. The table shows that the predicted motion vector is derived from previously coded motion vectors of inter coded macroblocks of the macroblocks shown in Figure 2.... In PAGE 4: ... Thus, the search path consists of an orderly sequence of search centers. Such a path is abandoned when motion vector prediction is very unreliable as indicated in Table 1, and any path with a more extensive range can instead be employed. In this work, the well-known three-step search path appears to be a good alternative.... ..."
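The mode-dependent prediction this excerpt describes can be sketched with the common median rule over the three neighboring macroblocks. This is an illustrative scheme consistent with the excerpt, not the paper's exact per-mode rules (which live in its Table 1); intra-coded neighbors carry no motion vector and are modeled here as `None`:

```python
def predict_mv(left, top, top_right):
    """Median motion-vector prediction from three neighboring macroblocks.
    Intra-coded neighbors (None) contribute a zero vector.  Returns None
    when all three neighbors are intra coded, signalling an unreliable
    prediction where a wider search path (e.g. the three-step search
    mentioned in the excerpt) should be used instead."""
    if left is None and top is None and top_right is None:
        return None                        # unreliable: fall back to a wider search

    def median3(a, b, c):
        return sorted((a, b, c))[1]

    vs = [(0, 0) if v is None else v for v in (left, top, top_right)]
    return (median3(vs[0][0], vs[1][0], vs[2][0]),
            median3(vs[0][1], vs[1][1], vs[2][1]))
```

The component-wise median is robust to one outlier neighbor, which is why it is a common choice for seeding the first search center.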

### Table 1. The best instance based learning parameters found by our genetic algorithm searches.

2002

"... In PAGE 4: ... Lomax had a large number of jobs submitted in June so we only used the first week of data from June and evaluated the parameters using the accuracy of the predictions in this week of data. The best parameters we found are shown in Table 1. The two obvious trends are that the number of nearest neighbors is relatively small and the feature weight for the number of CPUs is relatively high.... ..."

Cited by 2
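The instance-based learner whose parameters the genetic algorithm tunes in this entry — a small number of nearest neighbors plus per-feature weights, with a high weight on the number of CPUs — can be sketched as weighted k-NN prediction. The function names, distance form, and averaging rule are illustrative assumptions, not the paper's exact method:

```python
import math

def weighted_knn_predict(train, query, weights, k):
    """Instance-based prediction: find the k training instances nearest to
    `query` under a feature-weighted Euclidean distance, and average their
    target values.  `train` is a list of (feature_vector, value) pairs;
    `weights` scales each feature's contribution to the distance."""
    def dist(a, b):
        return math.sqrt(sum(w * (x - y) ** 2
                             for w, x, y in zip(weights, a, b)))
    nearest = sorted(train, key=lambda fv: dist(fv[0], query))[:k]
    return sum(v for _, v in nearest) / k
```

Raising one feature's weight (e.g. CPU count, per the reported trend) makes instances that differ in that feature look farther apart, so they are excluded from the neighborhood even when other features match.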

### Table 5: Detailed results for improved curvilinear search algorithm

2006

Cited by 2

### Table 4. Search

"... In PAGE 5: ... However, if there is no difference in relative distances between obstacles when comparing different sized search spaces, the obstacle distribution will appear to be identical for both search spaces, which may affect the accuracy of the complexity index calculation. Table 4. Algorithms Comparison:

| Category | Search Space | Algorithm #1 | Algorithm #2 | Algorithm #3 |
|---|---|---|---|---|
| Easy | 1 | X | X | X |
| Easy | 2 | X | X | X |
| Easy | 3 | X | X | X |
| Moderate | 4 | | | |
| Moderate | 5 | X | X | |
| Moderate | 6 | | | |
| Difficult | 7 | X | | |
| Difficult | 8 | | | |
| Difficult | 9 | | | |
| Very Difficult | 10 | | | |

We also need to improve the turn factor calculation.... ..."