### Table 9. Large-step optimization (small steps: simulated annealing; large steps: 2mach/BB)

1995

"... In PAGE 23: ... Overall, of the proposed methods to perform the large step, the best results were obtained for the large-step methods 2mach/BB and 2mach/EL. In Table 9 we present the results when we applied the large-step optimization method using simulated annealing and 2mach/BB; in this case we ran just 10 iterations of the method but let the simulated annealing run for 1000 iterations. Comparing these results with the respective ones in Tables 6, 7 and 8, we can observe that by running fewer iterations of the large-step optimization method and more of the simulated annealing method we did not get better results.... ..."

Cited by 5
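The two-level scheme the snippet describes (a handful of outer "large-step" iterations, each refined by a long simulated-annealing run) can be sketched generically. The paper's 2mach/BB large step is a branch-and-bound construction not reproduced here; the `kick` callback below is a random perturbation standing in for it, and all names (`large_step_optimization`, `neighbor`, `kick`) are illustrative, not from the paper:

```python
import math
import random

def simulated_annealing(x, cost, neighbor, iters=1000, t0=1.0, alpha=0.995):
    """Small-step phase: plain simulated annealing started from x."""
    best, best_c = x, cost(x)
    cur, cur_c, t = x, best_c, t0
    for _ in range(iters):
        cand = neighbor(cur)
        cand_c = cost(cand)
        # accept improvements always, worsenings with Boltzmann probability
        if cand_c < cur_c or random.random() < math.exp((cur_c - cand_c) / t):
            cur, cur_c = cand, cand_c
            if cand_c < best_c:
                best, best_c = cand, cand_c
        t *= alpha
    return best, best_c

def large_step_optimization(x0, cost, neighbor, kick,
                            outer_iters=10, sa_iters=1000):
    """A few large steps, each refined by a long SA run (10/1000 as quoted)."""
    best, best_c = simulated_annealing(x0, cost, neighbor, sa_iters)
    for _ in range(outer_iters):
        cand, cand_c = simulated_annealing(kick(best), cost, neighbor, sa_iters)
        if cand_c < best_c:      # keep the kicked restart only if it improves
            best, best_c = cand, cand_c
    return best, best_c

# hypothetical demo: minimise (x - 3)^2 over the integers
random.seed(0)
best, best_c = large_step_optimization(
    40, lambda x: (x - 3) ** 2,
    lambda x: x + random.choice([-1, 1]),    # small neighbourhood move
    lambda x: x + random.randint(-10, 10),   # placeholder for the 2mach/BB kick
    outer_iters=3, sa_iters=300)
```

A common design choice, used in the outer loop above, is to accept a kicked restart only when its SA-refined cost improves on the incumbent.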

### Table 1. A small training set

1986

"... In PAGE 10: ... To form decision trees for the residual subsets C1, C2, ... Cv. To illustrate the idea, let C be the set of objects in Table 1. Of the 14 objects, 9 are of class P and 5 are of class N, so the information required for classification is Now consider the outlook attribute with values {sunny, overcast, rain}.... In PAGE 12: ... Finally, the collection of case histories will probably include some patients for whom an incorrect diagnosis was made, with consequent errors in the class information provided in the training set. What problems might errors of these kinds pose for the tree-building procedure described earlier? Consider again the small training set in Table 1, and suppose now that attribute outlook of object 1 is incorrectly recorded as overcast. Objects 1 and 3 will then have identical descriptions but belong to different classes, so the attributes become inadequate for this training set.... In PAGE 22: ... The gain ratio criterion selects, from among those attributes with an average-or-better gain, the attribute that maximizes the above ratio. This can be illustrated by returning to the example based on the training set of Table 1. The information gain of the four attributes is given in Section 4 as gain(outlook) = 0.... ..."

Cited by 2605
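The entropy arithmetic quoted from PAGE 10 can be checked directly. For the 14-object training set (9 of class P, 5 of class N), the information required is about 0.940 bits, and the well-known outlook split of that table (sunny = 2P/3N, overcast = 4P/0N, rain = 3P/2N) yields a gain of roughly 0.247. A minimal sketch; the function names are ours:

```python
import math

def entropy(counts):
    """Bits of information needed to classify, from class counts."""
    n = sum(counts)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

def gain(total, partition):
    """Information gain of a split; `partition` lists per-branch class counts."""
    n = sum(total)
    remainder = sum(sum(b) / n * entropy(b) for b in partition)
    return entropy(total) - remainder

i_total = entropy([9, 5])                           # ≈ 0.940 bits
g_outlook = gain([9, 5], [[2, 3], [4, 0], [3, 2]])  # ≈ 0.247
```

The gain ratio criterion mentioned in the PAGE 22 snippet divides this gain by the entropy of the split sizes themselves, penalizing attributes with many small branches.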

### Table 2. Number of steps required by RR, RRL and SR for the measure UR(t)

"... In PAGE 8: ...

| t (h) | RR/RRL (G = 20) | RSD (G = 20) | RR/RRL (G = 40) | RSD (G = 40) |
|---|---|---|---|---|
| 1 | 56 | 66 | 86 | 99 |
| 10 | 323 | 355 | 554 | 594 |
| 100 | 2,234 | 2,612 | 4,187 | 4,823 |
| 1,000 | 2,708 | 2,612 | 5,123 | 4,823 |
| 10,000 | 2,938 | 2,612 | 5,549 | 4,823 |
| 100,000 | 3,157 | 2,612 | 5,957 | 4,823 |

We next compare RRL, RR and SR using the example with the measure UR(t). Table 2 and Figure 4 give the results. For small t, SR is slightly faster than both RR and RRL.... ..."

### TABLE I. Results for Example I, solved up to t = 0.9 with the splitting algorithm, ROWMAP, and VODPK for required tolerances tol = 10^-2, ..., 10^-8 (if results are not given for a certain value of tol, the code did not return a solution because of too-small time steps). Columns: method, log(tol), steps, rej. steps, log(l2-error), min_i {U_i}, cpu time (s)

### Table 1. Required configuration steps, with and without Fresco.

2005

"... In PAGE 7: ... Association between the SLA contract and multiple SLA management orders is maintained by the configurator. To better appreciate the benefits of Fresco to the service personnel, we conclude this section by comparing in Table 1 the number of Web Service requests required to complete representative tasks with and without Fresco. Notice also that whereas command extensions to the underlying system management products are typically outside the user's control, command extensions to Fresco are not, and provide a differentiating ... In PAGE 9: ... The amount of downtime in each calendar month will be totaled to determine any failure to meet the SLA objective. Table 1 documents the monthly refunds or premiums associated with missing or exceeding the SLA standard for each calendar month. In no case shall more than the monthly charge be credited for downtime incurred in a single month.... ..."

Cited by 1

### Table 2: Steps required as contention is allowed to increase.

"... In PAGE 8: ... and verify schedules for meshes of size 4 × 4, 8 × 8, ..., 32 × 32. Table 2 shows the improvement possible as the permitted contention is allowed to increase. For each mesh size, the minimum steps possible are n^2 at c = n/4.... In PAGE 13: ... The performance of the naive algorithm, which does not vary with contention, is shown as a series of strips so that the surface of the Bounded algorithm can be seen clearly. The small size of the 4 × 4 mesh does not permit a collapsible schedule to be generated (see Table 2). Despite this, there is an improvement in performance as contention increases, because the number of synchronization steps required is reduced.... In PAGE 17: ... Note that the communication time is halved going from link contention 1 to 2. This is because, as shown in Table 2, the number of communication steps drops from 128 to 64 for an 8 × 8 mesh. Since the lower bound does not include the overheads of node and link contention, the measured time should not drop below this... ..."

### Table 6. Third equation results: Regressions for desired family size, knowledge of contraceptives, determinants of child supply on modernization and cultural variables.

"... In PAGE 25: ... demand and regulation costs. They will be negatively correlated with infant mortality. The other effects are expected to be ambiguous. Table 6 presents results for Cd, RC and Cn, which in general are in the expected direction. The results show that they are sensitive to the indicators of modernization.... ..."

### Table 4. Number of small steps

1999

"... In PAGE 15: ... The increase in the required number of small steps in an addition chain greatly increases the computing time. Table 4 shows how the computing time increases with the number of small steps while (n) is held constant. It also shows how the computing time increases with (n) while holding the number of small steps... ..."

Cited by 8
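In addition-chain terminology (which the garbled "(n)" in the snippet presumably renders; λ(n) = ⌊log₂ n⌋ is the usual quantity), a step of the chain is "small" when it does not increase λ. Assuming those standard definitions, a sketch that counts small steps in the binary-method chain; the function names are our own:

```python
def binary_chain(n):
    """Addition chain for n by the binary method: double once per bit,
    and add one for each 1-bit after the leading bit."""
    chain = [1]
    for bit in bin(n)[3:]:            # bits after the leading 1
        chain.append(chain[-1] * 2)       # doubling ('big') step
        if bit == "1":
            chain.append(chain[-1] + 1)   # 'small' step: lambda unchanged
    return chain

def small_steps(chain):
    """Count steps where lambda(a) = floor(log2 a) does not increase."""
    lam = lambda a: a.bit_length() - 1
    return sum(1 for a, b in zip(chain, chain[1:]) if lam(b) == lam(a))

c = binary_chain(45)   # [1, 2, 4, 5, 10, 11, 22, 44, 45]: 3 small steps
```

For the binary method the small-step count is just the number of 1-bits of n after the leading one, which illustrates why chains for dense binary representations are longer and, per the snippet, slower to compute with.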

### Table 1. Evolution of the relative error in the L2-norm (parabolic problem). Columns: 10 steps, 10^3 steps, 10^4 steps, 10^5 steps, 10^6 steps

"... In PAGE 14: ... In both examples the reconstruction is much better at the part of the domain where the initial condition is smooth. In Table 1 we present the evolution of the iteration error φ^k − u(0) for the two examples above. Note also that the convergence speed decays exponentially as we iterate.... ..."
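Tracking the relative L2 error of an iterate against a reference state, as the table above does, is straightforward in any fixed-point scheme. The update below is a toy damped iteration that halves the error each sweep, not the paper's reconstruction method; `rel_l2_error` and the factor 0.5 are our assumptions for illustration:

```python
import math

def rel_l2_error(approx, exact):
    """Relative error in the discrete L2 norm."""
    num = math.sqrt(sum((a - e) ** 2 for a, e in zip(approx, exact)))
    return num / math.sqrt(sum(e ** 2 for e in exact))

# toy target and a damped fixed-point iteration that halves the error
u0 = [math.sin(0.1 * i) for i in range(50)]
phi = [0.0] * 50
errors = []
for _ in range(5):
    phi = [p + 0.5 * (u - p) for p, u in zip(phi, u0)]
    errors.append(rel_l2_error(phi, u0))
# errors ≈ [0.5, 0.25, 0.125, 0.0625, 0.03125]
```

Logging this quantity at 10, 10^3, ..., 10^6 steps produces exactly the kind of evolution table shown above.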

### Table 2. Number of steps required to solve all possible permutations for a tree with n = 4 and m = 2. Columns: n11, n10, n01, n00, C steps, S steps, Z steps

2000

"... In PAGE 9: ... For both of these cases the number of success steps is S = 0. Table 2 shows the number of success steps for a system with n = 4 and m = 2. As Table 2 shows, the number of success steps is always 2. The total average number of success steps for any tree of size n with m RTS stations is simply Rule 5 from Table 1, i.... In PAGE 10: ...1) In addition to our example, there are five other possible ways to distribute two stations with an RTS to send in four positions. Table 2 shows all six cases and the number of collision steps Ci associated with each of them. To calculate C(4, 2), we sum each individual permutation and divide this result by six.... In PAGE 11: ... RTS to send (m = 2), we can plot a cost table with all the six permutation cases, as shown in Table 2. The number of idle steps at the root node can be expressed as the number of idle steps for the right subtree, plus the number of idle steps for the left subtree.... ..."

Cited by 33
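The averaging procedure quoted from PAGE 10 (enumerate all C(n, m) placements, count the steps of each, divide by the number of cases) can be simulated. The walk below is the textbook tree-splitting collision-resolution algorithm, which we assume matches the paper's C/S/Z (collision/success/idle) step accounting; for n = 4 and m = 2 it visits all six placements and, as the snippet states, records exactly 2 success steps in every case:

```python
from itertools import combinations

def split(leaves, ready):
    """Textbook tree-splitting walk over a perfect binary tree.
    Returns (collision, success, idle) step counts."""
    k = sum(1 for leaf in leaves if leaf in ready)
    if k == 0:
        return (0, 0, 1)    # idle step
    if k == 1:
        return (0, 1, 0)    # success step
    mid = len(leaves) // 2  # collision: split the set and query both halves
    cl = split(leaves[:mid], ready)
    cr = split(leaves[mid:], ready)
    return (1 + cl[0] + cr[0], cl[1] + cr[1], cl[2] + cr[2])

n, m = 4, 2
cases = list(combinations(range(n), m))              # all 6 placements
results = [split(tuple(range(n)), set(c)) for c in cases]
```

Averaging the collision counts over the six cases gives C(4, 2) = 8/6 ≈ 1.33 under this model; the paper's own Ci values may differ if its RTS protocol deviates from the basic splitting walk.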