### Table 2. Multi-step approximation error

1998

"... In PAGE 10: ....2. Results of experiments. Results of experiments for the one-step and multi-step approximation errors are shown in Table 1 and Table 2 respectively. For all cases, the errors induced by the LL method are much smaller than the errors induced by the Euler method.... ..."

Cited by 6
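The LL (local linearization) scheme itself is not reproduced in this excerpt, but the Euler baseline it is compared against can be sketched in a few lines. The test problem dx/dt = -x and the step size below are illustrative choices of ours, not taken from the paper:

```python
import math

def euler_step(f, x, t, h):
    """One forward-Euler step for dx/dt = f(t, x)."""
    return x + h * f(t, x)

# Illustrative test problem (our choice, not from the paper): dx/dt = -x, x(0) = 1.
f = lambda t, x: -x
h = 0.1
x = euler_step(f, 1.0, 0.0, h)          # 1.0 + 0.1 * (-1.0) = 0.9
exact = math.exp(-h)
one_step_error = abs(x - exact)          # O(h^2) local truncation error
```

Measuring this one-step error against the exact solution, and then the accumulated error over many steps, is the kind of comparison the cited Table 1 and Table 2 report.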

### Table 2: Multi-step forecasting example (T = 2)

"... In PAGE 13: ... The main tuning parameter is the forecast horizon, T, which dictates how far in the future to forecast. To find the maximum likelihood measurement sequence, the algorithm must try all possible combinations of future measurement sequences (see the case for the forecast horizon T = 2 in Table 2), and find the measurement sequence that maximizes the likelihood. Recall that q indicates all the times that a target is detected, and r denotes the times that it is not detected.... ..."
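The exhaustive search over detect/miss sequences described in the excerpt can be sketched as follows. The likelihood function here is a hypothetical stand-in, since the paper's target-tracking likelihood is not given in this excerpt:

```python
from itertools import product

def best_measurement_sequence(likelihood, T):
    """Score all 2^T future detect/miss sequences (1 = detected at that
    time, i.e. a time in q; 0 = not detected, a time in r) and return the
    sequence that maximizes the supplied likelihood function."""
    return max(product([0, 1], repeat=T), key=likelihood)

# Toy likelihood (our stand-in, not the paper's model): the target is
# detected independently with probability 0.8 at each future time.
def toy_likelihood(seq, p_detect=0.8):
    out = 1.0
    for z in seq:
        out *= p_detect if z == 1 else (1 - p_detect)
    return out

best = best_measurement_sequence(toy_likelihood, T=2)  # (1, 1) under this toy model
```

Because the search is exhaustive, its cost grows as 2^T, which is why the forecast horizon T is the main tuning parameter.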

### Table 11. Option values for the real options with respect to a multi-strategy project.

### TABLE I: Description of multi-strategy simulated annealing (DRASTIC STEP / MODEST STEP)

1999

### Table 4: General form of multi-step algorithms. x: non-zero coefficient

"... In PAGE 11: ... Appendix A: Multi-step scheme coefficients In this study, three families of schemes are used for the multi-step particle path integration algorithms: Adams-Bashforth, Adams-Moulton, and backwards differentiation. The general forms for the coefficients are shown in Table 4 and the specific coefficients for the schemes are presented in Tables 5-7. Appendix B: Non-constant timestep algorithms Most of the algorithms which we have discussed for constant timesteps can easily be extended to non-constant timesteps.... ..."
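As a rough illustration of one of the families named above, here is a two-step Adams-Bashforth integrator with its standard coefficients (3/2 and -1/2). The Euler startup step and the test problem are our own illustrative choices, not taken from the paper's Table 4:

```python
import math

def adams_bashforth2(f, x0, t0, h, n_steps):
    """Two-step Adams-Bashforth for dx/dt = f(t, x):
    x_{n+1} = x_n + h * (3/2 * f_n - 1/2 * f_{n-1}).
    A multi-step scheme needs a startup history, supplied here by one
    forward-Euler step."""
    ts = [t0, t0 + h]
    xs = [x0, x0 + h * f(t0, x0)]  # Euler startup
    for n in range(1, n_steps):
        fn = f(ts[n], xs[n])
        fnm1 = f(ts[n - 1], xs[n - 1])
        xs.append(xs[n] + h * (1.5 * fn - 0.5 * fnm1))
        ts.append(ts[n] + h)
    return ts, xs

# Usage sketch (our choice of problem): dx/dt = -x, x(0) = 1, stepped to t = 1.
ts, xs = adams_bashforth2(lambda t, x: -x, 1.0, 0.0, 0.1, 10)
err = abs(xs[-1] - math.exp(-1.0))  # second-order accurate, so err << h
```

Swapping in the other tabulated coefficient sets would give the Adams-Moulton or backwards-differentiation variants the excerpt mentions, at the cost of an implicit solve per step.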

### Table 4. Comparison of Multi-Step Linear Connections between Groups on Post-Interview

in Using Domino and Relational Causality to Analyze Ecosystems: Realizing What Goes Around Comes Around

"... In PAGE 18: ... Multi-Step Linear Connections: The complexity level of the connections made by students on the post-interview shows a clear impact of intervention condition. Table 4 shows a comparison of the multi-step linear connections made between groups from the pre- to post-interviews. While the AO group had the highest gain in two-step linear connections (AO = 17; CM = 7; CON = 3), the CM group gained the most in three-step (AO = 2; CM = 7; CON = 1) and four-step (AO = 1; CM = 2; CON = 0) connections.... ..."


### Table 6: One-step RBF algorithm compared to multi-step MSA. (Based on reduced-parameter set.)

1998

"... In PAGE 16: ... Table 6 shows the performance of the algorithms for a given SIL misclassification cost. For comparison purposes, the results of the 2-step RBF ensemble algorithms are also provided.... ..."

Cited by 7


### Table 5.1 SSP multi-step methods (2.14)

2001

Cited by 1