### Table I. Parameter values used in the slow recovery model to simulate motoneurons.

### TABLE 1. Parameters of biexponential fits (eq. 1) to the normalized time courses of recovery from ultra-slow inactivation in K1237E (Fig. 1C). τ1, τ2 and A1, A2 are the time constants (s) and amplitudes of recovery from slow inactivation and from ultra-slow inactivation (IUS), respectively.

2003
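
The biexponential fit described in the caption above can be illustrated with a small sketch. The exact form of eq. 1 is not reproduced in the excerpt; the code below assumes the common saturating form, in which each component rises toward its amplitude with its own time constant. The parameter values shown are illustrative, not taken from the cited paper.

```python
import math

def recovery_fraction(t, a1, tau1, a2, tau2):
    """Biexponential recovery time course (form assumed, not eq. 1 verbatim):
    fraction recovered at time t (s), with amplitudes a1, a2 and
    time constants tau1, tau2 (s) for the slow and ultra-slow
    components, respectively."""
    return a1 * (1.0 - math.exp(-t / tau1)) + a2 * (1.0 - math.exp(-t / tau2))

# Illustrative (made-up) parameters: a fast component recovering over
# ~0.1 s and an ultra-slow component recovering over ~30 s.
print(recovery_fraction(1.0, a1=0.6, tau1=0.1, a2=0.4, tau2=30.0))
```

With a1 + a2 = 1 the curve saturates at full recovery; fitting such a normalized trace yields the τ and A values the table reports.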

### Table 1. Summary of the modeling assumptions. The TCP algorithms are Connection Establishment (CE), Initial Slow Start (ISS), Congestion Avoidance (CA), and Fast Recovery (FRC). Exp. indicates that exponential backoff is employed. (Column headers: Model, Length, TCP algorithms, b.)

2004

"... In PAGE 3: ... 5. ASSUMPTIONS AND MODEL VALIDATION The modeling assumptions for the surveyed models are summarized in Table 1. They are classified in three categories [3].... In PAGE 3: ...arized in Table 1. They are classified in three categories [3]. Data Transfer length and congestion control algorithms affect TCP performance. Table 1 shows that neither slow start after TO losses nor fast retransmit are considered, although TO losses are common [7]. ISS denotes the initial slow start performed after connection establishment.... ..."

Cited by 5

### Table A.1 Congestion Control Algorithm Example. This table shows how the four basic congestion control algorithms (slow start, congestion avoidance, fast retransmit, and fast recovery) work as defined by Jacobson [JK88].
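
The interplay of the four algorithms named in the caption above can be sketched as a toy per-round congestion-window trace. This is a deliberately simplified sketch, not a faithful Reno implementation: loss detection, ACK clocking, and timeouts are abstracted into a set of "loss rounds", and fast retransmit/fast recovery are approximated by halving the window. All parameter values are illustrative.

```python
def tcp_cwnd_trace(rounds, ssthresh=8, loss_rounds=()):
    """Toy per-RTT congestion window trace (in segments):
    exponential growth below ssthresh (slow start), linear growth
    above it (congestion avoidance). On a loss round, fast
    retransmit/fast recovery is approximated by halving cwnd and
    ssthresh and resuming without a new slow start."""
    cwnd = 1
    trace = []
    for r in range(rounds):
        if r in loss_rounds:
            ssthresh = max(cwnd // 2, 2)
            cwnd = ssthresh          # fast recovery: resume at half window
        elif cwnd < ssthresh:
            cwnd *= 2                # slow start: double per round
        else:
            cwnd += 1                # congestion avoidance: +1 per round
        trace.append(cwnd)
    return trace

# One loss in round 6: exponential rise, linear climb, halving, linear climb.
print(tcp_cwnd_trace(10, ssthresh=8, loss_rounds={6}))
```

The resulting sawtooth (grow, halve on loss, grow again) is the qualitative behavior the cited table walks through step by step.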

### Table III presents the results for low TCP Reno background traffic. The table reports the mean throughput for all 2^k·r = 80 experiments and the effects q of all factors and their interactions on the (mean) throughput. We can compute the average throughput y, e.g., for the configuration where all the algorithms in slow-start and congestion avoidance are turned on and all the algorithms in recovery are turned off as follows:

### Table VI presents the results for low TCP Reno background traffic and for low TCP Vegas background traffic. The table reports the mean throughput for all 2^k·r = 400 experiments and the effects q of all factors and their interactions on the (mean) throughput. We can compute the average throughput y, for example, for the configuration where all the algorithms in slow-start and congestion avoidance are turned on and all the algorithms in recovery are turned off, and TCP Reno is used for background traffic as follows:

2000

Cited by 32
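
The effect computation described in the two captions above follows the standard sign-table method for a 2^k·r full factorial design: each factor is coded as ±1, the grand mean q0 and per-factor effects q_i are averages of signed responses, and the throughput of any configuration is recovered as q0 plus the signed sum of effects. The sketch below uses a hypothetical two-factor example with made-up numbers, not the surveyed papers' data, and omits interaction effects.

```python
from itertools import product

def factorial_effects(response):
    """Main-effect computation for a 2^k full factorial design.
    `response` maps each configuration -- a tuple of k factor levels
    in {-1, +1} -- to its mean measured throughput (already averaged
    over the r replications). Returns the grand mean q0 and the main
    effect q_i of each factor."""
    configs = list(response)
    k = len(configs[0])
    n = len(configs)                                   # n = 2^k
    q0 = sum(response[c] for c in configs) / n
    effects = [sum(c[i] * response[c] for c in configs) / n
               for i in range(k)]
    return q0, effects

# Hypothetical data: factor A shifts throughput by +/-10, factor B by
# +/-2, around a mean of 50 (units arbitrary).
resp = {c: 50 + 10 * c[0] + 2 * c[1] for c in product((-1, 1), repeat=2)}
q0, (qa, qb) = factorial_effects(resp)
print(q0, qa, qb)
```

Given the effects, the mean throughput of a particular on/off configuration is y = q0 + Σ x_i·q_i with x_i = ±1, which is the "compute the average throughput y ... as follows" step the captions refer to.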


### Table 4: The throughput, in packets per second, achieved by

"... In PAGE 6: ... Intermittent noise is modeled as a given probability that each packet (regardless of size) is not received cleanly at its intended destination. Table 4 shows the resulting throughput. For the original RTS-CTS-DATA exchange, the dramatic decrease in throughput as the noise level increases is due to the slow recovery at the TCP layer.... ..."

### Table 1: Recovery from an initial expenditure contraction

"... In PAGE 13: ... 6. Results For purely descriptive purposes, Table 1 gives household recovery times following a drop in measured expenditure. We chose all households who had a decline in their real expenditure between the first two years of the surveys and categorized these households according to the time ... In PAGE 14: ... For the second type the shock appears to have been more devastating, putting them on a declining income path possibly leading to chronic poverty. That interpretation is questionable, however, since there are other ways one might explain Table 1. Possibly the households that had not recovered experienced other shocks in the intervening period.... In PAGE 14: ... Or the shock may have been transient, but the recursion process is linear with a slow speed of adjustment due to sizable lagged effects of past incomes on current incomes. For these reasons, one cannot conclude from Table 1 that short-lived shocks have long-lived impacts. We need to use our model of the dynamics to see if the structural process generating consumption and income is consistent with the type of non-linearity whereby sufficiently large shocks can create long-term poverty.... ..."