### Table 1: Statistics of objective function values. All networks are fully dense.

1993

### Table 2: Comparison between the original object and the object generated by neural networks.

2004

"... In PAGE 8: ... Three tables are drawn to show the differences between original z-values and estimated z-values using three criteria: sum, average, and standard deviation. Tables 2-4 show the calculated values as a comparison. The result shows that 3D reconstruction using a neural network is more accurate and compact than using the 3rd-order polynomial.... ..."

Cited by 1
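The three comparison criteria named in the snippet (sum, average, and standard deviation) can be sketched in a few lines. The helper name and the choice of applying the criteria to per-point z-value errors are assumptions, since the snippet does not show the exact formulas:

```python
import statistics

def z_error_stats(z_original, z_estimated):
    # Per-point errors between original and network-estimated z-values.
    # (Applying the criteria to errors is an assumption; the snippet
    # only names sum, average, and standard deviation.)
    errors = [zo - ze for zo, ze in zip(z_original, z_estimated)]
    return {
        "sum": sum(errors),
        "average": statistics.fmean(errors),
        "std": statistics.stdev(errors),  # sample standard deviation
    }
```

A smaller error spread for the neural-network reconstruction than for the 3rd-order polynomial would be consistent with the snippet's "more accurate and compact" claim.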

### Table 6 - Convolutional interleaving example for N = 5, D = 2

"... In PAGE 141: ... For testing purposes, one-way payload transfer delay shall be less than 10 + (SxD)/4 ms. The performance points in Table 6 must be met with a BER of 10^-7 at 6 dB margin. The ATU-C and ATU-R shall be connected directly via the specified loop (no home network or phone model present). Table 48 / Proposed European Required Test Loops & Performance Targets for G.... ..."
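For the N = 5, D = 2 case in the caption, a convolutional interleaver can be modeled as N branches where branch i delays symbols through a FIFO of length i*D and a commutator cycles through the branches. A minimal sketch, not taken from the cited standard (the function name and `fill` placeholder are illustrative):

```python
from collections import deque

def convolutional_interleave(symbols, N=5, D=2, fill=None):
    # Branch i is a shift register of length i*D, pre-filled with `fill`.
    regs = [deque([fill] * (i * D)) for i in range(N)]
    out = []
    for t, s in enumerate(symbols):
        i = t % N                      # commutator selects branch i
        regs[i].append(s)              # new symbol enters the branch
        out.append(regs[i].popleft())  # oldest symbol leaves the branch
    return out
```

The matching deinterleaver applies the complementary delays (N-1-i)*D, so every symbol experiences the same total delay end to end.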

### Table 1: Characterization of Convolution

2006

"... In PAGE 17: ...5% 31.2% Table 10: Results of Simulated Annealing * to appear in final version CT and convolution showed the best gains. In CT, rapid successive memory accesses generate a heavy load on the OPN, shuffling operands to and from the data tiles.... ..."

Cited by 1

### Table 1. Generating matrices for the constituent convolutional codes.

1997

"... In PAGE 4: ... Consider a rate 1/4 HCCC formed by a parallel four-state recursive systematic convolutional code with rate 1/2, where the systematic bits of the parallel encoder (as for turbo codes) are not transmitted; an outer four-state nonrecursive convolutional code with rate 1/2; and an inner four-state recursive systematic convolutional code with rate 2/3, joined by two uniform interleavers of length N1 = N and N2 = 2N, where N = 20, 40, 100, 200, and 300. The code generator matrices are shown in Table 1. Using Expression (4), we have obtained the bit-error probability curves shown in Fig.... ..."

Cited by 7
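The snippet's constituent codes are small recursive systematic convolutional (RSC) encoders. The actual generator matrices are in the paper's Table 1 and are not visible here, so the sketch below uses the common four-state (7,5)-octal RSC generator purely as an illustration of a rate-1/2 systematic encoder:

```python
def rsc_encode(bits):
    # Four-state (memory-2) rate-1/2 RSC encoder with feedback
    # polynomial 1 + D + D^2 (7 octal) and feedforward 1 + D^2 (5 octal).
    # These polynomials are an illustrative assumption, not the paper's.
    s1 = s2 = 0                  # two delay elements -> four states
    out = []
    for u in bits:
        a = u ^ s1 ^ s2          # feedback sum entering the register
        p = a ^ s2               # parity bit via the feedforward taps
        out += [u, p]            # systematic bit, then parity bit
        s1, s2 = a, s1           # shift the register
    return out
```

In the snippet's HCCC, the systematic bits of the parallel (turbo-like) constituent are punctured, and the encoders are joined by uniform interleavers of lengths N and 2N.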

### Table 5: Proposal Generator

2003

"... In PAGE 31: ... The symbol p gives the probability of taking some action. The proposal generator is described in Table 5. It is used in both the Bayesian approach and the maximum likelihood approach.... ..."

Cited by 5

### Table 1. Performance of Proposed Predictor

"... In PAGE 5: ...In each set of experiments, we use several thousand randomly generated dynamic decision networks, as indicated in Table 1 and Table 2. These test networks use randomly generated conditional probabilities and utilities.... In PAGE 5: ... The next four sets of experiments consider more realistic networks. It is clear from Table 1 that the proposed bound performed well in these test cases. Several thousand randomly generated Time Critical, POMDP, and Car Sales networks failed to produce a single case in which the formula under-predicted.... ..."