### Table 4. Comparison of decoder performance for lossy coding (1 tile, SNR progressive, 1 layer, 5 decomposition levels)

"... In PAGE 4: ... Thus, one might expect the coding performance of the JasPer decoder to be worse than that of the VM decoder. Table 4 gives the results for several pairs of test images and bit rates. Examining the numbers, we observe that both decoders yield comparable results. ... ..."

### Table 1: PSNR and MSE, with respect to the original image, of the noisy image and of the noisy image filtered with the standard and the conditional median filter.
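The PSNR and MSE figures reported in this entry are standard image-quality metrics. A minimal sketch of both, over flattened 8-bit pixel sequences (the function names are illustrative, not from the paper):

```python
import math

def mse(original, processed):
    """Mean squared error between two equal-length pixel sequences."""
    return sum((a - b) ** 2 for a, b in zip(original, processed)) / len(original)

def psnr(original, processed, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means closer to the original."""
    err = mse(original, processed)
    return float("inf") if err == 0 else 10.0 * math.log10(peak ** 2 / err)
```

A filter that removes noise should yield a lower MSE (and higher PSNR) against the original than the unfiltered noisy image does.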

### Table 3 Misclassified pixels in noisy image

2005

### Table 5. Effects of the JBIG progressive encoding method

"... In PAGE 7: ...4 Effects of progressive coding. Additional test data sets consisting of lower-resolution images, 150 dpi, 75 dpi, and 38 dpi, generated by the JBIG progressive encoding method were processed with the JBIG skew estimation algorithm. Table 5 summarizes their effects on the performance of the algorithm, and Fig. 15 shows the confidence intervals for MSE. ... ..."

### Table II: List of errors and their effects in a noisy channel.

### Table 2. Simulated error rate of 0.00005 for an ECC using a noisy channel. The design uses a [3,1] Hamming code, flips one qubit, and has generalized amplitude damping noise with a 1% chance of damping.
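The [3,1] Hamming code in this entry is the three-bit repetition code: it corrects any single flip by majority vote. The quantum simulation itself (with amplitude damping) is beyond a snippet, but its classical analogue can be sketched as follows; the function names are mine:

```python
def encode(bit):
    """[3,1] repetition code: one data bit becomes three."""
    return [bit, bit, bit]

def flip(codeword, i):
    """Inject a single bit-flip error at position i."""
    out = list(codeword)
    out[i] ^= 1
    return out

def decode(codeword):
    """Majority vote recovers the data bit despite any single flip."""
    return 1 if sum(codeword) >= 2 else 0
```

Any one flip leaves two of the three bits correct, so the vote always recovers the original bit; two or more flips defeat the code.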

2004

Cited by 3

### Table 2: MAEs for noisy and enhanced images (Lena)

"... In PAGE 4: ... The windows used in the simulations are all 3×3 square windows. Table 2 shows the MAEs of the noisy and enhanced images. The above experiment is repeated with another test image "Harbor" (512 × 512) which contains many fine structures. ... ..."
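The MAE figure used in that comparison is simply the mean absolute pixel difference between each image and the original; a one-function sketch (name and pixel-list representation are assumptions):

```python
def mae(original, processed):
    """Mean absolute error between two equal-length pixel sequences."""
    return sum(abs(a - b) for a, b in zip(original, processed)) / len(original)
```

A successful enhancement lowers the MAE of the processed image relative to that of the noisy input.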

### Table 1: Results for Figure 2 (Mickey image with 15% channel noise). f, k and p are defined in the text.

1999

"... In PAGE 17: ... The performance of our Bayesian morphology algorithm is the same as that of an unsupervised version of ICM. In Table 1, we give the final estimates of p and w, the number of iterations before convergence and the number of calls to the function f used in (24). ... In PAGE 18: ... updating. Thus, in this case the number of calls to f can be divided by 2. Although computationally unnecessary, the distinction is interesting when comparing maximum likelihood and maximum pseudo-likelihood estimators, since a smaller value of w corresponds to a larger value of the parameter. In Table 1, the pseudo-likelihood criterion results in an estimate of w equal to 3, instead of 4 when a likelihood criterion is used. For binary images, those values of w correspond to the same updating of the current restoration, so that for the noisy image (a), likelihood and pseudo-likelihood criteria lead to similar restorations ((b) and (c)). ... ..."

Cited by 3