### Table 1: Comparison of the probability distribution for the number of passes to perform an arbitrary BMMC permutation on various sizes of the MasPar MP-2. We compare the probabilities produced by 100,000 random BMMC permutations performed on the MasPar MP-2 to the expected values for these probabilities produced by theoretical analysis. In this table, we measure the percentage of error as |actual - predicted| / predicted.

1996

"... In PAGE 16: ...Here, we present the equation to calculate the probability for each possible rank of the submatrix A2::5;0::5, which is the leftmost 6 columns of a 4 m matrix with full row rank: Pr frank A2::5;0::5 = kg = F1(m; k) F2(k) F3(m; k) F4(m) ; (1) where F1(m; k) = 2k(m?10+k) ; F2(k) = (26 ? 1) (26?k+1 ? 1) (2k ? 1) (21 ? 1) ; F3(m; k) = (2m?6 ? 1) (2m+k?9 ? 1) (24?k ? 1) (21 ? 1) ; F4(m) = (24 ? 1) (21 ? 1) (2m ? 1) (2m?3 ? 1) : The derivation of equation (1) requires combinatoric theory beyond the scope of this extended abstract. Comparison to empirical results on the MasPar MP-2 In Table1 , we compare experimental data to the theoretical probabilities that an arbitrary BMMC permutation requires a certain number of passes. For each size of MasPar MP-2 (e.... ..."

Cited by 1

### Table 1 shows a quantitative comparison between different classifiers. In this table, the FRR and RR of the classifiers are compared after training them on 150 data points drawn from an arbitrary probability function and testing them on the same number of samples drawn from the same distribution. As can be seen from this example, the FRR for the SVDD classifier is lower than that of the other three, while its RR is higher. This demonstrates the superiority of this classifier for single-class classification over the other three techniques.

2006

"... In PAGE 6: ... In this Figure the test data is composed of 150 samples drawn from the same probability distribution function as the training data, thus should be classifled as the known class. Table1 . Comparison of False Reject Rate and Recall Rate for difierent classiflers.... ..."

Cited by 2

### Table 5 shows a quantitative comparison between different classifiers. In this table, the FRR and RR of the classifiers are compared after training them on 150 data points drawn from an arbitrary probability function and testing them on the same number of samples drawn from the same distribution. As can be seen from this example, the FRR for SVDDM is lower than that of the other three, while its RR is higher. This demonstrates the superiority of this classifier for single-class classification over the other three techniques.

"... In PAGE 17: ... In this flgure the test data is composed of 150 samples drawn from the same probability distribution function as the training data, which should be classifled as the known class. Table5 . Comparison of False Reject Rate and Recall Rate for difierent classiflers.... ..."

### Table 3: Prior probabilities for number of fathers from uniform prior. Invariance suggests that only prior distributions which are invariant under re-labeling of the seedlings should be considered. For instance, the point (1,2)(3,4)(5) should have the same prior probability as the point (1,4)(2,5)(3), since the second point can be obtained from the first point by re-labeling the offspring, and the labeling of the offspring is arbitrary. A characteristic of any such prior distribution is that it assigns the same probabilities to all points which are representations of the same partition of the integer N (the number of seedlings). A configuration is a representation of a particular partition of N if the set of cardinalities of the full sibships is equal to the partition of N. (A partition of the integer N is a set of positive integers (λ1, ..., λm) satisfying λ1 + ... + λm = N.) For instance, the point (1,2)(3,4)(5) is a representation of the partition (2, 2, 1) of 5. If the posterior distribution of interest is over the representations of partitions, then a sensible non-informative prior would be one which assigns equal prior probability to each representation ...

1994

"... In PAGE 9: ...) While this in some ways incorporates the idea of a non- informative prior, it may cause problems when estimating a posterior distribution for the number of fathers, since it will assign very little prior probability to very few or very many fathers, as there are few con gurations in these cases. Table3 shows the prior probability of having n fathers calculated... ..."

Cited by 1

### Table 4: Probability Distribution of IPV4Router Stages

"... In PAGE 5: ... Subsequently, we perform a probability analysis of packet processing in each of these stages. Table4 presents the probability distribution of the IPV4Router application when the processing path is divided into 4 stages. Note that, the selection of the number of the stages is arbitrary, but we must highlight that the results are similar for different number of stages.... ..."

Cited by 1

### Table 1: Compressing an Arbitrary Distributed Algorithm

1990

"... In PAGE 23: ... We also ran P2 on CCC(2) and the compressed version of P2 on CCC(1). Table1 shows the execution times in seconds and the resulting overhead for di erent x and y. One can see that the overhead is even better than the optimal load factor for the corresponding embedding (i.... ..."

Cited by 4

### Table 1: Probability of letters in an average German text (taken from [Beu94]).

2004

"... In PAGE 9: ... Usually this will not be the case. For instance there will be almost no German text fulfilling the distribution given by Table1 exactly, but rather approximately or even worse. To dis- tinguish the probability induced by the model from the real one, we label the former with PM(ai) in order to emphasize the dependency of the model and in order to distinguish from the latter, given by P(ai).... In PAGE 9: ... So we conclude that a model can be seen as an interpretation of an arbitrary dataset. A simple model could for instance be given by the probability distribution of Table1 . This table shows the probabilities of most letters of the German alphabet to occur in an average German text.... ..."