### Table 3-1. Statistical Accuracy of the different prediction algorithms in terms of Mean Absolute Error (MAE) with respect to different Sparsity levels

2005

"... In PAGE 56: ... MAE has been computed for different prediction algorithms and for different levels of Sparsity. Table 3-1 provides values for the MAE of the different prediction algorithms presented, while Figure 3-2 illustrates the sensitivity of the algorithms in relation to the different levels of sparsity applied. ... In PAGE 59: ... The area under the curve represents how sensitive the prediction algorithm is, so the more area it covers, the better for the prediction algorithm. Figure 3-3 illustrates the sensitivity of the different prediction algorithms, while Table 3-2 provides specific values for ROC-6, ROC-7, ROC-8 and ROC-9, which are of greatest interest. We consider these specific points in the ROC curve of greatest interest because typically an item is considered good if its average rating is over 6, 7, 8, or 9 on a 1-10 rating scale.... In PAGE 59: ... Table 3-2. Decision Support Accuracy of the different prediction algorithms in terms of Receiver Operating Curve (ROC) with respect to different Sparsity levels Quality Threshold ROC-6 ROC-7 ROC-8 ROC-9 CFUB-ER 0.... ..."
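The two accuracy measures quoted above are easy to state concretely. Below is a minimal sketch; the function names `mae` and `roc_counts` and the sample ratings are illustrative, not taken from the paper:

```python
def mae(predicted, actual):
    """Mean Absolute Error: average |prediction - true rating|."""
    assert len(predicted) == len(actual) and predicted
    return sum(abs(p - a) for p, a in zip(predicted, actual)) / len(predicted)

def roc_counts(predicted, actual, threshold):
    """Count true/false positives when an item is called 'good' if its
    rating exceeds `threshold` on a 1-10 scale (the ROC-6..ROC-9 points)."""
    tp = sum(p > threshold and a > threshold for p, a in zip(predicted, actual))
    fp = sum(p > threshold and a <= threshold for p, a in zip(predicted, actual))
    return tp, fp

# Hypothetical predicted vs. true ratings for four items:
predicted = [6.5, 8.2, 4.0, 9.1]
actual    = [7.0, 8.0, 5.5, 9.5]
print(round(mae(predicted, actual), 2))   # 0.65
print(roc_counts(predicted, actual, 6))   # (3, 0)
```

Sweeping `threshold` over 6-9 and normalizing the counts gives the ROC points the snippet refers to.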

### Table 1. Statistical Accuracy of the different prediction algorithms in terms of Mean Absolute Error (MAE) with respect to different Sparsity levels

2004

"... In PAGE 10: ... MAE has been computed for different prediction algorithms and for different levels of Sparsity. Table 1 provides values for the MAE of the different prediction algorithms presented, while Figure 2 illustrates the sensitivity of the algorithms in relation to the different levels of sparsity applied. As far as statistical accuracy is concerned, the following outcomes about the quality performance of the prediction algorithms are reached.... ..."

Cited by 4

### Table 1. Index terms list

2005

"... In PAGE 13: ... The results are shown in Table 11. In Table 10, the results indicate that the proposed algorithm outperforms the other two algorithms in all situations. Among these three algorithms, ANN outperforms SW, and SW outperforms GM.... ..."

### Table 1. (cont.) Sporadic Examples of $Z_w$ of Index 1 ≤ I ≤ 10

"... In PAGE 19: ... Table 1. Sporadic Examples of $Z_w$ of Index 1 ≤ I ≤ 10

| Index | w | Monomials of $f_w$ | d | $b_2$ | K-E |
|---|---|---|---|---|---|
| 1 | (1,2,3,5) | $z_0^{10}; z_1^5; z_2^3z_1; z_3^2; \dots$ (17) | 10 | 9 | ? |
| 1 | (1,3,5,7) | $z_0^{15}; z_1^5; z_2^3; z_3^2z_0; \dots$ (19) | 15 | 9 | ? |
| 1 | (1,3,5,8) | $z_0^{16}; z_1^5z_0; z_2^3z_0; z_3^2; \dots$ (20) | 16 | 10 | ? |
| 1 | (2,3,5,9) | $z_0^9; z_1^6; z_2^3z_1; z_3^2; \dots$ (13) | 18 | 7 | Y |
| 1 | (3,3,5,5) | $g_5(z_0, z_1); f_3(z_2, z_3)$ | 15 | 5 | Y |
| 1 | (3,5,7,11) | $z_0^6z_2; z_1^5; z_2^2z_3; z_3^2z_0; \dots$ (8) | 25 | 5 | Y |
| 1 | (3,5,7,14) | $z_0^7z_2; z_1^5z_0; g_2(z_2^2, z_3); \dots$ (9) | 28 | 6 | Y |
| 1 | (3,5,11,18) | $g_2(z_0^6, z_3); z_1^5z_2; z_2^3z_0; \dots$ (10) | 36 | 6 | Y |
| 1 | (5,14,17,21) | $z_0^7z_3; z_1^4; z_2^3z_0; z_3^2z_1; z_0^5z_1z_2$ | 56 | 4 | Y |
| 1 | (5,19,27,31) | $z_0^{10}z_3; z_1^4z_0; z_2^3; z_3^2z_1; z_0^7z_1z_2$ | 81 | 3 | Y |
| 1 | (5,19,27,50) | $z_0^{20}; z_0^{10}z_3; z_3^2; z_1^5z_0; z_2^3z_1; z_0^7z_1^2z_3$ | 100 | 4 | Y |
| 1 | (7,11,27,37) | $z_0^{10}z_1; z_1^4z_3; z_2^3; z_3^2z_0$ | 81 | 3 | Y |
| 1 | (7,11,27,44) | $z_0^{11}z_1; z_2^3z_0; z_1^8; z_1^4z_3; z_3^2; z_0^4z_1^3z_2$ | 88 | 4 | Y |
| 1 | (9,15,17,20) | $z_0^5z_1; z_1^4; z_2^3z_0; z_3^3$ | 60 | 3 | Y |
| 1 | (9,15,23,23) | $z_0^6z_1; z_1^4z_0; z_2^3; z_2^2z_3; z_2z_3^2; z_3^3$ | 69 | 5 | Y |
| 1 | (11,29,39,49) | $z_0^8z_2; z_1^4z_0; z_2^3; z_3^2z_1$ | 127 | 3 | Y |
| 1 | (11,49,69,128) | $z_0^{17}z_2; z_1^5z_0; z_2^4; z_2^2z_3; z_3^2$ | 256 | 2 | Y |
| 1 | (13,23,35,57) | $z_0^8z_1; z_1^4z_2; z_2^2z_3; z_3^2z_0$ | 127 | 3 | Y |
| 1 | (13,35,81,128) | $z_0^{17}z_1; z_1^5z_2; z_2^3z_0; z_3^2$ | 256 | 2 | Y |
| 2 | (2,3,4,5) | $z_0^6; z_1^4; z_2^3; z_3^2z_0; \dots$ (10) | 12 | 5 | ? |
| 2 | (2,3,4,7) | $z_0^7; z_1^4z_0; z_2^3z_0; z_3^2; \dots$ (11) | 14 | 6 | ? |
| 2 | (3,4,5,10) | $z_0^5z_2; z_1^5; z_2^4; z_3^2; \dots$ (9) | 20 | 5 | Y |
| 2 | (3,4,6,7) | $g_3(z_0^2, z_2); z_1^3z_2; z_3^2z_1; z_0z_3z_1^2; z_0^2z_1^3$ | 18 | 6 | ? |
| 2 | (3,4,10,15) | $z_0^{10}; z_1^5z_3; z_2^3; z_3^2; \dots$ (10) | 30 | 7 | Y |
| 2 | (3,7,8,13) | $z_0^7z_2; z_1^3z_2; z_2^2z_3; z_3^2z_0; z_0^5z_1; z_0^3z_1z_3; z_0^2z_1z_2^2$ | 29 | 5 | ? |
| 2 | (3,10,11,19) | $z_0^{10}z_3; z_1^3z_2; z_2^2z_3; z_3^2z_0; z_0^7z_1^2; z_0^4z_1z_3; z_0^3z_1z_3^2$ | 41 | 5 | ? |
| 2 | (5,13,19,22) | $z_0^7z_3; z_1^4z_0; z_2^3; z_3^2z_1; z_0^5z_1z_2$ | 57 | 3 | Y |
| 2 | (5,13,19,35) | $z_0^{14}; z_0^7z_3; z_3^2; z_1^5z_0; z_2^3z_1; z_0^5z_1^2z_2$ | 70 | 3 | Y |
| 2 | (6,9,10,13) | $z_0^6; z_1^4; z_2^3z_0; z_3^2z_2; z_0^3z_1^2$ | 36 | 4 | Y |
| 2 | (7,8,19,25) | $z_0^7z_1; z_1^4z_3; z_2^3; z_3^2z_0; z_0^2z_1^3z_2$ | 57 | 3 | Y |
| 2 | (7,8,19,32) | $z_0^8z_1; z_1^8; z_1^4z_3; z_3^2; z_2^3z_0; z_0z_2^3; z_0^3z_1^3z_2$ | 64 | 4 | Y |
| 2 | (9,12,13,16) | $z_0^4z_1; z_1^4; z_2^3z_0; z_3^3$ | 48 | 3 | Y |
| 2 | (9,12,19,19) | $z_0^5z_1; z_1^4z_0; z_2^3; z_2^2z_3; z_2z_3^2; z_3^3$ | 57 | 5 | Y |
| 2 | (9,19,24,31) | $z_0^9; z_1^3z_2; z_2^3z_0; z_3^2z_1$ | 81 | 3 | Y |
| 2 | (10,19,35,43) | $z_0^7z_2; z_1^5z_0; z_2^3; z_3^2z_1$ | 105 | 3 | Y |
| 2 | (11,21,28,47) | $z_0^7z_2; z_1^5; z_2^3z_1; z_3^2z_0$ | 105 | 3 | Y |
| 2 | (11,25,32,41) | $z_0^6z_3; z_1^3z_2; z_2^3z_0; z_3^2z_1$ | 107 | 3 | Y |
| 2 | (11,25,34,43) | $z_0^{10}; z_1^4z_0; z_2^2z_3; z_3^2z_1$ | 111 | 3 | Y |
| 2 | (11,43,61,113) | $z_0^{15}z_2; z_1^5z_0; z_2^3z_1; z_3^2$ | 226 | 2 | Y |
| 2 | (13,18,45,61) | $z_0^9z_1; z_1^5z_2; z_2^3; z_3^2z_0$ | 135 | 3 | Y |

... In PAGE 22: ... But $I \cdot n = 2\,|{-K_{Z_w}}|$, so this completes the proof of the lemma. The analysis of most of the sporadic examples of Table 1 is easily done with help of Corollary 3.7, which can be restated for this purpose as: Corollary 5.... In PAGE 31: ...of degree $w_i$. The simplest situation occurs when $f_1 = f_2 = f_3$ are forced to vanish. Then $G_w = (\mathbb{C}^*)^3$ is the smallest it can possibly be, as $P(w)$ is toric. This is, in fact, common to many examples of the log del Pezzo surfaces of Table 1. More precisely, we have Lemma 7.... In PAGE 33: ... As mentioned previously, for the log del Pezzo surfaces with a Y in the last column of Table 1 and the tables of Theorem 4.5, there is a unique homothety class of Kähler-Einstein metrics corresponding to each point of $M_w^d$. But the question remains whether two inequivalent Kähler-Einstein structures can share the same Riemannian metric.... ..."

Cited by 12

### Table 1: Bolivian Terms of Trade Index

"... In PAGE 34: ... STATISTICAL APPENDIX Table 1: Bolivian Terms of Trade Index, 1980-92 (1987=100) Year Export price index Import price index Terms of trade index 1980 181.0 91.... ..."
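The terms-of-trade column in a table like this is conventionally the ratio of the export price index to the import price index, rescaled so the base year equals 100. A sketch with hypothetical numbers; the 1980 row in the snippet is truncated, so no values below are taken from it:

```python
def terms_of_trade(export_price_index, import_price_index, base=100.0):
    """Net barter terms of trade: export prices relative to import
    prices, scaled so the base year (here 1987 = 100) reads 100."""
    return base * export_price_index / import_price_index

# Hypothetical indices, not the (truncated) Bolivian figures:
print(terms_of_trade(120.0, 96.0))  # 125.0
```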

### Table 1. Measures of patterned and unpatterned matrices as quadtrees. *Density is accurate within a term of $(n^{-1})$. Sparsity is accurate within a term of $((\lg n)^{-2})$.

"... In PAGE 6: ... Based on Du's caveat and these numbers, I here propose measures of both density and sparsity that are motivated by results on quadtrees, but are expressed independently of any particular representation. Examples appear in Table 1, also. Density of a particular matrix is the ratio between the space it occupies and the space occupied by a dense matrix of the same order.... In PAGE 6: ... Let us consider n × n matrices. Table 1 presents closed-form and asymptotic results for space, density, expected path length (root to terminal node), and sparsity for some familiar... In PAGE 18: ... If the product of nested permutations implied by those recurrences is expanded, then the permutation Fp, described below, results. It is listed as the "FFT permutation" in Table 1, and is called the "bit-reversal" permutation because it exchanges xi and xb(i) in permuting ~x, where b is a function on natural numbers less than 2^p that reverses the p-bit strings that represent them. Since b is its own inverse, Fp is a symmetric permutation matrix.... In PAGE 19: ... The result is that each processor must endure memory delays dependent on the problem size, contrary to past intuition and unlike experience on uniprocessors. Table 1 indicates that Fp measures surprisingly badly with respect to both density and sparsity. If those measures accurately reflect the resources required for actually permuting in parallel, then Table 1 also suggests a viable alternative to Fp.... In PAGE 19: ... The two factorizations of Fp, (9) and (10), indicate that nested Shuffles (or Deals) suffice where we are accustomed to using the bit-reversal FFT permutation, Fp.... In PAGE 19: ... Each factorization of Fp has factors nested along diagonal subtrees. Table 1 shows that Sp (and similarly Dp) has sparsity of nearly 1 − 3/p and, therefore, it may be cheaper to perform nested shuffles/deals on localized subproblems, building up to one simple, global Sp permutation, rather than to use the complicated global Fp permutation. Results on the resources needed for sparse matrix multiplication would establish which is better.... ..."
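The bit-reversal map b described in the snippet is simple to write down. A minimal sketch, assuming a vector of length 2^p; the function names are mine, not the paper's:

```python
def bit_reverse(i, p):
    """Reverse the p-bit binary representation of i: the index map b
    behind the FFT 'bit-reversal' permutation Fp."""
    r = 0
    for _ in range(p):
        r = (r << 1) | (i & 1)  # shift the lowest bit of i onto r
        i >>= 1
    return r

def fft_permute(x):
    """Apply Fp to a vector of length 2**p: entry i is taken from b(i)."""
    p = len(x).bit_length() - 1
    assert 1 << p == len(x), "length must be a power of two"
    return [x[bit_reverse(i, p)] for i in range(len(x))]

print(fft_permute(list(range(8))))  # [0, 4, 2, 6, 1, 5, 3, 7]
```

Since b is an involution, applying `fft_permute` twice restores the original vector, mirroring the snippet's remark that Fp is a symmetric permutation matrix.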

### Table 4: Measures of index term complexity

"... In PAGE 5: ...3%. Why did the subjects demonstrate such a strong preference for the human terms? Table 4 illustrates some important differences between the human terms and the automatically identified terms. The terms selected are longer, as measured in number of words, and more complex, as measured by number of prepositions per index term and by number of content-bearing words.... ..."
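The complexity measures the snippet mentions (words per term, prepositions per term) can be computed mechanically. A rough sketch; the preposition list and sample terms below are my own illustrations, not the paper's data:

```python
PREPOSITIONS = {"of", "in", "for", "on", "with", "by", "to", "at"}

def term_complexity(terms):
    """Average words per term and prepositions per term for a term list."""
    tokenized = [t.lower().split() for t in terms]
    avg_words = sum(len(ws) for ws in tokenized) / len(terms)
    avg_preps = sum(w in PREPOSITIONS for ws in tokenized for w in ws) / len(terms)
    return avg_words, avg_preps

human = ["evaluation of index terms", "retrieval of scientific documents"]
auto  = ["index terms", "retrieval"]
print(term_complexity(human))  # (4.0, 1.0)
print(term_complexity(auto))   # (1.5, 0.0)
```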

### Table 2: Number of terms in index by method of identification

"... In PAGE 4: ... Differences in the implementations, especially the preprocessing module, result in there being some terms identified by Termer that were not identified by Head Sorting. Table 2 shows the number of terms identified by each method. (*Because some terms are identified by more than one technique, the percentages add up to more than 100%.... ..."

### Table 2. Average Index Terms per Document

2004

"... In PAGE 6: ... 5.1 Dimensionality Reduction The results presented in Table 2 show that RFA reduces the dimensionality of the affected document space to a great extent. The reduction is obtained while preserving or improving the retrieval effectiveness of the system augmented with RFA (see results in Sections 5.... In PAGE 6: ... The retrieval systems ST, BR, BB and BS are considered together since none of them possesses a dimensionality reduction technique. In fact, for the BR, BB and BS systems the dimensionality is even higher than the one presented in Table 2, because of the introduction of new terms. The first column of Table 2 shows the minimum dimensionality, which in fact corresponds to the ST system.... ..."

Cited by 1

### Table 2. Number of expansion terms and penalty terms by indexing scheme.

2001

"... In PAGE 2: ... Penalty function for missing terms. As can be seen in Table 2, we use only about one-fifth of the terms of the expanded query for this ... ..."

Cited by 2