### Table 4: Detailed performance of multiple feature representations (columns: Run, Representation, Weighting, Precision, Recall)

"... In PAGE 11: ...then the probability that this quantity is greater than 6.64 is less than 0.01. Otherwise we may reject the null hypothesis in favour of the hypothesis that the two text representations have different performance when trained on the particular training set. Results and Discussion: Table 4 lists the detailed results of different feature representations, where 900BOW means using 900 words (bag-of-words), 4-PNE means using 4 PNEs (see Methods), and 70trigger means using 70 trigger keywords. Their combinations are denoted using a + sign.... In PAGE 11: ...methods, i.e. the binary and tr.rf methods. Besides the first 9 runs, we also adopted a simple majority-voting technique to further improve the system performance. The results are shown in Table 4 as Run 10 and Run 11, which simply combine the previous three runs, respectively. ... In PAGE 12: ... Some interesting observations from Table 4 can be found as follows. First, using the 4-PNE representation alone achieves the worst F1 value among all the feature representations.... ..."
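The 6.64 threshold in the excerpt is the χ² critical value at 1 degree of freedom and α = 0.01. The snippet does not name the exact statistic, but a common choice for comparing two text representations evaluated on the same test set is McNemar's test; a minimal sketch under that assumption (the counts are hypothetical):

```python
def mcnemar_chi2(b, c):
    """McNemar chi-squared statistic with continuity correction.
    b = test cases only representation A classifies correctly,
    c = test cases only representation B classifies correctly."""
    return (abs(b - c) - 1) ** 2 / (b + c)

# chi-squared critical value at 1 degree of freedom, alpha = 0.01
CRITICAL = 6.64

# hypothetical counts: A alone right on 40 cases, B alone right on 10
stat = mcnemar_chi2(40, 10)
print(stat, stat > CRITICAL)  # 16.82 True
```

When the statistic exceeds 6.64, the null hypothesis of equal performance is rejected at the 0.01 level, matching the decision rule quoted above.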

### Table 2: Spectral Fits

"... In PAGE 5: ... Models were fit to the spectra using XSPEC. The results are summarized in Table 2, where the errors are at the 90% confidence level. Previous ROSAT and ASCA spectra of early-type galaxies have indicated that they have at least two spectral components, a very hard component which may be due to X-ray binaries and/or an AGN (Matsumoto et al.... In PAGE 6: ... Indeed this is found to be the case, although the improvement is not very substantial. We list two fits in Table 2, one with the photon index a free parameter, and the other with it fixed to the value found from fitting the sources (§6.3) of Γ = 1.20, since we expect that it is discrete sources that make up the hard component.... In PAGE 6: ...count rates is 0.99. We simultaneously fit the Chandra spectrum of the total emission with a spectrum extracted from an identical region in the ROSAT PSPC using a model combining MEKAL and power-law components. The resulting fit agreed with the fits to the total emission described in Table 2 within the errors. Trinchieri et al.... ..."

### Table 2: Document representation, feature selection and learning algorithms used

1998

"... In PAGE 9: ... Here we take a look at some of this work through the prism of three questions important for machine learning: (1) what representation is used for documents, (2) how is the high number of features dealt with, and (3) which learning algorithm is used. Table 2 summarizes them over some related papers in order to give an idea about the current trends. Systems given in Table 1 are included in this more detailed analysis, if there was... ..."

Cited by 2

### Table 2. Value of external representation in supporting human task performance and learning

2004

"... In PAGE 7: ... Contents (continued): Table 2. Value of external representation in supporting human task performance and learning.... In PAGE 40: ... The second point is that representations can improve domain learning characteristics (Cheng, 1999) by providing the learner with an external representation that encodes all relevant features of a problem space and helping to promote the integration of those features. Table 2, below, is adapted from and extends Woods (1994), and presents a set of core performance and learning issues that can be supported by carefully constructed representations. Table 2.... In PAGE 42: ... These are significant challenges, particularly when we consider the task-dependent nature of representations. Table 2 (see page ) listed a number of ways that external representations impact task and learning performance. The implication of this list is that task-specific representations better mesh with and support schemas, and improve task and learning performance.... ..."

### Table 2: Field multiplication times (in µs) of our implementations for F_{2^m} on an 800 MHz Intel Pentium III. Input and output are in normal basis representation for the five rightmost columns. The compilers are GNU C 2.95 (gcc) and Intel 6.0 (icc) on Linux (kernel 2.4).

2006

"... In PAGE 13: ... Wu et al. [33, Table 2] give sample minima (for several m ∈ [153, 235]) for the number of consecutive coefficients of an R-element that will permit recovery of the associated field element. Experimentally, times for Algorithm 9 for m = 163 on an Intel Pentium III are a factor 7 slower than field multiplication for a polynomial basis representation.... In PAGE 14: ... The implementation here has received limited such tuning for gcc. Table 2 shows the running times from our implementation. The fastest times show that Algorithm 7 is 13% to 29% faster than the other direct multiplication algorithms for the entries with T ≥ 4, and competitive for T = 2.... In PAGE 15: ... For point operations involving only field addition, multiplication, and squaring, a polynomial-basis squaring operation is sufficiently fast relative to multiplication that the squarings are typically ignored in rough estimates of point operation cost. The times in Table 2 are significantly faster than in earlier papers, and suggest (at least on this platform) that multiplication for Gaussian normal bases is much closer in performance to multiplication in a polynomial basis than previously believed. While the difference is still sufficiently large to discourage the use of normal bases for traditional elliptic curve point operations of addition and doubling, we consider the implications for Koblitz curves and methods based on point halving.... In PAGE 16: ... Point addition requires 8 multiplications (assuming mixed coordinates). Regardless of method (basis conversion, direct, or ring mapping), Table 2 suggests that the added costs of normal basis multiplication in point addition will overwhelm the relatively small savings in squarings. Point halving: halving-based methods [17, 28] replace most point doubles by a potentially faster halving operation.... ..."
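The excerpt benchmarks normal-basis multipliers against a polynomial-basis baseline for m = 163. For reference, polynomial-basis multiplication in F_{2^163} is a shift-and-xor multiply followed by reduction; a minimal sketch (the NIST B-163 pentanomial used here is an assumption, since the paper's exact reduction polynomial is not quoted in the snippet):

```python
M = 163
# reduction polynomial x^163 + x^7 + x^6 + x^3 + 1 (NIST B-163 pentanomial;
# assumed here, not necessarily the paper's choice)
R = (1 << 163) | (1 << 7) | (1 << 6) | (1 << 3) | 1

def gf2m_mul(a, b):
    """Multiply two elements of F_{2^163} in polynomial basis.
    Elements are Python ints whose bits are polynomial coefficients."""
    p = 0
    while b:                    # shift-and-xor (carry-less) multiply
        if b & 1:
            p ^= a
        a <<= 1
        b >>= 1
    for i in range(2 * M - 2, M - 1, -1):   # reduce degree below M
        if (p >> i) & 1:
            p ^= R << (i - M)
    return p

# x^162 * x = x^163, which reduces to x^7 + x^6 + x^3 + 1
print(hex(gf2m_mul(1 << 162, 2)))  # 0xc9
```

Production implementations replace both loops with word-level comb multiplication and a fixed reduction sequence; this sketch only shows the arithmetic the timings in Table 2 are measuring against.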

Cited by 4

### Table 1. Manual and machine representations

2001

"... In PAGE 6: ... The ontology learner [Maedche & Staab, 2000] applies this method straightforwardly for ontology learning from texts to support the knowledge engineer in the ontology acquisition environment. The main problem in applying ML algorithms for OL is that the knowledge bases constructed by the ML algorithms have a flat, homogeneous structure, and very often have a propositional-level representation (see Table 1). Thus several efforts focus on improving ML algorithms in terms of their ability to work with complicated structures.... ..."

Cited by 23

### Table 2. Plasma Abundances Used in Spectral Modeling

in D.H. Cohen,


"... In PAGE 7: ... Recent detailed high-resolution spectroscopy and atmosphere modeling of B stars in clusters has been carried out by Kilian (1992; 1994). In this work very precise abundances of important elements were derived (see Table 2) and line strengths and profiles were used to determine T_eff and log g. Based on this analysis the stars in the study were placed on an empirical H-R diagram and compared to isochrones.... In PAGE 13: ... The three-temperature, optically thin models were computed first from the Raymond & Smith (1977) plasma emission code assuming solar abundances (Anders & Grevesse 1989). No acceptable fit was achieved, so we recomputed these models using the abundances derived from observations of the photosphere of τ Sco (Kilian 1994), which are listed in Table 2. The χ² fit statistic was improved in this case, but it was still not consistent with a good fit.... ..."

### Table 3: A GraphML representation of the multiple related domains model

2006

"... In PAGE 12: ... The domain similarity and human trust edges are depicted in Figure 6. [Figure 6: A representation of the multiple related domains model; nodes h0 (Brad) and h1 (Victor) use domains d0 (societal-scale decision making systems) and d1 (group decision support systems), linked by "trusts" and "similarTo" edges.] A GraphML representation of a portion of Figure 6 is provided in Table 3. The remainder of this paper will focus specifically on the multiple related domains model.... ..."
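Table 3 itself is not reproduced in the snippet, but a GraphML fragment for a portion of Figure 6 can be sketched with the standard library. The node ids and edge labels come from the figure's legend; the key/attribute names ("label") are assumptions, not the paper's schema:

```python
import xml.etree.ElementTree as ET

NS = "http://graphml.graphdrawing.org/xmlns"
ET.register_namespace("", NS)

root = ET.Element(f"{{{NS}}}graphml")
# declare an edge-label attribute so edges can carry "trusts"/"uses"/...
ET.SubElement(root, f"{{{NS}}}key",
              {"id": "label", "for": "edge",
               "attr.name": "label", "attr.type": "string"})
graph = ET.SubElement(root, f"{{{NS}}}graph", id="G", edgedefault="directed")

for node_id in ("h0", "h1", "d0", "d1"):  # Brad, Victor, two domains
    ET.SubElement(graph, f"{{{NS}}}node", id=node_id)

for i, (src, dst, label) in enumerate([("h0", "h1", "trusts"),
                                       ("h0", "d0", "uses"),
                                       ("d0", "d1", "similarTo")]):
    edge = ET.SubElement(graph, f"{{{NS}}}edge",
                         id=f"e{i}", source=src, target=dst)
    data = ET.SubElement(edge, f"{{{NS}}}data", key="label")
    data.text = label

xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)
```

GraphML requires the `<key>` declaration before `<data>` elements can reference it, which is why the fragment declares the label attribute up front.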

Cited by 1

### Table 7. The QSD representation of a single-digit multiplication output.

"... In PAGE 4: ... Gathering all the outputs to produce a partial product result presents a small challenge. The QSD representation of a single-digit multiplication output, shown in Table 7, contains a carry-out of magnitude 2 when the output is either -9 or 9. This prohibits the use of the second-step QSD adder alone as a gatherer.... ..."
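The magnitude-2 carry can be checked mechanically: QSD digits lie in {-3, ..., 3}, so a single-digit product lies in [-9, 9] and must be re-split as 4·carry + sum. A small sketch (the decomposition rule below is illustrative; the paper's exact encoder is not reproduced in the snippet):

```python
DIGITS = range(-3, 4)   # quaternary signed-digit set {-3, ..., 3}

def qsd_split(p):
    """Re-encode a single-digit product p as (carry, sum) with
    p == 4*carry + sum and sum in {-3..3}, preferring the smallest carry."""
    for c in (0, 1, -1, 2, -2):
        s = p - 4 * c
        if -3 <= s <= 3:
            return c, s
    raise ValueError(p)

products = {a * b for a in DIGITS for b in DIGITS}
needs_big_carry = sorted(p for p in products if abs(qsd_split(p)[0]) == 2)
print(needs_big_carry)  # [-9, 9]
```

Only ±9 force a carry of magnitude 2 (9 = 4·2 + 1), which matches the excerpt's observation that a second-step QSD adder, which tolerates only unit carries, cannot gather these outputs alone.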

### Table 1: Best POFA expression classification results using four representations in an ensemble network, compared to a nearest-neighbor baseline and previous results with the same database by Padgett and Cottrell (1997). ± denotes a 95% confidence interval on the mean, obtained with multiple runs with different initial random weights.

1999

"... In PAGE 5: ...2 Examining representations with LDA: We used Fisher's Linear Discriminant to analyze the usefulness of the components of the local PCA and Gabor representations. We had also planned to use the discriminant analysis for feature selection, since it had improved performance in a pilot study using the lower-resolution images from Padgett and Cottrell (1997), but as shown in Table 1, feature region selection does not improve performance. Apparently the networks are easily able to learn which components of the representation are most diagnostic for expression.... ..."
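Fisher's linear discriminant, used above to gauge which representation components are diagnostic, has a compact two-class form: w = S_w⁻¹(m₁ − m₂), where large |w| components separate the classes most. A minimal numpy sketch with toy data (the POFA images and the paper's exact selection procedure are not reproduced):

```python
import numpy as np

def fisher_direction(X1, X2):
    """Two-class Fisher linear discriminant direction w = Sw^-1 (m1 - m2).
    Rows of X1 and X2 are feature vectors for the two classes."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # within-class scatter: pooled (unnormalized) covariance
    Sw = np.cov(X1, rowvar=False) * (len(X1) - 1) \
       + np.cov(X2, rowvar=False) * (len(X2) - 1)
    return np.linalg.solve(Sw, m1 - m2)

# toy data: the classes differ only along the first feature
X1 = np.array([[0., 0.], [1., 0.], [0., 1.]])
X2 = np.array([[5., 0.], [6., 0.], [5., 1.]])
w = fisher_direction(X1, X2)
print(w)  # first component dominates, so feature 0 is most diagnostic
```

Ranking components by |w| is one simple way to use the discriminant for the kind of feature-usefulness analysis the excerpt describes.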

Cited by 4