### Table 2. Influential publications from other countries

"... In PAGE 9: ... Table 2 summarises this data. Most US texts are read in Spanish-language translations, generally published in Mexico.... ..."

### TABLE 1 Summary of RXTE PCA Observations

### TABLE 7a. REGRESSION DIAGNOSTICS: DROPPING INFLUENTIAL OBSERVATIONS in Model 3 (H1) (OLS w/robust standard errors)

in Electoral Rules As Constraints On Corruption: The Risks Of Closed-List Proportional Representation

"... In PAGE 40: ... TABLE 7b. REGRESSION DIAGNOSTICS: DROPPING INFLUENTIAL OBSERVATIONS in Model 1 (H2) (OLS w/robust standard errors) Dropping large STUDENT Dropping large Dfdsize Dropping large Dfclist Dropping large DFFITS Coeff p-value Coeff p-value Coeff p-value Coeff p-value DISTSIZE ***-0.... In PAGE 40: ...92 Obs. 54 51 53 50 TABLE 7c. REGRESSION DIAGNOSTICS: DROPPING INFLUENTIAL OBSERVATIONS in Model 2 (H3) (OLS w/robust standard errors) Dropping large STUDENT Dropping large Dfclpres Dropping large DFFITS Coeff p-value Coeff p-value Coeff p-value CLPRES ***-0.... ..."
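The diagnostics in Tables 7a-7c re-estimate the model after dropping observations with large studentized residuals or large DFFITS. A minimal numpy sketch of that procedure is below; the data, sample size, and cutoff are illustrative stand-ins, not the paper's corruption dataset, and the robust-standard-error step is omitted since only the coefficient re-estimation is shown.

```python
import numpy as np

def influence_measures(X, y):
    """Externally studentized residuals and DFFITS for an OLS fit (numpy only)."""
    n, p = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    h = np.diag(X @ np.linalg.inv(X.T @ X) @ X.T)              # leverage (hat) values
    s2 = resid @ resid / (n - p)
    s2_i = ((n - p) * s2 - resid**2 / (1 - h)) / (n - p - 1)   # leave-one-out variance
    t = resid / np.sqrt(s2_i * (1 - h))                        # externally studentized
    dffits = t * np.sqrt(h / (1 - h))
    return t, dffits

rng = np.random.default_rng(0)
n = 54                                   # same order as the paper's Obs. counts
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, -0.5]) + rng.normal(scale=0.3, size=n)
y[0] += 3.0                              # plant one influential observation

t, dffits = influence_measures(X, y)
keep = np.abs(dffits) <= 2 * np.sqrt(X.shape[1] / n)   # common size-adjusted cutoff
beta_trimmed, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
```

Comparing `beta_trimmed` with the full-sample fit is exactly the robustness check the tables report: if the coefficients survive the trimming, the result is not driven by a handful of cases.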

### TABLE 7b. REGRESSION DIAGNOSTICS: DROPPING INFLUENTIAL OBSERVATIONS in Model 1 (H2) (OLS w/robust standard errors)

in Electoral Rules As Constraints On Corruption: The Risks Of Closed-List Proportional Representation

"... In PAGE 39: ... TABLE 7a. REGRESSION DIAGNOSTICS: DROPPING INFLUENTIAL OBSERVATIONS in Model 3 (H1) (OLS w/robust standard errors) Dropping large STUDENT Dropping large DFclpr Dropping large DFFITS Coeff p-value Coeff p-value Coeff p-value CLPR ***-0.... In PAGE 40: ... TABLE 7c. REGRESSION DIAGNOSTICS: DROPPING INFLUENTIAL OBSERVATIONS in Model 2 (H3) (OLS w/robust standard errors) Dropping large STUDENT Dropping large Dfclpres Dropping large DFFITS Coeff p-value Coeff p-value Coeff p-value CLPRES ***-0.... ..."

### Table 1. SAX J1808.4-3658. Parameters of spectral approximation of PCA and HEXTE data by a power law model. Date Time, UT Expos.1, sec PCA HEXTE PCA+HEXTE

"... In PAGE 2: ... The 3-25 keV light curve and the spectra of SAX J1808.4-3658 are shown in Figs. 1 and 2. In order to characterize the broad-band spectral properties at different luminosities, a power law model was used. The best-fit parameters are listed in Table 1. The pulsation with a 2.5 msec period was detected in all observations between April 11-29 and on May 3, with a relative rms of 4-7% increasing slightly as luminosity decreased.... In PAGE 3: ... Fig. 1. The 3-25 keV light curve of SAX J1808.4-3658. The PCA fluxes are those from Table 1; the ASM count rate was converted to 3-25 keV energy flux assuming a Crab-like spectrum. The solid lines are L_X ∝ e^(-t/10 d) and L_X ∝ e^(-t/1.3 d).... In PAGE 3: ...4-3658 can be summarized as follows: 1. The 3-100 keV spectrum maintained an approximate power law shape I ∝ E^(-2) as luminosity decreased by a factor of 100 (Table 1, Fig. 2).... In PAGE 3: ... Fig. 2). We therefore expected to observe spectral evolution as luminosity decreased by a factor of 100. Remarkably, no significant spectral changes were detected (Table 1, Fig. 2).... ..."
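Fitting a power law I(E) = A·E^(-Γ) to a spectrum, as the snippet describes, reduces to a straight-line fit in log-log space. The sketch below uses synthetic illustrative values (the grid, normalization, and scatter level are assumptions, not the SAX J1808.4-3658 data):

```python
import numpy as np

rng = np.random.default_rng(1)
E = np.linspace(3.0, 25.0, 40)                  # energy grid in keV
gamma_true, A = 2.0, 100.0
# Synthetic spectrum I(E) = A * E**(-gamma) with mild lognormal scatter
I = A * E**(-gamma_true) * rng.lognormal(sigma=0.05, size=E.size)

# A power law is a straight line in log-log space: log I = log A - gamma * log E
slope, intercept = np.polyfit(np.log(E), np.log(I), 1)
gamma_hat = -slope
```

A recovered `gamma_hat` near 2 corresponds to the approximate I ∝ E^(-2) shape the paper reports persisting as the luminosity dropped.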

### Table 2: Results are averaged over 25 realizations of 2100 training and 900 testing documents for RCV1 C, E, G, M categories using kernel PCA with expected linear kernel (and regular PCA).

in Abstract

"... In PAGE 6: ... in any kernel machine. Tables 2 and 3 attempt to quantify the performance of the kernel PCA under expected linear and RBF kernels. To this end, we formed four different multiple-label tasks, C, E, G, and M (Table 1).... ..."
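A useful sanity check for the kernel PCA in this entry is that with a plain linear kernel K = XXᵀ it reproduces ordinary PCA exactly, so any expected or RBF kernel can be dropped into the same machinery. A minimal sketch on toy data (the 30×5 random matrix is a stand-in for the RCV1 document features, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(30, 5))                   # 30 toy "documents", 5 features

# Kernel PCA: double-center the Gram matrix, eigendecompose, scale eigenvectors
K = X @ X.T                                    # linear kernel; swap in any PSD kernel
n = K.shape[0]
J = np.eye(n) - np.ones((n, n)) / n            # centering matrix
Kc = J @ K @ J
vals, vecs = np.linalg.eigh(Kc)
order = np.argsort(vals)[::-1]
vals, vecs = vals[order], vecs[:, order]
proj_kpca = vecs[:, :2] * np.sqrt(np.maximum(vals[:2], 0))   # 2-D embedding

# Ordinary (regular) PCA scores for comparison
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
proj_pca = Xc @ Vt[:2].T
```

The two embeddings agree component-by-component up to sign, which is why the table can meaningfully compare "kernel PCA with expected linear kernel" against regular PCA.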

### Table 5. Average Accuracies for SVM classifier with different kernels with and without doing PCA on the test data Features SVM Kernel Type % Overall Male Female

"... In PAGE 11: ...stimated. These coefficients were then used for classification. The SVM was used for classifying the data and all four kernel functions were tried. The results are shown in Table 5. Here, we show the results both with and without PCA for comparison.... ..."
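The "PCA on the test data" comparison in Table 5 hinges on one detail: the projection must be fitted on the training set and then applied unchanged to the test set. A minimal numpy sketch of that split (shapes and the 3-component choice are arbitrary illustrations, not the paper's setup, and the SVM step is omitted):

```python
import numpy as np

rng = np.random.default_rng(3)
X_train = rng.normal(size=(100, 10))
X_test = rng.normal(size=(20, 10))

# Fit PCA on the training set only, then apply the *same* mean and components
# to the test set; re-fitting PCA on the test data would place train and test
# features in different coordinate systems before the classifier sees them.
mu = X_train.mean(axis=0)
_, _, Vt = np.linalg.svd(X_train - mu, full_matrices=False)
W = Vt[:3].T                                   # keep 3 components (arbitrary here)
Z_train = (X_train - mu) @ W
Z_test = (X_test - mu) @ W
```

`Z_train` and `Z_test` are then what the with-PCA SVM rows of the table would be trained and evaluated on.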

### Table 3: RPCL on iris data after PCA

"... In PAGE 5: ... We then run RPCL on the principal components of the Iris data. By choosing three principal components, the RPCL improves accuracy (Table 3) and requires fewer training epochs (100) than RPCL operating on the raw data does. We also try to enhance the KRPCL by initializing the weighting matrix with values of the kernel principal eigenvectors, which are obtained by decomposing the kernel matrix used in the KRPCL algorithm.... ..."
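Rival penalized competitive learning (RPCL), the method this entry applies to PCA-reduced Iris data, updates two units per sample: the winner is pulled toward the sample and the runner-up (rival) is pushed away, so surplus units get driven off. A minimal sketch on toy two-cluster data rather than Iris; the learning rates, epoch count, and cluster layout are illustrative assumptions:

```python
import numpy as np

def rpcl(X, k, a_win=0.05, a_rival=0.002, epochs=100, seed=0):
    """Minimal rival penalized competitive learning: pull the winning unit
    toward each sample, push the second-closest (rival) unit away."""
    rng = np.random.default_rng(seed)
    W = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(epochs):
        for x in X[rng.permutation(len(X))]:
            d = np.linalg.norm(W - x, axis=1)
            win, rival = np.argsort(d)[:2]
            W[win] += a_win * (x - W[win])          # winner: move toward sample
            W[rival] -= a_rival * (x - W[rival])    # rival: move away from sample
    return W

# Two well-separated blobs; ask for 3 units and let the rival penalty
# push the surplus unit away from both clusters.
rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0.0, 0.1, size=(50, 2)),
               rng.normal(5.0, 0.1, size=(50, 2))])
W = rpcl(X, k=3)
```

After training, one unit sits near each blob while the extra unit is repelled, which is the mechanism that lets RPCL find the effective cluster number without fixing it in advance.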

### Table 3: Single subscriber with two services experimental results

1997

"... In PAGE 15: ...2.1 One Subscriber, Two Services. The results of the experimentation are summarised in Table 3. The shaded boxes indicate experiments in which no feature interaction was observed.... ..."

Cited by 10