Results 1 - 10 of 9,886

Table 2 summarizes the procedure. First, the number of redundant negatives is reduced by creating the subset C, consistent with the training set. Then, the system removes those negative examples that participate in Tomek links. In this way, the noisy and borderline examples are discarded, which leads to the new training set, T.

in Addressing the Curse of Imbalanced Training Sets: One-Sided Selection
by Miroslav Kubat, Stan Matwin 1997
"... In PAGE 4: ... Figure 4: The training set after the removal of redundant negative examples. Table 2: Algorithm for the one-sided selection of examples. 1.... ..."
Cited by 63
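
The snippet above describes the Tomek-link step of one-sided selection: mutual nearest neighbours with opposite labels mark borderline or noisy points, and only the negative member of each pair is dropped. A minimal sketch of that step, assuming plain NumPy and Euclidean distance (the helper name and label convention are ours, not the paper's):

```python
import numpy as np

def remove_tomek_negatives(X, y, negative_label=0):
    """Drop negative examples that participate in Tomek links.

    Two samples form a Tomek link when each is the other's nearest
    neighbour but their labels differ. Sketch only; not the authors'
    original implementation.
    """
    n = len(X)
    # Pairwise squared Euclidean distances via broadcasting.
    d = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d, np.inf)          # a point is not its own neighbour
    nn = d.argmin(axis=1)                # index of each point's nearest neighbour
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        j = nn[i]
        if nn[j] == i and y[i] != y[j]:  # mutual nearest neighbours, opposite labels
            if y[i] == negative_label:
                keep[i] = False          # discard only the negative member
    return X[keep], y[keep]
```

For example, a positive at (0, 0) and a negative at (0.1, 0) form a Tomek link, so the negative is removed, while a tight same-label pair far away is untouched.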

Table 5: Gist classification with and without location information from the phone; the numerical score is the number of links to the gist. By biasing the classifier with the fact that the subject is in a restaurant, it becomes much easier to infer the topic of conversation from noisy transcripts.

in Machine Perception and Learning of Complex Social Systems
by Nathan Norfleet Eagle 2005
Cited by 16
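
The biasing described above amounts to reweighting word likelihoods by a location-conditioned topic prior, P(topic | words, location) ∝ P(words | topic) · P(topic | location). A toy sketch with made-up numbers (none of these topics or probabilities come from the thesis):

```python
import numpy as np

topics = ["food", "work", "travel"]
likelihood = np.array([0.30, 0.25, 0.45])      # P(words | topic) from a noisy transcript
uniform_prior = np.array([1 / 3, 1 / 3, 1 / 3])
restaurant_prior = np.array([0.7, 0.2, 0.1])   # P(topic | subject is in a restaurant)

def posterior(prior):
    """Bayes update: multiply likelihood by the prior, then normalize."""
    p = likelihood * prior
    return p / p.sum()

# With a flat prior the noisy transcript alone points to "travel";
# the restaurant prior flips the decision to "food".
best_flat = topics[posterior(uniform_prior).argmax()]
best_located = topics[posterior(restaurant_prior).argmax()]
```

The location prior does not change the transcript evidence; it only shifts which hypothesis that evidence is weighed against.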

Table 2: Performances of networks on the noisy nonlinear test data.

in Ensemble Learning using Decorrelated Neural Networks
by Bruce Rosen 1996
"... In PAGE 9: ... There were no direct input-to-output links, and each network was trained for 1000 epochs before finishing. Table 2 shows the average testing prediction errors (over ten trials) for the five individual networks when trained with Equation (1), and when trained with the added penalties given in Equation (7) with constant. Each ensemble network was composed of two to five individual networks.... ..."
Cited by 66
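
The decorrelation penalty in that paper rests on a standard observation: averaging ensemble members helps most when their errors are uncorrelated, since for M members with unit variance and pairwise correlation r, the averaged error has variance r + (1 − r)/M. A small simulation illustrating this (our own illustration, not Rosen's experiment):

```python
import numpy as np

rng = np.random.default_rng(1)
M, N = 5, 200_000  # ensemble size, number of simulated test points

# Decorrelated member errors: independent unit-variance noise.
indep = rng.normal(size=(N, M))

# Correlated member errors: a shared component gives pairwise correlation ~0.9.
shared = rng.normal(size=(N, 1))
corr = np.sqrt(0.9) * shared + np.sqrt(0.1) * rng.normal(size=(N, M))

# Variance of the ensemble-average error:
#   independent members -> 1/M           = 0.2
#   correlated members  -> r + (1 - r)/M = 0.92
var_indep = indep.mean(axis=1).var()
var_corr = corr.mean(axis=1).var()
```

Averaging five independent predictors cuts error variance fivefold, while averaging five highly correlated ones barely helps, which is exactly what a decorrelation penalty tries to exploit.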

TABLE XI NOISY 102405141

in Objective Functions for Training New Hidden Units in Constructive Neural Networks
by Tin-Yau Kwok, et al.

TABLE XIII NOISY 102405341

in Objective Functions for Training New Hidden Units in Constructive Neural Networks
by Tin-Yau Kwok, et al.

Table 4. Retrieval on Noisy Data

in Development of Generic Search Method Based on Transformation Invariance
by Fuminori Adachi, Takashi Washio, Atsushi Fujimoto, Hiroshi Motoda, Hidemitsu Hanafusa
"... In PAGE 8: ... In contrast, when we applied the conventional retrieval approach based on direct matching without using the FFT to derive the feature vectors, very low values of precision and recall were obtained, as shown in Table 3 (b). Table 4 represents the results for noisy data. 2 bytes are randomly chosen in each original 16-byte sequence, and they are replaced by random numbers.... ..."
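
The robustness claimed above comes from comparing spectra rather than raw bytes: localized byte corruption spreads thinly across FFT coefficients, so low-frequency magnitudes change little. A sketch of the idea (feature size, cosine similarity, and the exact corruption setup are our choices, not necessarily the paper's):

```python
import numpy as np

def fft_feature(seq, n_coeffs=8):
    # Magnitudes of the lowest FFT coefficients; localized byte noise
    # perturbs each coefficient only slightly.
    spectrum = np.abs(np.fft.rfft(np.asarray(seq, dtype=float)))
    return spectrum[:n_coeffs]

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
original = rng.integers(0, 256, size=64)

# Corrupt 2 bytes per 16, as in the paper's noise model: 8 of 64 bytes.
noisy = original.copy()
idx = rng.choice(64, size=8, replace=False)
noisy[idx] = rng.integers(0, 256, size=8)

exact_match = float(np.mean(original == noisy))  # direct byte matching degrades
spectral_sim = cosine(fft_feature(original), fft_feature(noisy))  # stays high
```

Direct matching loses roughly one byte in eight, while the spectral features of the original and corrupted sequences remain close, which is why the FFT-based retrieval tolerates this noise.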

TABLE IX Noisy f(1).

in Objective Functions for Training New Hidden Units in Constructive Neural Networks
by Tin-yau Kwok, Dit-Yan Yeung 1997
Cited by 25

TABLE XI Noisy f(3).

in Objective Functions for Training New Hidden Units in Constructive Neural Networks
by Tin-yau Kwok, Dit-Yan Yeung 1997
Cited by 25

TABLE XIII Noisy f(5).

in Objective Functions for Training New Hidden Units in Constructive Neural Networks
by Tin-yau Kwok, Dit-Yan Yeung 1997
Cited by 25

Table 4: Noisy Genie experiment

in unknown title
by unknown authors 2005

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University