Results 1 - 10 of 5,893

Table 8. Summary of the results, expressed in terms of generalization accuracy.

in International Journal of Pattern Recognition and Artificial Intelligence © World Scientific Publishing Company AUTOMATIC CLASSIFICATION OF DIGITAL PHOTOGRAPHS BASED ON DECISION FORESTS
by Raimondo Schettini, Carla Brambilla, Claudio Cusano

Table 1: Various SDM designs specified from the generalized design

in Kanerva's Sparse Distributed Memory: An Object-Oriented Implementation on the Connection Machine
by Andreas Turk, Günther Görz

Table 1. Some results with generalized bisection and a variant of Algorithm 5.2

in A Review of Preconditioners for the Interval Gauss–Seidel Method
by R. Baker Kearfott, Chenyi Hu, Manuel Novoa Iii

Table 3: Overview of average generalization accuracies obtained with ib1-gr, fambl-gr and rise on increasing portions of the gp, gs, and ms data sets. Generalization accuracy denotes the percentage of correctly classified test instances. '|' means that the experiment could not be performed.

in Instance-Family Abstraction in Memory-Based Language Learning
by Antal van den Bosch 1999
"... In PAGE 6: ... rise was not applied to the full data sets due to the computational limitations mentioned earlier. The generalization accuracies yielded by the three learning algorithms on the three tasks are listed in Table 3... ..."
Cited by 5

Table 4.4: Best accuracies for phone classification on the validation set using models generated from 3, 4 and 5 monophone iterations plus 3 generalized triphone iterations and the tree in Figure 4.5.

in Clustering Linear Dynamic Models for the use of context-dependent models on Speech Recognition
by Jordi Adell (supervisor: Simon King) 2003

Table 1. When I tested the tangent planes at these points with the algorithm of the previous section, I found a volume of 7.97690, which is smaller than 8.0, the volume for the Voronoi cell of D4. More extreme examples may exist. Thus D4 does not give the minimum solution to the generalized problem, though it may still be the best for the strict kissing sphere problem.

in unknown title
by unknown authors

Tables 4.1, 4.2, and 4.3 show the performance of the multilevel scheme for the generalized Stokes problem on a two-dimensional domain with mesh sizes 64×64, 128×128, and 256×256. The number of unknowns for these problems is 12159, 48895, and 196095, respectively. The column titled p indicates the number of processors used; and the columns titled T(n, p), Tcomp, and Tcomm show the total time (in seconds), time spent on computation, and time spent in communication, respectively, to solve the problem. The speedup and efficiency on p processors with respect to 4 processors are shown in the columns S and E, respectively.

in Efficient Iterative Methods For Saddle Point Problems
by Vivek Sarin
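The speedup and efficiency columns described in the snippet above follow standard definitions, measured against a 4-processor baseline rather than a serial run. A minimal sketch of that bookkeeping, with made-up timings (not values from the paper):

```python
# Speedup and efficiency relative to a 4-processor baseline, as the
# snippet describes: S(p) = T(n, 4) / T(n, p), and E(p) = S(p) * 4 / p
# (the fraction of ideal speedup achieved).  Timings are illustrative.

def speedup_efficiency(times, base=4):
    """times maps processor count p -> total time T(n, p) in seconds."""
    t_base = times[base]
    table = {}
    for p, t in sorted(times.items()):
        s = t_base / t        # speedup relative to `base` processors
        e = s * base / p      # efficiency: fraction of ideal speedup
        table[p] = (s, e)
    return table

times = {4: 100.0, 8: 55.0, 16: 30.0}   # hypothetical T(n, p)
for p, (s, e) in speedup_efficiency(times).items():
    print(f"p={p:3d}  S={s:5.2f}  E={e:5.2f}")
```

Note that efficiency is normalized by the baseline processor count, so E(4) is 1.0 by construction; perfect scaling keeps E at 1.0 as p grows.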

Table 1: Errors and convergence rates for the x-component of the pressure gradient.

in A Cartesian Grid Projection Method for the Incompressible Euler Equations in Complex Geometries
by Ann S. Almgren, John Bell, Phillip Colella, Tyler Marthaler 1997
Cited by 13

Table 2: Errors and convergence rates for the y-component of the pressure gradient.

in A Cartesian Grid Projection Method for the Incompressible Euler Equations in Complex Geometries
by Ann S. Almgren, John Bell, Phillip Colella, Tyler Marthaler 1997
Cited by 13

Table 2: keeping back one example as an unseen case, we find once again that the generalization performance is typically worse than chance, see Figure 2. This tends to confirm the hypothesis that BP relies primarily on statistical information extracted from the example set. Generalization and Statistical Neutrality: A statistically neutral mapping has no correlation between input values and output values. Thus, if the mapping has any input/output rule at all, it cannot be contingent upon states of individual input variables. It must be based on relational states between those variables. In other words, statistically neutral mappings have relational input/output rules. (We define a problem as relational if the output value depends upon relations between input variables, and not upon any individual input variable.)

in Can Artificial Neural Networks Discover Useful Regularities
by J. V. Stone 1995
"... In PAGE 5: ... -> yes computer consumes moisture -> no computer consumes silicon -> yes computer dislikes heat -> yes computer dislikes electricity -> no computer dislikes moisture -> yes computer dislikes silicon -> no We can establish the neutrality of this training set empirically by tabulating the relevant conditional probabilities. (Table 2 shows the complete set of probabilities which have a first- or zeroth-order condition.) Note that, in contrast to parity problems, the expected values of the input components are not at chance-level.... ..."
Cited by 4

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University