Results 1 - 10 of 13,337

Table 1: Cross-validation: incremental results

in A Statistical Approach to Anaphora Resolution
by Niyu Ge, John Hale, Eugene Charniak 1998
Cited by 98

Table 3: Incremental validity and relative relevance of the main variables from the optimized test battery

in VALIDATION OF THE DUTCH AIRFORCE TEST BATTERY USING ARTIFICIAL NEURAL NETWORKS
by Markus Sommer, Joachim Häusler, Martin Arendasy
"... In PAGE 4: ... Table 3 ... ..."

Table 1: Validation of expressions for incremental buffer delay and slope changes

in Buffer delay change in the presence of power and ground noise
by Lauren Hui Chen, Malgorzata Marek-sadowska, Forrest Brewer 2003
"... In PAGE 10: ... We follow the method in [17] to extract a and Vtn. Table 1 shows our calculated variations in delay and slope compared to HSpice simulation, for a single inverter with different parameters, in different technologies. 10 sets of data are shown. ... ..."
Cited by 1

Table 5: Absolute, Incremental and Parsimonious measures of fit for the calibration and validation samples

in DEFINING AND DEVELOPING MEASURES OF LEAN PRODUCTION
by Rachna Shah, Peter T. Ward
"... In PAGE 21: ... The variance explained range was between 0.18 and 0.91 (Table 3). The multiple measures of model fit indicate a mixed picture: RMSEA, the 90% confidence interval associated with RMSEA, RMR, and normed chi-square indicate a good to excellent fit, but NNFI and CFI are at or below the recommended level (Table 5; columns 4 and 5). The proportion of absolute standardized residuals > |2... In PAGE 21: ... The measurement model incorporating the modifications described in Section 4.2.1 was retested using the validation sample. Results for the convergent validity from the validation sample are reported in Table 4 (columns 6 and 7) and measures of model fit in Table 5 (column 4). The pattern and size of loadings and the variance explained is similar to those in the calibration sample. ... ..."
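The fit indices named in this snippet (normed chi-square, RMSEA) are both derived from a model's chi-square statistic, its degrees of freedom, and the sample size. A minimal sketch of the standard formulas, with illustrative numbers that are not taken from the paper:

```python
import math

def normed_chi_square(chi2: float, df: int) -> float:
    """Normed chi-square: the chi-square statistic divided by its degrees
    of freedom. Values roughly below 3 are conventionally read as
    acceptable fit."""
    return chi2 / df

def rmsea(chi2: float, df: int, n: int) -> float:
    """Root Mean Square Error of Approximation, using the standard
    formula sqrt(max(chi2 - df, 0) / (df * (n - 1)))."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Illustrative values only (not from the paper):
print(normed_chi_square(210.0, 100))      # 2.1
print(round(rmsea(210.0, 100, 280), 3))   # 0.063
```

RMSEA near or below 0.06 and normed chi-square below 3 are the usual "good fit" rules of thumb against which such tables are read.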

Table 5: Rule INC-I (Incremental Invariance). The rule assumes that ψ has been previously proven to be valid over M and that the augmented implication ψ ∧ (φ → φ′) is a valid invariance formula over M. The conclusion is that the implication φ → φ′ is a valid invariance formula over M. Inference Rule 5 (Generalized Incremental Invariance (GEN-INC-I))

in Formal Verification of Real Time Specifications of a Machining System
by Dhrubajyoti Kalita, Anurag Dod

Table 5: Rule INC-I (Incremental Invariance). The rule assumes that ψ has been previously proven to be valid over M and that the augmented implication ψ ∧ (φ → φ′) is a valid invariance formula over M. The conclusion is that the implication φ → φ′ is a valid invariance formula over M. Inference Rule 5 (Generalized Incremental Invariance (GEN-INC-I)): the rule assumes that RTTFs ψ₁, …, ψₖ have been previously shown to be valid over M. It follows that each accessible state in M satisfies the conjunction ⋀ᵢ₌₁ᵏ ψᵢ. The validity of invariance formula φ can be established as follows: I3 ↔ φ ∧ ⋀ᵢ₌₁ᵏ ψᵢ → φ′

in Formal Verification for Analysis and Design of Reconfigurable Controllers for Manufacturing Systems
by Dhrubajyoti Kalita, Pramod P. Khargonekar

Table 1: Signals of the decoder. The signal v accomplishes the synchronization with the CUT, e.g. by controlling the scan clock in a scan-based BIST environment. The interaction of the other signals becomes clear by inspecting the state transition diagram of the finite state machine controlling the work of the decoder (see Figure 9).

in Alternating Run-Length Coding -- A Technique for Improved Test Data Compression
by Sybille Hellebrand, Armin Würtenberger
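The decoder in this entry reconstructs test data from run-length codewords. As context for what such a decoder produces, a minimal sketch of expanding alternating run lengths into a bit stream (a generic illustration only; it does not model the paper's hardware decoder or its codeword format):

```python
def decode_alternating_runs(run_lengths, first_bit=0):
    """Expand a list of run lengths into a bit sequence of alternating
    runs: the first run consists of `first_bit`, the next run of its
    complement, and so on. Only the run lengths need to be stored,
    which is the basic idea behind alternating run-length coding."""
    bits = []
    bit = first_bit
    for length in run_lengths:
        bits.extend([bit] * length)
        bit ^= 1  # runs alternate between 0s and 1s
    return bits

print(decode_alternating_runs([3, 1, 2]))  # [0, 0, 0, 1, 0, 0]
```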

Table 10: Comparisons of incremental vs. non-incremental data (columns: Incre_noexp, Incre_fewexp, Non-Incremental)

in Hybrid Decision Tree
by Zhi-hua Zhou, Zhao-qian Chen
"... In PAGE 16: ... are attained by averaging the results of six experiments where every subset is used in training the primitive tree, in pure incremental learning, and in testing the accuracy, which can be viewed as a variation of 3-fold cross validation. In Table 10, primitive accuracy denotes the accuracy of the primitive trees, storage denotes the percentage of previous training examples that are saved, and accuracy denotes the accuracy of the final trees. Table 10 shows that both incremental procedures can significantly improve the generalization ability of the primitive trees. As anticipated, HDT without the incremental learning procedure always achieves the best accuracy because it processes new examples with the help of all the previous training examples. ... ..."
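The evaluation scheme in this snippet rotates three data subsets through three roles (training the primitive model, incremental updating, testing) and averages over all six orderings. A minimal sketch of that protocol, where the train/update/evaluate hooks are hypothetical stand-ins and the toy "model" is just a running list of labels:

```python
from itertools import permutations

def rotated_three_fold(subsets, train_fn, update_fn, eval_fn):
    """Average accuracy over all 6 orderings of three subsets: each
    ordering uses one subset to train the primitive model, one for the
    pure incremental learning phase, and one for testing, mirroring the
    scheme described in the paper (hook arguments are hypothetical)."""
    scores = []
    for prim, incr, test in permutations(subsets, 3):
        model = train_fn(prim)           # build the primitive model
        model = update_fn(model, incr)   # incremental learning phase
        scores.append(eval_fn(model, test))
    return sum(scores) / len(scores)

# Toy illustration: predict the rounded mean of all labels seen so far.
data = [[1, 1, 0], [1, 0, 0], [1, 1, 1]]
train = lambda s: list(s)
update = lambda m, s: m + list(s)
evaluate = lambda m, s: sum(1 for y in s if round(sum(m) / len(m)) == y) / len(s)
print(rotated_three_fold(data, train, update, evaluate))
```

Each subset appears exactly twice in every role, so the averaged score is not biased toward any one split.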


Table 4.3 gives the average accuracies using the best simple decision table found by MDL and ICV (incremental cross-validation) directly, using C4.5 with all available attributes, and using C4.5 with the respective best subset of attributes. Generally the MDL-based heuristic is never significantly worse than ICV and on average performs slightly better. Comparison with C4.5 shows that both methods seem to select reasonable subsets of attributes, as the performance of C4.5 is never significantly degraded when using these subsets.

in Practical Uses of the Minimum Description Length Principle in Inductive Learning
by Bernhard Pfahringer, Technisch-Naturwissenschaftliche Fakultät
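Incremental cross-validation (ICV), one of the two selection heuristics this snippet compares, scores a learner by predicting each example from the examples seen before it. A minimal sketch under that reading, using a toy majority-class predictor as a hypothetical stand-in for the paper's decision tables:

```python
def incremental_cross_validation(examples, predict_fn):
    """Incremental (prequential) accuracy estimate: each example is
    predicted from the prefix of examples preceding it, then appended
    to that prefix. `predict_fn(prefix, x)` is a hypothetical hook for
    whatever model class is being scored."""
    correct = 0
    prefix = []
    for x, y in examples:
        if prefix and predict_fn(prefix, x) == y:
            correct += 1
        prefix.append((x, y))
    return correct / len(examples)

# Toy predictor: always guess the majority label seen so far,
# ignoring the attributes x entirely.
def majority(prefix, x):
    labels = [y for _, y in prefix]
    return max(set(labels), key=labels.count)

data = [(0, 'a'), (1, 'a'), (2, 'b'), (3, 'a'), (4, 'a')]
print(incremental_cross_validation(data, majority))  # 0.6
```

Because every example is trained on at most once and tested exactly once, ICV costs a single pass over the data, which is what makes it a practical rival to an MDL score.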