
CiteSeerX

Results 1 - 10 of 714

Table 3. The results of ANN

in BUSINESS FAILURE PREDICTION WITH SUPPORT VECTOR MACHINES AND NEURAL NETWORKS: A COMPARATIVE STUDY *
by Jae H. Min, Young-chan Lee
"... In PAGE 9: ... (non-bankruptcy: 75.00%). Three-layer BP ANN In ANN, each data set is split into three subsets: a training set, a test set and a validation set of 60%, 20%, and 20% of the data, respectively. Table 3 shows the results of the three-layer BP ANN. As the training epoch increases, the prediction accuracy of the training set becomes higher.... In PAGE 10: ... As you can see in Table 3, the best prediction accuracy of ANN (75.96%) on the test data is similar to that of SVM (76.... ..."

TABLE II ANN SUMMARY

in unknown title
by unknown authors

TABLE III SETTINGS FOR THE ANN

in Conventional Vs. Neuro-Conventional Segmentation Techniques for Handwriting Recognition: A Comparison
by M. Blumenstein, B. Verma

Table 1. ANN Parameters

in A Neural Network based Technique for Data Compression
by B. Verma, M. Blumenstein, S. Kulkarni

Table 1 ANN architectures

in Application of artificial neural network methods for the lightning performance evaluation of Hellenic high voltage transmission lines
by L. Ekonomou, I. F. Gonos, D. P. Iracleous, I. A. Stathopulos 2006

Table 4.5: Overall Accuracy (%) Using Mid(R-L)/FRONT, Mid(R-L)/HIND and Other Variables, Data Configurations 1 and 2, Evaluation Data Sets

in Decision Support Canine Gait Analysis and Diagnosis using Artificial Neural Networks
by Makiko Kaijima, Makiko Kaijima, Walter D. Potter 2005

Table 2: Comparison between standalone ANN and GA with ANN after 40

in Feature Selection for ANNs Using Genetic Algorithms
by L. B. Jack 1999
"... In PAGE 5: ...2. Genetic Algorithm with ANN after 40 Generations Table 2 shows the performance of the different feature sets after running under the GA for 40 generations. All of the datasets have their best performance in excess of 97.... ..."
Cited by 1

Table 10: Characteristics of ANN and ES Expert Systems ANN

in unknown title
by unknown authors
"... In PAGE 9: ... Table 9: Financial Categorization of 18 Firms; An Example ... 39 Table 10: Characteristics of ANN and ES .... In PAGE 51: ... The main goal in the expert networks research is to create a synergy by combining the advantages of each basic technology. Table 10... ..."

Table 1. Settings for the ANN Experiment

in A Neural Based Segmentation and Recognition Technique for Handwritten Words
by M. Blumenstein, B. Verma 1998
"... In PAGE 3: ...1. The most successful settings for the ANN can be seen in Table 1. The settings which remained constant through all experiments included: learning rate and momentum, both set to 0.... ..."
Cited by 10

Table 1. Comparison of ANN and HIFAM.

in unknown title
by unknown authors 1997
"... In PAGE 5: ... The results reported for ANN are taken from [8] and represent a set of experiments with various runs and different architectures using the RPROP-algorithm (12 architectures, 3 runs per architecture, at most 3000 epochs; 36 trials per data set). Table 1 shows the results of the HIFAM in comparison to those of the ANN (showing the best result for each data set in boldface). The tree structures and input partitions for each run of a HIFAM test were set manually and chosen by trial and error, the number of trials per data set always being clearly below 36 (2 or 3 different structures, from 2 to 10 fuzzy sets per input, evenly distributed).... ..."
Cited by 2

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2016 The Pennsylvania State University