CiteSeerX

Results 1 - 10 of 47,598

Table 1. Applications of artificial neural networks for decision making in radiology.

in Decision Aids in Radiology
by Charles E. Kahn, Jr.
"... In PAGE 14: ...As computers have grown increasingly powerful and the modeling software has gained sophistication, ANNs have become increasingly useful for perceptual and decision-making tasks. They have been applied to several areas in radiology (Table 1), and are poised to play an important role in clinical radiological decision making [9]. In many cases, their ability to formulate a diagnosis from clinical data and radiographic findings has equaled or exceeded that of radiologists.... ..."

Table 7: Results of training the neural network on FAST and testing it on FAST - Case 2

in unknown title
by unknown authors
"... In PAGE 14: ... The results of the evaluation of the human expert can be seen in Table 5. After training the neural network with the first half of the Excite dataset and running it on the second half of the FAST dataset, we obtain the results in Table 7. The results of the first 20 runs of the neural network are seen in Table 7. Not all results could be provided due to space considerations.... ..."

Table 9: Results of training the neural network on FAST and testing it on Excite - Case 4

in unknown title
by unknown authors
"... In PAGE 16: ... Case 4: Neural network B trained with the FAST dataset and tested with the Excite dataset: The number of topic shifts and continuations as evaluated by the human expert are given in Table 5. After training the neural network with the first half of the FAST dataset and running it on the second half of the Excite dataset, we obtain the results in Table 9. The results of the first 10 runs of the 50 runs of the neural network are seen in Table 9. Not all results could be provided due to space considerations.... ..."
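The protocol these excerpts describe — train on the first half of one query log, test on the second half of the other, averaged over repeated runs — can be sketched generically. The corpora and the majority-class baseline below are toy stand-ins; the papers' datasets and neural network are not reproduced here:

```python
import random

def cross_corpus_eval(train_corpus, test_corpus, train_fn, n_runs=10, seed=0):
    """Train on the first half of one corpus, test on the second half of
    another, and average accuracy over `n_runs` (re)training runs."""
    random.seed(seed)
    train_half = train_corpus[: len(train_corpus) // 2]
    test_half = test_corpus[len(test_corpus) // 2 :]
    accs = []
    for _ in range(n_runs):
        model = train_fn(train_half)              # e.g. a freshly seeded network
        hits = sum(model(x) == y for x, y in test_half)
        accs.append(hits / len(test_half))
    return sum(accs) / n_runs

# Toy stand-ins for the FAST and Excite query logs: (features, label) pairs
# where the label marks a topic shift (1) or a continuation (0).
fast = [(i, i % 2) for i in range(100)]
excite = [(i, 1 if i % 3 == 0 else 0) for i in range(100)]

def majority_baseline(train_half):
    """Trivial 'model' that always predicts the majority training label."""
    ones = sum(y for _, y in train_half)
    majority = 1 if ones * 2 >= len(train_half) else 0
    return lambda x: majority

print(cross_corpus_eval(fast, excite, majority_baseline))
```

Swapping `majority_baseline` for a real training routine reproduces the excerpts' setup of training on one corpus and evaluating on the other.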

Table 1. Neural Network Configuration and Operating Parameters

in Neural Network Classification of Diesel Spray Images
by S. D. Walters, S. H. Lee, C. Crua, R. J. Howlett
"... In PAGE 5: ....7.0), running on the same PC. The NDA software package is an MLP neural network, implemented in the C language in-house at the University of Brighton. After starting NDA, network architecture and training parameters were entered on the Network Set-up and Training screens, respectively; these parameters are shown in Table 1. The neural network was trained using the training file and a range of different numbers of Hidden Nodes, as indicated in Table 1. In order to retain a degree of detail in the images, thus indicating spray break-up and overall shape, 224 input data points, hence input nodes, were chosen, as described in Section 2.... ..."
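The NDA package itself is an in-house C implementation; as a rough illustration of the configuration described (224 input nodes, a swept hidden-layer size), a one-hidden-layer MLP can be sketched in NumPy. The sweep values, activations, and output size below are assumptions, not taken from the paper:

```python
import numpy as np

def init_mlp(n_in, n_hidden, n_out, seed=0):
    """Randomly initialise a one-hidden-layer MLP (weights and biases)."""
    rng = np.random.default_rng(seed)
    return {
        "W1": rng.normal(0, 0.1, (n_in, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(0, 0.1, (n_hidden, n_out)),
        "b2": np.zeros(n_out),
    }

def forward(params, x):
    """Sigmoid hidden layer, sigmoid output (a common MLP configuration)."""
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    h = sigmoid(x @ params["W1"] + params["b1"])
    return sigmoid(h @ params["W2"] + params["b2"])

# 224 input nodes, as in the paper; the hidden-layer size is swept over
# several candidate values (the exact range is not given in the excerpt).
for n_hidden in (4, 8, 16, 32):          # assumed sweep values
    net = init_mlp(n_in=224, n_hidden=n_hidden, n_out=2)
    y = forward(net, np.zeros(224))      # dummy spray-image feature vector
    print(n_hidden, y.shape)
```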

Table 1. The number of computation steps required by fast neural networks (FNN) for images of sizes 25×25 – 1050×1050 pixels.

in Sub-Image Detection Using Fast Neural Processors and Image Decomposition
by unknown authors

Table 1: Test Objectives in Neural Network Prediction Experiments

in unknown title
by unknown authors 1995
"... In PAGE 4: ... Applications must have a command language user interface. Table 1 lists six test objectives used to generate training and test data. We also show the test oracles each test case is likely to invoke (see the next section for test oracle descriptions).... ..."
Cited by 4

Table 1. Experimental results of single view-specific neural networks

in Pose Invariant Face Recognition
by Fu Jie Huang, Zhihua Zhou, et al. 2000
"... In PAGE 5: ...From Table 1, we can see that if there is an accurate pose estimation process and the test image is fed to the right neural network, the recognition rate is about 97% on average, as shown by the diagonal of the table. However, if the pose estimation is noisy, then the recognition ratio drops very fast.... In PAGE 5: ...75% From Table 2 we can see that we can feed face images with all the poses and get almost the same recognition ratio, around 98%. By comparing the experimental results in Table 2 with those in Table 1, we can see that even without knowing the pose information, the system achieves an average recognition ratio as high as 98.75%.... ..."
Cited by 26
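The excerpt's point — that routing images through view-specific networks by a noisy pose estimate degrades recognition — can be illustrated with a toy simulation. The pose labels, per-pose accuracy, and noise model below are illustrative stand-ins, not the paper's networks:

```python
import random

POSES = ["left", "frontal", "right"]

def make_recognizer(pose, acc=0.97):
    """Hypothetical per-pose recogniser: accurate (~97%, as on the
    diagonal of Table 1) only when the image matches its pose."""
    def recognize(image, true_identity):
        p = acc if image["pose"] == pose else 0.5
        return true_identity if random.random() < p else "unknown"
    return recognize

recognizers = {p: make_recognizer(p) for p in POSES}

def route(image, pose_estimate):
    """Feed the image to the network matching the estimated pose."""
    return recognizers[pose_estimate](image, image["id"])

random.seed(0)
images = [{"pose": random.choice(POSES), "id": "subject"} for _ in range(2000)]

def accuracy(pose_noise, seed=1):
    """Recognition rate when the pose estimate is wrong with the
    given probability (a crude noise model)."""
    random.seed(seed)
    hits = 0
    for img in images:
        est = img["pose"] if random.random() > pose_noise else random.choice(POSES)
        hits += route(img, est) == img["id"]
    return hits / len(images)

print(f"perfect pose estimate: {accuracy(0.0):.2f}")
print(f"noisy pose estimate:   {accuracy(0.5):.2f}")
```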

Table 1. Neural network models for path planning.

in
by Dmitry V. Lebedev, Jochen J. Steil, Helge J. Ritter
"... In PAGE 6: ... Comparative simulation studies and a complexity analysis are presented in Section 4, and, finally, conclusions are discussed in Section 5. 2 Review of neural network models for path planning Table 1 summarises most of the existing neural network models for path planning. They are ordered along two main axes: (i) with respect to the environment type (stationary vs.... In PAGE 7: ... fast planning in a non-stationary domain, we review for further reference and comparison the models in the second part of Table 1. [Table 1 appears here] These models generate scalar potentials over a distributed representation of the configuration space of the robot. Such potentials are an efficient alternative to analytically-described potential fields (Khatib, 1986; Rimon & Koditschek, 1992; Li & Bui, 1998; Wang & Chirikjian, 2000) because of their easy implementation and high performance.... ..."
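The grid-based scalar potentials the review describes can be sketched with the classic wavefront planner: propagate a distance-to-goal potential over the free cells of a discretised configuration space, then descend its gradient. This is a generic sketch, not any specific model from Table 1:

```python
from collections import deque

def wavefront(grid, goal):
    """Propagate a scalar potential (hop distance to the goal) over free
    cells by breadth-first search. `grid` is a list of strings where '#'
    marks an obstacle; unreachable cells get no potential."""
    H, W = len(grid), len(grid[0])
    pot = {goal: 0}
    q = deque([goal])
    while q:
        r, c = q.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < H and 0 <= nc < W and grid[nr][nc] != '#' \
                    and (nr, nc) not in pot:
                pot[(nr, nc)] = pot[(r, c)] + 1
                q.append((nr, nc))
    return pot

def descend(pot, start):
    """Follow the steepest descent of the potential until it reaches 0
    (the goal); the potential has no local minima, so this cannot stall."""
    path = [start]
    while pot[path[-1]] > 0:
        r, c = path[-1]
        path.append(min(((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)),
                        key=lambda n: pot.get(n, float('inf'))))
    return path

grid = ["....",
        ".##.",
        "...."]
pot = wavefront(grid, goal=(0, 3))
print(descend(pot, start=(2, 0)))
```

The absence of spurious local minima is exactly the advantage such distributed potentials hold over analytic potential fields.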

Table 1 lists the various neural networks that are used to implement the above-mentioned model of child language development, together with a specification of the typical input and output for each process that may be involved.

in Neural Networks and Child Language Development: Towards a `Conglomerate' Neural Network Simulation Architecture
by Syed Sibte Raza Abidi
"... In PAGE 3: ... Table 1: The various neural networks implementing the above-mentioned model of child language development. The table legend is IP = Input Layer, OP = Output Layer, HI = Hidden Layer, INT = Intermediate Layer.... ..."


Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University