Results 1 - 10 of 275

Table 2. Neural networks versus self-organising modelling.

in Recent Developments of Self-Organising Modeling in Prediction and Analysis of Stock Market
by A.G. Ivakhnenko, J.-A. Müller 1997
Cited by 3

Table 1 lists the main self-organising characteristics, classified by domains, for the 412

in unknown title
by unknown authors
"... In PAGE 10: ... Table1 : Application Domains and Main Self-Organising Characteristics ... ..."

Table 4. Contingency tables of the feed-forward neural network and the self-organising feature map for the synthetic radiograph

in Detection of Bone Tumours in Radiographic Images using Neural Networks
by Michael Egmont-Petersen, Erich Pelikan
"... In PAGE 9: ... It obtained an overall correctness of 0.8983 based on all pixels in the synthetic image (see Table4 ). Also in this image, tissue, pathologic and, to a lesser extent, healthy bone are difficult to discern.... ..."

Table 2: Learn schedule for self-organising feature map. Learn counts: 12300, 24600, 36900, 49200.

in Artificial Neural Networks and Statistical Approaches to . . .
by Rudolf T. Suurmond, Erik Bergkvist
"... In PAGE 8: ... The learn count determines which column of the learn schedule is active. In Table2 , the learning rate is 0.06 for the first 12 300 learning iterations, 0.... In PAGE 8: ... Decreasing the learning rate after a number of learning iterations usually gives improves the results. In Table2 the total number of learning iterations is 49200. NeuralWorks is a versatile package that supports many different neural network paradigms.... In PAGE 11: ...2 Experiments Simulations were done for three different numbers of units in the Kohonen layer. The networks were trained for 49200 iterations each, according to the learn schedule given in Table2 . Experiments were also carried out with a constant learning rate of 0.... In PAGE 11: ...Table 3: Performance of the SOM network with learn schedule given in Table2 . The 95% confidence intervals for the classification rates (lower bound, average, upper bound, respectively are listed in boldface).... In PAGE 28: ...Appendix Table2 0: Classification matrices of the best SOM net at a constant learning rate. Classifier apos;s categories (train set) Ground truth categories C1 C2 C3 C4 C5 C6 C7 C8 Total C1 157 9 0 1 0 0 0 0 167 C2 1 282 0 2 0 0 0 0 285 C3 0 0 126 0 0 2 0 0 128 C4 2 1 0 393 6 0 0 0 402 C5 0 0 2 5 95 0 0 0 102 C6 0 0 1 0 0 277 12 6 296 C7 0 0 0 0 0 63 90 0 153 C8 0 0 0 0 0 5 0 102 107 Total 160 292 129 401 101 347 102 108 1640 Classifier apos;s categories (test set) Ground truth categories C1 C2 C3 C4 C5 C6 C7 C8 Total C1 78 5 0 0 0 0 0 0 83 C2 1 141 0 0 0 0 0 0 142 C3 0 0 63 0 0 1 0 0 64 C4 3 2 0 194 1 0 0 0 200 C5 0 3 0 0 49 0 0 0 52 C6 0 0 0 0 1 127 20 0 148 C7 0 0 0 0 0 29 48 0 77 C8 0 0 0 0 0 5 0 49 54 Total 82 151 63 194 51 162 68 49 820 Table 21: Classification matrices of the best performing LVQ network.... In PAGE 28: ...Appendix Table 20: Classification matrices of the best SOM net at a constant learning rate. Classifier apos;s categories (train set) Ground truth categories C1 C2 C3 C4 C5 C6 C7 C8 Total C1 157 9 0 1 0 0 0 0 167 C2 1 282 0 2 0 0 0 0 285 C3 0 0 126 0 0 2 0 0 128 C4 2 1 0 393 6 0 0 0 402 C5 0 0 2 5 95 0 0 0 102 C6 0 0 1 0 0 277 12 6 296 C7 0 0 0 0 0 63 90 0 153 C8 0 0 0 0 0 5 0 102 107 Total 160 292 129 401 101 347 102 108 1640 Classifier apos;s categories (test set) Ground truth categories C1 C2 C3 C4 C5 C6 C7 C8 Total C1 78 5 0 0 0 0 0 0 83 C2 1 141 0 0 0 0 0 0 142 C3 0 0 63 0 0 1 0 0 64 C4 3 2 0 194 1 0 0 0 200 C5 0 3 0 0 49 0 0 0 52 C6 0 0 0 0 1 127 20 0 148 C7 0 0 0 0 0 29 48 0 77 C8 0 0 0 0 0 5 0 49 54 Total 82 151 63 194 51 162 68 49 820 Table2 1: Classification matrices of the best performing LVQ network. Ground truth categories Classifier apos;s categories (train set) C1 C2 C3 C4 C5 C6 C7 C8 Total C1 163 4 0 0 0 0 0 0 167 C2 2 283 0 0 0 0 0 0 285 C3 0 0 127 0 1 0 0 0 128 C4 1 2 0 390 9 0 0 0 402 C5 0 0 0 0 102 0 0 0 102 C6 0 0 1 0 1 266 7 21 296 C7 0 0 0 0 1 69 83 0 153 C8 0 0 0 0 0 0 0 107 107 Total 166 289 128 390 114 335 90 128 1640 Ground truth categories Classifier apos;s categories (test set) C1 C2 C3 C4 C5 C6 C7 C8 Total C1 81 2 0 0 0 0 0 0 83 C2 1 139 0 0 2 0 0 0 142 C3 0 0 64 0 0 0 0 0 64 C4 2 3 0 189 3 0 0 3 200 C5 0 1 0 0 51 0 0 0 52 C6 0 0 0 0 2 119 17 10 148 C7 0 0 0 0 0 33 44 0 77 C8 0 0 0 0 0 1 0 53 54 Total 84 145 64 189 58 153 61 66... In PAGE 29: ... Table2 2: Classification matrices of the best fuzzy ARTMAP. 
Classifier apos;s categories (train set) Ground truth categories C1 C2 C3 C4 C5 C6 C7 C8 Total C1 167 0 0 0 0 0 0 0 167 C2 0 285 0 0 0 0 0 0 285 C3 0 0 128 0 0 0 0 0 128 C4 0 0 0 402 0 0 0 0 402 C5 0 0 0 0 102 0 0 0 102 C6 0 0 0 0 0 291 3 2 296 C7 0 0 0 0 0 5 148 0 153 C8 0 0 0 0 0 0 0 107 107 Total 167 285 128 402 102 296 151 109 1640 Classifier apos;s categories (test set) Ground truth categories C1 C2 C3 C4 C5 C6 C7 C8 Total C1 79 4 0 0 0 0 0 0 83 C2 3 133 5 0 1 0 0 0 142 C3 0 0 63 0 0 1 0 0 64 C4 2 2 0 194 1 0 0 1 200 C5 0 4 0 0 48 0 0 0 52 C6 0 0 0 0 1 99 44 4 148 C7 0 0 0 0 0 24 53 0 77 C8 0 0 0 0 0 3 0 51 54 Total 84 143 68 194 51 127 97 56 820 Table 23: Classification matrices of the best back-propagation net.... In PAGE 29: ...Table 22: Classification matrices of the best fuzzy ARTMAP. Classifier apos;s categories (train set) Ground truth categories C1 C2 C3 C4 C5 C6 C7 C8 Total C1 167 0 0 0 0 0 0 0 167 C2 0 285 0 0 0 0 0 0 285 C3 0 0 128 0 0 0 0 0 128 C4 0 0 0 402 0 0 0 0 402 C5 0 0 0 0 102 0 0 0 102 C6 0 0 0 0 0 291 3 2 296 C7 0 0 0 0 0 5 148 0 153 C8 0 0 0 0 0 0 0 107 107 Total 167 285 128 402 102 296 151 109 1640 Classifier apos;s categories (test set) Ground truth categories C1 C2 C3 C4 C5 C6 C7 C8 Total C1 79 4 0 0 0 0 0 0 83 C2 3 133 5 0 1 0 0 0 142 C3 0 0 63 0 0 1 0 0 64 C4 2 2 0 194 1 0 0 1 200 C5 0 4 0 0 48 0 0 0 52 C6 0 0 0 0 1 99 44 4 148 C7 0 0 0 0 0 24 53 0 77 C8 0 0 0 0 0 3 0 51 54 Total 84 143 68 194 51 127 97 56 820 Table2 3: Classification matrices of the best back-propagation net. Classifier apos;s categories (train set) Ground truth categories C1 C2 C3 C4 C5 C6 C7 C8 C1 158 7 0 2 0 0 0 0 167 C2 2 282 0 1 0 0 0 0 285 C3 0 0 127 0 0 1 0 0 128 C4 3 0 0 392 7 0 0 0 402 C5 0 0 2 5 95 0 0 0 102 C6 0 0 1 0 0 259 24 12 296 C7 0 0 0 0 0 60 93 0 153 C8 0 0 0 0 0 4 0 103 107 Total 163 289 130 400 102 324 117 115 1640 Classifier apos;s categories (test set) Ground truth categories C1 C2 C3 C4 C5 C6 C7 C8 C1 77 4 0 2 0 0 0 0 83 C2 1 139 0 0 2 0 0 0 142 C3 0 0 64 0 0 0 0 0 64 C4 1 2 0 192 2 0 0 3 200 C5 0 2 0 0 50 0 0 0 52 C6 0 0 0 0 1 113 31 3 148 C7 0 0 0 0 0 29 48 0 77 C8 0 0 0 0 0 1 0 53 54 Total 79 147 64 194 55 143 79 59... In PAGE 30: ... Table2 4: Classification matrices of the best radial basis function net. Classifier apos;s categories (train set) Ground truth categories C1 C2 C3 C4 C5 C6 C7 C8 C1 164 3 0 0 0 0 0 0 167 C2 2 282 0 1 0 0 0 0 285 C3 0 0 127 0 0 0 1 0 128 C4 0 0 0 395 7 0 0 0 402 C5 0 0 0 5 97 0 0 0 102 C6 0 0 1 0 0 269 17 9 296 C7 0 0 0 0 0 59 94 0 153 C8 0 0 0 0 0 8 0 99 107 Total 166 285 128 401 104 336 112 108 1640 Classifier apos;s categories (test set) Ground truth categories C1 C2 C3 C4 C5 C6 C7 C8 C1 81 2 0 0 0 0 0 0 83 C2 0 137 5 0 0 0 0 0 142 C3 0 0 64 0 0 0 0 0 64 C4 1 3 0 196 0 0 0 0 200 C5 0 3 0 0 49 0 0 0 52 C6 0 0 0 0 2 121 25 0 148 C7 0 0 0 0 0 29 48 0 77 C8 0 0 0 0 0 3 0 51 54 Total 82 145 69 196 51 153 73 51 820 Table 25: Classification matrices of a classifier based on linear discriminant analysis.... In PAGE 30: ...Table 24: Classification matrices of the best radial basis function net. 
Classifier apos;s categories (train set) Ground truth categories C1 C2 C3 C4 C5 C6 C7 C8 C1 164 3 0 0 0 0 0 0 167 C2 2 282 0 1 0 0 0 0 285 C3 0 0 127 0 0 0 1 0 128 C4 0 0 0 395 7 0 0 0 402 C5 0 0 0 5 97 0 0 0 102 C6 0 0 1 0 0 269 17 9 296 C7 0 0 0 0 0 59 94 0 153 C8 0 0 0 0 0 8 0 99 107 Total 166 285 128 401 104 336 112 108 1640 Classifier apos;s categories (test set) Ground truth categories C1 C2 C3 C4 C5 C6 C7 C8 C1 81 2 0 0 0 0 0 0 83 C2 0 137 5 0 0 0 0 0 142 C3 0 0 64 0 0 0 0 0 64 C4 1 3 0 196 0 0 0 0 200 C5 0 3 0 0 49 0 0 0 52 C6 0 0 0 0 2 121 25 0 148 C7 0 0 0 0 0 29 48 0 77 C8 0 0 0 0 0 3 0 51 54 Total 82 145 69 196 51 153 73 51 820 Table2 5: Classification matrices of a classifier based on linear discriminant analysis. Ground truth categories Classifier apos;s categories (train set) C1 C2 C3 C4 C5 C6 C7 C8 Total C1 163 3 0 1 0 0 0 0 167 C2 2 280 0 0 3 0 0 0 285 C3 0 0 123 0 2 3 0 0 128 C4 7 2 0 381 12 0 0 0 402 C5 0 1 1 0 100 0 0 0 102 C6 0 0 0 0 1 247 27 21 296 C7 0 0 0 0 0 60 93 0 153 C8 0 0 0 0 0 0 0 107 107 Total 172 286 124 382 118 310 120 128 1640 Ground truth categories Classifier apos;s categories (test set) C1 C2 C3 C4 C5 C6 C7 C8 Total C1 81 2 0 0 0 0 0 0 83 C2 2 138 0 0 2 0 0 0 142 C3 0 1 62 0 0 1 0 0 64 C4 4 3 0 183 9 0 0 1 200 C5 0 2 0 0 50 0 0 0 52 C6 0 0 0 0 2 100 36 10 148 C7 0 0 0 0 0 29 48 0 77 C8 0 0 0 0 0 0 0 54 54 Total 87 146 62 183 63 130 84 65... In PAGE 31: ... Table2 6: Classification matrices of the K-nearest neighbor method. Ground truth categories Classifier apos;s categories (train set) C1 C2 C3 C4 C5 C6 C7 C8 Total C1 162 4 0 1 0 0 0 0 167 C2 0 283 0 0 1 0 0 0 285 C3 0 0 127 0 0 0 0 0 128 C4 3 3 0 388 7 0 0 0 402 C5 0 1 1 0 98 0 0 0 102 C6 0 0 0 0 0 259 16 20 296 C7 0 0 0 0 0 52 101 0 153 C8 0 0 0 0 0 1 0 106 107 Total 165 291 128 389 106 312 117 126 1640 Ground truth categories Classifier apos;s categories (test set) C1 C2 C3 C4 C5 C6 C7 C8 Total C1 81 2 0 0 0 0 0 0 83 C2 0 141 0 0 1 0 0 0 142 C3 0 1 62 0 0 1 0 0 64 C4 1 2 0 192 2 0 0 0 200 C5 0 2 0 0 50 0 0 0 52 C6 0 0 0 0 1 109 29 9 148 C7 0 0 0 0 0 29 48 0 77 C8 0 0 0 0 0 1 0 53 54 Total 82 148 62 192 54 140 77 62... ..."
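
The learn schedule described in the snippet above is a step schedule: the learning rate is held constant within a block of iterations and lowered at each listed learn count (12300, 24600, 36900, 49200). A minimal Python sketch of such a schedule, assuming the rate starts at 0.06 as stated and, purely for illustration, is halved at each step (the later rates are truncated in the snippet):

    # Step learning-rate schedule of the kind described for the SOM run.
    # Only the initial rate (0.06) and the block boundaries come from the
    # snippet; the later rates are illustrative assumptions.
    LEARN_COUNTS = [12300, 24600, 36900, 49200]   # end of each learning block
    LEARN_RATES  = [0.06, 0.03, 0.015, 0.0075]    # rate used within each block

    def learning_rate(iteration):
        """Return the learning rate in effect at a given training iteration."""
        for boundary, rate in zip(LEARN_COUNTS, LEARN_RATES):
            if iteration < boundary:
                return rate
        return LEARN_RATES[-1]  # keep the final rate after the last boundary

    # Example: rates in effect at a few points during the 49200-iteration run.
    for it in (0, 12299, 12300, 30000, 49199):
        print(it, learning_rate(it))

Letting the rate decay this way allows the map to organise coarsely early in training and fine-tune its weights later, which is consistent with the snippet's remark that decreasing the learning rate usually improves the results.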

Table 1. Performance parameters for 13 individual models self-organised by KnowledgeMiner

in Modelling and Prediction of Toxicity of Environmental Pollutants
by Frank Lemke, Johann-Adolf Müller, Emilio Benfenati

Table 2: Effectiveness of Post-Processing Normalisation on Self Organising Maps

in Equity trend prediction with neural networks
by R. Halliday

Table 6.6: Response of 3555 network with q and r values of 1, 4; 16, 1.0; 16, 1.0; 16, 1.0. Trained in the self-organising manner.

in Hand Posture Recognition with the Neocognitron Network
by D. S. Banarse

Table 2: The Relationship Between Biological Immune Features and Artificial Immune Algorithms
Human Immune Features and the corresponding Artificial Immune Algorithms/Concepts:
  Distributed: Idiotypic Immune Network, Multi-Agent Systems, Negative Selection
  Multi-layered: Multi-Agent Systems, Co-Stimulation
  Self-Organised: Gene Library Evolution, Clonal Selection, Negative Selection, Local Sensitivity by Cytokine
  Lightweight: Memory Cells, Imperfect Detection, Dynamic Cell Turnover
  Diverse: MHC (Permutation Mask)

in Immune System Approaches to Intrusion Detection
by Jungwon Kim, Peter J. Bentley, Uwe Aickelin, Julie Greensmith, Gianni Tedesco, Jamie Twycross 2007
"... In PAGE 40: ...n section 2.2, we listed six properties of the immune system that contribute to an effective IDS. The major part of this article has provided detailed overviews of systems proposed and implemented, containing one or more immune-inspired algorithms or concepts. Table2 summarises the artificial immune algorithms and concepts that the reviewed AISs have employed. It also shows the corresponding biological immune ... ..."
Cited by 1
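
Negative selection, which appears in several rows of the table above, is one of the most widely used of these immune-inspired mechanisms. The cited review gives no code; the following is only a toy Python sketch of the general idea (candidate detectors are generated at random, discarded if they match any "self" pattern, and the survivors are used to flag non-self data). The r-contiguous-bits matching rule and every string below are illustrative assumptions, not taken from the reviewed systems:

    import random

    R = 4  # number of contiguous matching bits required for a "match"

    def matches(detector, sample, r=R):
        """r-contiguous-bits rule: detector and sample agree on r adjacent positions."""
        return any(detector[i:i + r] == sample[i:i + r]
                   for i in range(len(sample) - r + 1))

    def generate_detectors(self_set, n_detectors, length=12):
        """Keep only random candidates that match no self string (censoring phase)."""
        detectors = []
        while len(detectors) < n_detectors:
            candidate = "".join(random.choice("01") for _ in range(length))
            if not any(matches(candidate, s) for s in self_set):
                detectors.append(candidate)
        return detectors

    self_set = ["010101010101", "001100110011"]   # "normal" patterns (made up)
    detectors = generate_detectors(self_set, n_detectors=20)

    probe = "111100001111"                         # unseen pattern to classify
    anomalous = any(matches(d, probe) for d in detectors)
    print("anomalous" if anomalous else "looks like self")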

Table 1: Example for Judgement of Adaptability
The evaluation of the business process dimension is accomplished by modelling example processes in an enterprise system. The evaluation is primarily based on the patterns knowledge and customising. For example, the assessment of customising checks the possibilities for adjusting a system manually so that it supports a given (pre-defined) process. The scale indicates the effort required to adjust the system to support the process: it starts at zero, for no possibility to customise, and ends at completely automated, indicating an implemented self-organisation of the system.

in MANAGING CHANGE – DETERMINING THE ADAPTABILITY OF INFORMATION SYSTEMS
by unknown authors