### Table 18. Neural network specification for the neural controller

1999

### Table 2: For test function with two inputs, mean (over 50 data samples) and 95% confidence interval for standardized MSE at 225 test locations, and for the temperature and ozone datasets, cross-validated standardized MSE, for the six methods. Method Function with 2 inputs Temp. data Ozone data

2004

Cited by 4

### TABLE I A COMPARISON OF ERROR RATES FOR DIFFERENT FEATURE EXTRACTORS. EACH FEATURE EXTRACTOR WAS TRAINED, OFFLINE, USING THE SAME DATA. RBF = RADIAL BASIS FUNCTION FEATURE EXTRACTOR. CNN = CONVOLUTIONAL NEURAL NETWORK FEATURE EXTRACTOR.

### Table 1. Comparison of the HCMAC neural network with the MHCMAC neural network Models

"... In PAGE 15: ... D. Comparison of HCMAC Neural Network with the MHCMAC Neural Network Table 1 compares the HCMAC neural network with the MHCMAC neural network in terms of memory requirement, topology structure, and input-feature assignment approach. Table 1 shows that the memory requirement of the original HCMAC neural network grows as 2 raised to the power of the ceiling of the logarithm of the input dimensions, whereas the memory requirement of the MHCMAC neural network grows only linearly with the input feature dimensions. Moreover, the learning structure of the self-organizing HCMAC neural network is expanded based on a full binary tree topology, but the MHCMAC neural network is expanded based on an exact binary tree topology. ..."
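The scaling contrast in the snippet can be sketched numerically. This is a minimal illustration, assuming the snippet means HCMAC memory grows as 2^⌈log₂ d⌉ (the leaf count of a full binary tree over d input features) while MHCMAC memory grows linearly in d; the function names are hypothetical, not from the paper.

```python
import math

def hcmac_memory(d: int) -> int:
    # Assumed reading of the snippet: full binary tree over d features,
    # so memory is the next power of two at or above d.
    return 2 ** math.ceil(math.log2(d))

def mhcmac_memory(d: int) -> int:
    # Linear growth in the number of input features.
    return d

for d in (3, 5, 9, 17):
    print(f"d={d}: HCMAC={hcmac_memory(d)}, MHCMAC={mhcmac_memory(d)}")
```

For d = 17 the assumed full-binary-tree cost jumps to 32 cells while the linear variant needs only 17, which matches the snippet's qualitative claim.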

### Table 2 Neural network configurations

"... In PAGE 4: ...utput. A separate neural network was trained for identification of each solvent. The inputs corresponded to Raman spectra of mixtures, and the single output corresponded to a prediction of whether or not the solvent was present in the mixture. The neural network configurations for each solvent are detailed in Table 2. These settings were found through experimentation. ..."
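The setup the snippet describes, one small network per solvent with a spectrum as input and a single presence output, can be sketched as below. This is a hypothetical illustration only: the spectrum length, hidden-layer size, solvent names, and random weights are placeholders, not the configurations from Table 2.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_net(n_bands: int, n_hidden: int) -> dict:
    """Random weights stand in for the trained per-solvent parameters."""
    return {
        "W1": rng.normal(size=(n_bands, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(size=(n_hidden, 1)),
        "b2": np.zeros(1),
    }

def predict_presence(net: dict, spectrum: np.ndarray) -> float:
    """Single sigmoid output: probability that the solvent is present."""
    h = np.tanh(spectrum @ net["W1"] + net["b1"])
    z = h @ net["W2"] + net["b2"]
    return float(1.0 / (1.0 + np.exp(-z)))

# One independent network per solvent, as the snippet describes.
nets = {name: make_net(64, 8) for name in ("acetone", "ethanol", "toluene")}
spectrum = rng.random(64)  # placeholder for a measured Raman spectrum
probs = {name: predict_presence(net, spectrum) for name, net in nets.items()}
```

Training one binary classifier per solvent, rather than one multi-label network, lets each network's architecture be tuned independently, which is consistent with the snippet's remark that the per-solvent settings were found through experimentation.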

Cited by 2

### Table 3: Neural Network Results

"... In PAGE 8: ... This gives back the cost of the solution. Table 3 illustrates the testing results of the five experiments. ..."

### Table 6: Neural network results.

"... In PAGE 5: ... Figure 5: Signal space for neural network. Table 6 shows that the network using the 7-signal characteristic set gave the correct result 93.... ..."

### Table 5: Options for Neural Networks

1998

"... In PAGE 4: ... All but IBM have advanced learning options and employ cross-validation to govern when to stop. Table 5 summarizes these properties. ..."

Cited by 4