### Table 1: Feature selection for the neural network problem, varying noise and number of hidden units.

### Table 3: Performance of the CMixed criterion with L1-norm regularisation for three 15-frame sequences.

2004

"... In PAGE 7: ... From left to right: ground truth, standard resolution, super resolution and bilinear interpolation. Table 3 shows that the super-resolution approach gives a solution slightly closer to the standard-resolution reconstruction than a simple bilinear interpolation approach. Partial views in Figure 4 show good localisation of buildings, even if some errors occur in the form of spurious junctions between buildings, which are less flagrant in the standard-resolution sequence.... ..."

Cited by 2
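As a point of reference for the baseline named in Table 3, a minimal sketch of bilinear interpolation (this is the generic technique, not the paper's super-resolution method):

```python
# Sketch (not the paper's method): bilinear interpolation, the baseline the
# super-resolution reconstruction is compared against in Table 3.
def bilinear(img, y, x):
    """Sample a 2-D grid `img` (list of rows) at fractional coords (y, x)."""
    y0, x0 = int(y), int(x)
    y1 = min(y0 + 1, len(img) - 1)
    x1 = min(x0 + 1, len(img[0]) - 1)
    dy, dx = y - y0, x - x0
    top = img[y0][x0] * (1 - dx) + img[y0][x1] * dx
    bot = img[y1][x0] * (1 - dx) + img[y1][x1] * dx
    return top * (1 - dy) + bot * dy

img = [[0.0, 1.0], [2.0, 3.0]]
print(bilinear(img, 0.5, 0.5))  # → 1.5
```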

### Table 1: Test-set sensitivity (Se) and specificity (Sp) of the neural network model in classifying cardiac beats, for various training algorithms and numbers of units in the hidden layer

"... In PAGE 9: ... Table 1 displays the experimental results obtained from the use of various training algorithms and different numbers of units in the hidden layer of the neural network. The Bayesian regularisation method, described in the previous section, was found to be the most effective.... ..."
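The Se and Sp columns in Table 1 follow the standard definitions for a binary classifier; a minimal sketch (the function name and example labels are illustrative, not from the paper):

```python
# Sketch (not from the paper): sensitivity (Se) and specificity (Sp)
# of a binary beat classifier, from true and predicted 0/1 labels.
def se_sp(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    se = tp / (tp + fn)  # sensitivity: fraction of abnormal beats detected
    sp = tn / (tn + fp)  # specificity: fraction of normal beats kept
    return se, sp

print(se_sp([1, 1, 0, 0, 1, 0], [1, 0, 0, 0, 1, 1]))  # → (0.666..., 0.666...)
```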

### Table 1: Comparison of neural networks and the OTFM.

in Comparison of Neural Networks and an Optical Thin-Film Multilayer Model for Connectionist Learning

"... In PAGE 4: ... Table 1 illustrates a comparison between this framework and the OTFM. As shown in Table1 , the corresponding components of the OTFM and the neural network models are different in various ways. The OTFM consists of many thin-film layers as its basic processing units.... ..."

### Table 2: For the test function with two inputs, the mean (over 50 data samples) and 95% confidence interval of the standardized MSE at 225 test locations; for the temperature and ozone datasets, the cross-validated standardized MSE, for the six methods. Columns: Method | Function with 2 inputs | Temp. data | Ozone data

2004

Cited by 4
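A minimal sketch of the metric in Table 2, under the common assumption that "standardized MSE" means the MSE divided by the variance of the test targets (so 1.0 corresponds to predicting the mean):

```python
# Sketch (assumption: standardized MSE = MSE / variance of test targets,
# so a score of 1.0 matches the trivial predict-the-mean baseline).
def standardized_mse(y_true, y_pred):
    n = len(y_true)
    mean = sum(y_true) / n
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n
    var = sum((t - mean) ** 2 for t in y_true) / n
    return mse / var

# Predicting the mean of the targets gives a standardized MSE of 1.0.
y = [1.0, 2.0, 3.0, 4.0]
print(standardized_mse(y, [2.5] * 4))  # → 1.0
```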

### Table 6: Neural Network Prediction Results

1996

"... In PAGE 7: ...network. Fault Severity 1 through 3 needed a single hidden unit, while Severity 4 used five. 3.4 EXPERIMENTAL RESULTS The results from training all four networks are listed in Table 6. The second column shows how well each network predicted individual fault severities.... ..."

Cited by 2

### Table 14: Learn schedule for the back-propagation neural network (learn counts 10000, 30000, 50000)

"... In PAGE 19: ... We reproduced this experiment with 14 hidden units, without pruning. We used the learn schedule displayed in Table 14 to train a network with 14 units in the hidden layer. Epoch size 1 was selected for this experiment.... ..."
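A learn schedule of this kind typically switches the learning rate once training reaches each learn-count threshold; a minimal sketch (the rate values here are hypothetical, only the thresholds 10000/30000/50000 come from Table 14):

```python
# Sketch (hypothetical rates): a back-propagation "learn schedule" that
# changes the learning rate at fixed learn counts, e.g. 10000, 30000
# and 50000 pattern presentations as in Table 14.
def rate_for(count, schedule):
    """Return the learning rate for the current learn count.

    `schedule` maps a learn-count threshold to the rate used once
    training has reached that many presentations."""
    rate = schedule[0]
    for threshold, r in sorted(schedule.items()):
        if count >= threshold:
            rate = r
    return rate

schedule = {0: 0.5, 10000: 0.2, 30000: 0.1, 50000: 0.05}
print([rate_for(c, schedule) for c in (0, 9999, 10000, 30000, 60000)])
# → [0.5, 0.5, 0.2, 0.1, 0.05]
```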