## On Learning the Derivatives of an Unknown Mapping with Multilayer Feedforward Networks (1989)

Citations: 67 (7 self)

### BibTeX

```bibtex
@MISC{Gallant89onlearning,
  author = {A. Ronald Gallant and Halbert White},
  title  = {On Learning the Derivatives of an Unknown Mapping with Multilayer Feedforward Networks},
  year   = {1989}
}
```

### Abstract

Recently, multiple input, single output, single hidden layer feedforward neural networks have been shown to be capable of approximating a nonlinear map and its partial derivatives. Specifically, neural nets have been shown to be dense in various Sobolev spaces (Hornik, Stinchcombe, and White, 1989). Building upon this result, we show that a net can be trained so that the map and its derivatives are learned. Specifically, we use a result of Gallant (1987b) to show that least squares and similar estimates are strongly consistent in Sobolev norm provided the number of hidden units and the size of the training set increase together. We illustrate these results by an application to the inverse problem of chaotic dynamics: recovery of a nonlinear map from a time series of its iterates. These results extend automatically to nets that embed the single hidden layer feedforward network as a special case.
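For a single hidden layer net of the kind the abstract describes, the input derivative is available in closed form, which is what makes approximation in Sobolev norm (the map together with its derivatives) a meaningful target. A minimal sketch, assuming a logistic squashing function and illustrative weight values not taken from the paper:

```python
import numpy as np

# Single hidden layer, single output net with logistic squashing at the
# hidden layer and no squashing at the output:
#   g_K(x) = sum_j beta_j * psi(gamma_j * x + b_j)
# Its input derivative follows from the chain rule with psi' = psi(1 - psi),
# so a trained net yields an estimate of the target map's derivative for free.

def psi(u):
    return 1.0 / (1.0 + np.exp(-u))

def net(x, beta, gamma, b):
    return float(np.sum(beta * psi(gamma * x + b)))

def net_deriv(x, beta, gamma, b):
    s = psi(gamma * x + b)
    return float(np.sum(beta * gamma * s * (1.0 - s)))  # chain rule

# Illustrative weights (not from the paper).
beta = np.array([0.5, -1.2, 0.8])
gamma = np.array([1.0, 2.0, -0.5])
b = np.array([0.1, -0.3, 0.2])

# Check the closed form against a central finite difference.
x, h = 0.7, 1e-6
fd = (net(x + h, beta, gamma, b) - net(x - h, beta, gamma, b)) / (2 * h)
print(net_deriv(x, beta, gamma, b), fd)  # the two values agree closely
```

Because the derivative is exact rather than numerical, training the net on function values alone already pins down a derivative estimate, which is the object the paper's consistency results concern.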

### Citations

2732 | Learning internal representations by error propagation - Rumelhart, Hinton, et al. - 1986 |

Citation Context: "...feedforward network as a special case. 1. INTRODUCTION Recently, Gallant and White (1988) have demonstrated that multiple input, single output, single hidden layer, feedforward networks (e.g., Rumelhart, Hinton, and Williams, 1986) with a particular choice of a monotone squashing function at the hidden layer and no squashing at the output layer can approximate any square integrable function to any desired accuracy by increasin..."

723 | Cross-validatory choice and assessment of statistical predictions - Stone - 1974 |

217 | Nonlinear statistical models - Gallant - 1987 |

Citation Context: "...For instance, g* may be assumed not to be on the boundary of B or might be assumed to have more derivatives than membership in B would imply. ... in B is called the estimation space and ||g|| the consistency norm (Gallant, 1987). 3. THE SENSE OF THE APPROXIMATION, COMPACTNESS, AND DENSENESS As seen from Theorem 0 the quality of our results is determined by the consistency norm. The stronger is this norm, ..."

165 | Nonlinear signal processing using neural networks: prediction and system modeling - Lapedes, Farber |

159 | Universal approximation of an unknown mapping and its derivatives using multilayer feedforward networks, Neural Networks 3 - Hornik, Stinchcombe, et al. - 1990 |

132 | Semi-Nonparametric Maximum Likelihood Estimators - Gallant, Nychka - 1987 |

105 | A unified theory of estimation and inference for nonlinear dynamic models - Gallant, White - 1988 |

62 | Some asymptotic results for learning in single hidden-layer feedforward network models - White - 1989 |

Citation Context: "...networks have been shown to be capable of approximating a nonlinear map and its partial derivatives. Specifically, neural nets have been shown to be dense in various Sobolev spaces (Hornik, Stinchcombe and White, 1989). Building upon this result, we show that a net can be trained so that the map and its derivatives are learned. Specifically, we use a result of Gallant (1987b) to show that least squares and similar..."

54 | On fitting a recalcitrant series: the pound/dollar exchange rate - Gallant, Hsieh, et al. - 1991 |

52 | There exists a neural network that does not make avoidable mistakes - Gallant, White - 1992 |

43 | Lyapunov exponents from time series - Eckmann, Oliffson-Kamphorst, et al. - 1986 |

25 | A Graduate Course in Probability - Tucker - 1967 |

14 | Identification and consistency in seminonparametric regression - Gallant - 1987 |

10 | An Elasticity Can be Estimated Consistently Without A Priori Knowledge of Functional Form, Econometrica - Elbadawi, Gallant, et al. - 1983 |

7 | Deterministic Chaos: An Introduction, 2nd rev. ed. - Schuster - 1988 |

Citation Context: "...space is chosen so as to contain the mappings that are to be learned. In some applications, notably robotics (Jordan, 1989), demand analysis (Elbadawi, Gallant, and Souza, 1983), and chaotic dynamics (Schuster, 1988), approximation of the mapping will not suffice. Close approximation to both the mapping and the derivatives of the mapping are required in these applications. Hornik, Stinchcombe, and White (1989) h..."

3 | A numerical approach to ergodic problem of dissipative dynamical systems - Shimada, Nagashima - 1979 |

2 | Identification et Convergence en Regression Semi-Nonparametrique, Annales de l'INSEE 59/60 - Gallant - 1985 |

2 | On Asymptotic Normality when the Number of Regressors Increases and the Minimum Eigenvalue of X'X/n Decreases, North Carolina Institute of Statistics Mimeograph Series - Gallant - 1989 |

Citation Context: "...The values of the weights β_j and γ_ij that minimize s_n(g_K) = (1/n) Σ_{t=1}^{n} [x_t − g_K(x_{t−S}, …, x_{t−1})]² were determined using the Gauss-Newton nonlinear least squares algorithm (Gallant, 1989, Ch. 1). We found it helpful to zig-zag by first holding the β_j fixed and iterating on the γ_ij, then holding the γ_ij fixed and iterating on the β_j, and so on a few times before going to the full ..."
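The zig-zag alternation described in this context can be sketched concretely: with the hidden-layer weights held fixed, the step over the output weights is an ordinary linear least squares problem, which is what makes the alternation cheap. The sketch below fits one-step-ahead iterates of the logistic map; it substitutes a plain gradient step for the paper's Gauss-Newton iterations, and the map parameter, network size, and learning rate are illustrative assumptions.

```python
import numpy as np

def psi(u):
    return 1.0 / (1.0 + np.exp(-u))

# One-step-ahead training pairs from a chaotic series (logistic map,
# standing in for the paper's time series of iterates).
a, n = 3.8, 400
xs = np.empty(n + 1)
xs[0] = 0.3
for t in range(n):
    xs[t + 1] = a * xs[t] * (1.0 - xs[t])
x, y = xs[:-1], xs[1:]

K = 8                              # hidden units (illustrative)
gamma = np.linspace(-4.0, 4.0, K)  # hidden-layer weights
b = np.zeros(K)                    # hidden-layer biases
beta = np.zeros(K)                 # output weights

def loss(beta, gamma, b):
    pred = psi(np.outer(x, gamma) + b) @ beta
    return float(np.mean((y - pred) ** 2))

losses = [loss(beta, gamma, b)]
lr = 0.05
for _ in range(20):
    # Hold beta fixed, take one gradient step on gamma and b
    # (a no-op on the first pass, where beta is still zero).
    S = psi(np.outer(x, gamma) + b)
    resid = S @ beta - y
    dS = S * (1.0 - S)             # psi'
    gamma -= lr * (2.0 / n) * (resid[:, None] * dS * x[:, None] * beta).sum(axis=0)
    b -= lr * (2.0 / n) * (resid[:, None] * dS * beta).sum(axis=0)
    # Hold gamma and b fixed, solve for beta exactly: linear least squares.
    Psi = psi(np.outer(x, gamma) + b)
    beta, *_ = np.linalg.lstsq(Psi, y, rcond=None)
    losses.append(loss(beta, gamma, b))
print(losses[0], losses[-1])  # the alternation drives the loss down
```

Each pass ends with an exact solve over the output weights, so the recorded loss cannot exceed the starting loss; in the paper's setting the gradient step would instead be Gauss-Newton iterations on the hidden-layer weights before moving to the full problem.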

1 | Nonlinear Prediction of Chaotic Time Series - Casdagli - 1989 |