## On the Relationship Between Generalization Error, Hypothesis Complexity, and Sample Complexity for Radial Basis Functions (1996)

Venue: Neural Computation

Citations: 47 (6 self)

### BibTeX

```bibtex
@article{Niyogi96onthe,
  author  = {Partha Niyogi and Federico Girosi},
  title   = {On the Relationship Between Generalization Error, Hypothesis Complexity, and Sample Complexity for Radial Basis Functions},
  journal = {Neural Computation},
  year    = {1996},
  volume  = {8},
  pages   = {819--842}
}
```

### Abstract

Feedforward networks are a class of regression techniques that can be used to learn to perform some task from a set of examples. The question of generalization of network performance from a finite training set to unseen data is clearly of crucial importance. In this article we first show that the generalization error can be decomposed into two terms: the approximation error, due to the insufficient representational capacity of a finite-sized network, and the estimation error, due to insufficient information about the target function because of the finite number of samples. We then consider the problem of approximating functions belonging to certain Sobolev spaces with Gaussian Radial Basis Functions. Using the above-mentioned decomposition we bound the generalization error in terms of the number of basis functions and the number of examples. While the bound that we derive is specific to Radial Basis Functions, a number of observations deriving from it apply to any approximation t...
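The decomposition the abstract describes can be illustrated with a minimal Gaussian RBF regression sketch. This is an assumption-laden toy, not the paper's construction: the `sin` target, the noise level, and all parameter values are illustrative. The number of centers `n` plays the role of hypothesis complexity (driving the approximation error), while the number of samples `l` drives the estimation error.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_design(x, centers, sigma):
    """l x n design matrix of Gaussian radial basis evaluations."""
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * sigma ** 2))

# Hypothetical smooth target; the paper works with Sobolev-space functions.
f = lambda x: np.sin(2 * np.pi * x)

l, n, sigma = 200, 10, 0.1          # l samples, n basis functions
x_train = rng.uniform(0.0, 1.0, l)
y_train = f(x_train) + 0.1 * rng.standard_normal(l)

# Fix the centers on a grid and fit the coefficients by least squares.
centers = np.linspace(0.0, 1.0, n)
Phi = gaussian_design(x_train, centers, sigma)
coeffs, *_ = np.linalg.lstsq(Phi, y_train, rcond=None)

# Estimate the generalization (L2) error against the noiseless target.
x_test = np.linspace(0.0, 1.0, 500)
y_hat = gaussian_design(x_test, centers, sigma) @ coeffs
gen_err = np.mean((y_hat - f(x_test)) ** 2)
```

Increasing `n` with `l` fixed shrinks the approximation error but eventually inflates the estimation error (the design matrix becomes ill-conditioned and the fit chases noise); the paper's bound makes this trade-off explicit for Gaussian RBFs.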