## VC Dimension of Neural Networks (1998)

Venue: Neural Networks and Machine Learning

Citations: 20 (3 self)

### BibTeX

@INPROCEEDINGS{Sontag98vcdimension,
  author    = {Eduardo D. Sontag},
  title     = {VC Dimension of Neural Networks},
  booktitle = {Neural Networks and Machine Learning},
  year      = {1998},
  pages     = {69--95},
  publisher = {Springer}
}

### Abstract

This paper presents a brief introduction to the Vapnik-Chervonenkis (VC) dimension, a quantity that characterizes the difficulty of distribution-independent learning. The paper establishes various elementary results and discusses how to estimate the VC dimension in several examples of interest in neural network theory.

1 Introduction. In this expository paper, we present a brief introduction to the subject of computing and estimating the VC dimension of neural network architectures. We provide precise definitions and prove several basic results, discussing also how one estimates the VC dimension in several examples of interest in neural network theory. We do not address the learning and estimation-theoretic applications of the VC dimension. (Roughly, the VC dimension is a number that helps to quantify the difficulty of learning from examples. The sample complexity, that is, the number of "learning instances" that one must be exposed to in order to be reasonably certain to derive accurate p...
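The notion at the center of the abstract, the VC dimension as the size of the largest point set a hypothesis class can shatter (realize every binary labeling of), can be illustrated with a brute-force check. The sketch below is not from the paper; it uses the standard textbook example of one-dimensional threshold classifiers, whose VC dimension is 1, and all function names are illustrative.

```python
from itertools import product

def threshold_labels(points, t):
    """Label each point 1 if it lies at or above threshold t, else 0."""
    return tuple(1 if x >= t else 0 for x in points)

def shatters(points):
    """True iff 1-D threshold classifiers realize every labeling of `points`."""
    xs = sorted(points)
    # Candidate thresholds: below all points, between each pair of
    # neighbors, and above all points; these cover all distinct labelings.
    candidates = [xs[0] - 1.0]
    candidates += [(a + b) / 2 for a, b in zip(xs, xs[1:])]
    candidates += [xs[-1] + 1.0]
    achievable = {threshold_labels(points, t) for t in candidates}
    return all(lab in achievable for lab in product((0, 1), repeat=len(points)))

# Thresholds shatter any single point...
print(shatters([0.0]))       # True
# ...but no pair: for x1 < x2 the labeling (1, 0) is unrealizable,
# so the VC dimension of this class is 1.
print(shatters([0.0, 1.0]))  # False
```

The same exhaustive pattern (enumerate all 2^n labelings, test each against the hypothesis class) extends to richer classes, at exponential cost; the point of the VC-dimension bounds surveyed in the paper is precisely to avoid such enumeration.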