## Bounds on the Sample Complexity of Bayesian Learning Using Information Theory and the VC Dimension (1994)

### Download From

IEEE

### Download Links

- [ftp.cse.ucsc.edu]
- [www.research.att.com]
- [classes.cec.wustl.edu]
- DBLP

### Other Repositories/Bibliography

Venue: Machine Learning

Citations: 108 (12 self)

### BibTeX

```
@INPROCEEDINGS{Haussler94boundson,
  author    = {David Haussler and Michael Kearns and Robert Schapire},
  title     = {Bounds on the Sample Complexity of Bayesian Learning Using Information Theory and the VC Dimension},
  booktitle = {Machine Learning},
  year      = {1994},
  pages     = {61--74},
  publisher = {Morgan Kaufmann}
}
```


### Abstract

In this paper we study a Bayesian or average-case model of concept learning with a twofold goal: to provide more precise characterizations of learning curve (sample complexity) behavior that depend on properties of both the prior distribution over concepts and the sequence of instances seen by the learner, and to smoothly unite in a common framework the popular statistical physics and VC dimension theories of learning curves. To achieve this, we undertake a systematic investigation and comparison of two fundamental quantities in learning and information theory: the probability of an incorrect prediction for an optimal learning algorithm, and the Shannon information gain. This study leads to a new understanding of the sample complexity of learning in several existing models.

**1 Introduction.** Consider a simple concept learning model in which the learner attempts to infer an unknown target concept f, chosen from a known concept class F of {0, 1}-valued functions over an instance space X...
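To make the abstract's two central quantities concrete, here is a minimal sketch in Python. The concept class (threshold functions), the uniform prior, and all variable names are illustrative assumptions of this sketch, not the paper's construction. Per trial it reports the Bayes-optimal prediction error, min(p, 1 - p) where p is the posterior predictive probability of label 1, and the Shannon information gain, the negative log-probability the posterior assigned to the label actually observed.

```python
import numpy as np

# Toy setup (an assumption for illustration): instance space X = {0,...,7},
# concept class F of threshold functions f_t(x) = 1 iff x >= t, uniform prior.
rng = np.random.default_rng(0)

n_points = 8
concepts = np.array([[1 if x >= t else 0 for x in range(n_points)]
                     for t in range(n_points + 1)])
prior = np.full(len(concepts), 1.0 / len(concepts))

# Target concept f is drawn from the prior, as in the Bayesian model.
target = concepts[rng.integers(len(concepts))]

posterior = prior.copy()
for m in range(1, 6):
    x = rng.integers(n_points)                 # next instance
    p1 = posterior @ concepts[:, x]            # Pr[f(x) = 1 | examples so far]

    # Bayes-optimal predictor outputs the more probable label;
    # its probability of error on this instance is min(p1, 1 - p1).
    bayes_err = min(p1, 1.0 - p1)

    y = target[x]
    # Shannon information gain of this example, in bits: -log2 of the
    # posterior probability of the observed label.
    info_gain = -np.log2(p1 if y == 1 else 1.0 - p1)

    # Noise-free Bayesian update: keep only consistent concepts, renormalize.
    posterior = posterior * (concepts[:, x] == y)
    posterior /= posterior.sum()

    print(f"trial {m}: Bayes error {bayes_err:.3f}, info gain {info_gain:.3f} bits")
```

In this noise-free setting the posterior is just the prior restricted to the version space, so each example's information gain equals the log-shrinkage of the consistent prior mass; the paper's analysis compares expectations of exactly these kinds of per-trial quantities.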