## Sample compression, learnability, and the Vapnik-Chervonenkis dimension (1995)

Venue: Machine Learning

Citations: 60 (3 self)

### BibTeX

@ARTICLE{Floyd95samplecompression,
  author  = {Sally Floyd and Manfred Warmuth},
  title   = {Sample compression, learnability, and the {V}apnik-{C}hervonenkis dimension},
  journal = {Machine Learning},
  year    = {1995},
  pages   = {269--304}
}

### Abstract

Within the framework of pac-learning, we explore the learnability of concepts from samples using the paradigm of sample compression schemes. A sample compression scheme of size k for a concept class C ⊆ 2^X consists of a compression function and a reconstruction function. The compression function receives a finite sample set consistent with some concept in C and chooses a subset of k examples as the compression set. The reconstruction function forms a hypothesis on X from a compression set of k examples. For any sample set of a concept in C, the compression set produced by the compression function must, when fed to the reconstruction function, yield a hypothesis consistent with the whole original sample set. We demonstrate that the existence of a sample compression scheme of fixed size for a class C is sufficient to ensure that C is pac-learnable. Previous work has shown that a class is pac-learnable if and only if the Vapnik-Chervonenkis (VC) dimension of the class i...
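As a concrete illustration of the compression/reconstruction pairing described above, here is a minimal sketch (not taken from the paper's text, and the function names are our own) of a size-two scheme for the class of closed intervals on the real line: the compression set is the pair of extreme positive examples, and reconstruction returns the smallest interval containing them.

```python
def compress(sample):
    """Given a list of (point, label) pairs consistent with some interval [a, b],
    return at most two examples: the leftmost and rightmost positive points."""
    positives = [x for x, label in sample if label]
    if not positives:
        return []  # empty compression set encodes the empty concept
    return [(min(positives), True), (max(positives), True)]

def reconstruct(compression_set):
    """Hypothesis on X: membership in the smallest interval spanning the
    compression set (the all-negative hypothesis if the set is empty)."""
    if not compression_set:
        return lambda x: False
    lo = min(x for x, _ in compression_set)
    hi = max(x for x, _ in compression_set)
    return lambda x: lo <= x <= hi

# The reconstructed hypothesis is consistent with the whole original sample:
sample = [(-2.0, False), (0.5, True), (1.0, True), (3.0, True), (4.5, False)]
h = reconstruct(compress(sample))
assert all(h(x) == label for x, label in sample)
```

Consistency holds because every positive point lies between the two retained extremes, while every negative point lies outside the true interval [a, b] and hence outside the reconstructed subinterval; this is exactly the property the abstract requires of a compression scheme.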