## Efficient noise-tolerant learning from statistical queries (1998)

### Download Links

- www.cs.iastate.edu
- www.cis.upenn.edu
- classes.cec.wustl.edu
- DBLP

Venue: Journal of the ACM

Citations: 288 (6 self)

### BibTeX

```bibtex
@ARTICLE{Kearns98efficientnoise-tolerant,
  author    = {Michael Kearns},
  title     = {Efficient noise-tolerant learning from statistical queries},
  journal   = {Journal of the ACM},
  year      = {1998},
  pages     = {392--401},
  publisher = {ACM Press}
}
```

### Abstract

In this paper, we study the problem of learning in the presence of classification noise in the probabilistic learning model of Valiant and its variants. In order to identify the class of “robust” learning algorithms in the most general way, we formalize a new but related model of learning from statistical queries. Intuitively, in this model, a learning algorithm is forbidden to examine individual examples of the unknown target function, but is given access to an oracle providing estimates of probabilities over the sample space of random examples. One of our main results shows that any class of functions learnable from statistical queries is in fact learnable with classification noise in Valiant’s model, with a noise rate approaching the information-theoretic barrier of 1/2. We then demonstrate the generality of the statistical query model, showing that practically every class learnable in Valiant’s model and its variants can also be learned in the new model (and thus can be learned in the presence of noise). A notable exception to this statement is the class of parity functions, which we prove is not learnable from statistical queries, and for which no noise-tolerant algorithm is known.
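To make the statistical query model concrete: the learner may not inspect labeled examples `(x, f(x))` directly; it may only ask for estimates of probabilities of the form Pr[χ(x, f(x)) = 1] over random examples, accurate to within some tolerance. The following is a minimal illustrative sketch (not code from the paper) that simulates such an oracle by sampling; the function names and the toy target are hypothetical choices for the example.

```python
import random

# Hypothetical sketch of a statistical query (STAT) oracle, simulated by
# sampling. The learner only receives tolerance-accurate estimates of
# Pr[chi(x, f(x)) = 1]; it never sees individual labeled examples.

def make_sq_oracle(target, draw_example, n_samples=10000, seed=0):
    rng = random.Random(seed)

    def stat(chi):
        # Estimate Pr[chi(x, f(x)) = 1] by averaging over random examples
        # drawn from the underlying distribution.
        hits = 0
        for _ in range(n_samples):
            x = draw_example(rng)
            hits += chi(x, target(x))
        return hits / n_samples

    return stat

# Toy target: parity of the first two bits of a 3-bit string,
# under the uniform distribution over {0,1}^3.
def target(x):
    return x[0] ^ x[1]

def draw_example(rng):
    return tuple(rng.randint(0, 1) for _ in range(3))

stat = make_sq_oracle(target, draw_example)

# Query: how often does the label agree with the first input bit?
# For this parity target the true answer is 1/2, so the estimate
# should be close to 0.5.
est = stat(lambda x, y: 1 if y == x[0] else 0)
```

With enough samples, the estimate is within the requested tolerance of the true probability; the paper's noise-tolerance result hinges on the fact that such statistical estimates can still be obtained when the example labels are flipped with a rate approaching 1/2.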