Results 1 - 3 of 3
The strength of weak learnability
 Machine Learning, 1990
Abstract

Cited by 667 (23 self)
Abstract. This paper addresses the problem of improving the accuracy of an hypothesis output by a learning algorithm in the distribution-free (PAC) learning model. A concept class is learnable (or strongly learnable) if, given access to a source of examples of the unknown concept, the learner with high probability is able to output an hypothesis that is correct on all but an arbitrarily small fraction of the instances. The concept class is weakly learnable if the learner can produce an hypothesis that performs only slightly better than random guessing. In this paper, it is shown that these two notions of learnability are equivalent. A method is described for converting a weak learning algorithm into one that achieves arbitrarily high accuracy. This construction may have practical applications as a tool for efficiently converting a mediocre learning algorithm into one that performs extremely well. In addition, the construction has some interesting theoretical consequences, including a set of general upper bounds on the complexity of any strong learning algorithm as a function of the allowed error ε.
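The weak-to-strong conversion this abstract describes is the result that launched boosting. As a rough illustration only (an AdaBoost-style reweighting scheme, not Schapire's original 1990 construction), a minimal sketch using threshold stumps as the weak learners on hypothetical toy data might look like:

```python
import math

def weak_learner(xs, ys, weights):
    """Best threshold stump on 1-D data: predicts +1 at or above a cutoff
    and -1 below (or the reverse). Only needs to beat random guessing."""
    best = None
    for cut in xs:
        for sign in (+1, -1):
            err = sum(w for x, y, w in zip(xs, ys, weights)
                      if sign * (1 if x >= cut else -1) != y)
            if best is None or err < best[0]:
                best = (err, cut, sign)
    err, cut, sign = best
    return err, (lambda x, c=cut, s=sign: s * (1 if x >= c else -1))

def boost(xs, ys, rounds):
    """Combine weak hypotheses into a strong one by reweighting examples:
    each round, examples the current weak hypothesis got wrong gain weight."""
    n = len(xs)
    weights = [1.0 / n] * n
    ensemble = []  # (alpha, hypothesis) pairs
    for _ in range(rounds):
        err, h = weak_learner(xs, ys, weights)
        err = max(err, 1e-10)
        if err >= 0.5:          # weak learner failed to beat chance; stop
            break
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, h))
        # Upweight mistakes, downweight correct predictions, renormalize.
        weights = [w * math.exp(-alpha * y * h(x))
                   for x, y, w in zip(xs, ys, weights)]
        z = sum(weights)
        weights = [w / z for w in weights]
    return lambda x: 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1

# Hypothetical toy data that no single stump classifies perfectly:
xs = [0, 1, 2, 3, 4, 5]
ys = [1, 1, -1, -1, 1, 1]
strong = boost(xs, ys, rounds=3)
```

Here three stumps, each with weighted error well above zero, combine into a weighted-majority vote that fits the training data exactly; the reweighting is what forces later weak hypotheses to focus on previously misclassified examples.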
Learning Monotone DNF with an Incomplete Membership Oracle
 In Proc. 4th Annu. Workshop on Comput. Learning Theory, 1991
Abstract

Cited by 8 (1 self)
We introduce a new fault-tolerant model of algorithmic learning using an equivalence oracle and an incomplete membership oracle, in which the answers to a random subset of the learner's membership queries may be missing. We demonstrate that, with high probability, it is still possible to learn monotone DNF formulas in polynomial time, provided that the fraction of missing answers is bounded by some constant. Even when half the membership queries are expected to yield no additional information, our algorithm will exactly identify m-term, n-variable monotone DNF formulas with an expected O(mn^2) queries. The same task has been shown to require exponential time using equivalence queries alone. Thus, this model may lead to a better understanding of the power of membership queries.
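For context, the classic exact-learning loop for monotone DNF (Angluin-style) repeatedly takes a positive counterexample from an equivalence query and minimizes it with membership queries to recover one term of the target. A sketch under strong simplifying assumptions — a complete membership oracle (the paper's contribution is tolerating missing answers) and an equivalence query simulated by brute force, with a hypothetical target formula — might look like:

```python
from itertools import product

def learn_monotone_dnf(n, member):
    """Exactly learn a monotone DNF over n Boolean variables using membership
    queries plus simulated equivalence queries. Each positive counterexample
    is walked down to a minimal positive assignment, which names one term."""
    terms = []  # each term: frozenset of variable indices that must be 1

    def hyp(x):
        return any(all(x[i] for i in t) for t in terms)

    def counterexample():
        # Simulated equivalence query by exhaustive search (exponential,
        # for illustration only; the learning model supplies this oracle).
        for x in product([0, 1], repeat=n):
            if hyp(x) != member(x):
                return list(x)
        return None

    while True:
        x = counterexample()
        if x is None:
            return terms
        # x is a positive counterexample: a hypothesis built from minimal
        # positive assignments of a monotone target never over-predicts.
        # Flip each 1-bit to 0 and keep the flip if x stays positive.
        for i in range(n):
            if x[i]:
                x[i] = 0
                if not member(tuple(x)):
                    x[i] = 1
        terms.append(frozenset(i for i in range(n) if x[i]))

# Hypothetical target: (x0 AND x1) OR x2
target = lambda x: (x[0] and x[1]) or x[2]
learned = learn_monotone_dnf(4, target)
```

Each term costs about n membership queries to minimize here; the paper's algorithm pays an extra factor of n (expected O(mn^2) queries overall) to cope with an oracle whose answers are randomly missing.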
Submission and Formatting Instructions for the Twenty-fifth International Conference on Machine Learning (ICML-2008)
Abstract
ICML-2008 full paper submissions are due Feb. 8, 2008. Reviewing will be blind to the identities of the authors, and therefore identifying information should not appear in any way in papers submitted for review. Submissions must be in PDF or Postscript, with an 8-page length limit. 1. Electronic Submission As in the past few years, ICML-2008 will rely exclusively on electronic formats for submission and review. We assume that all authors will have access to standard software for word processing, electronic mail, and web file transfer. Authors who do not have such access should send email with their concerns to