Results 1–10 of 52
Ultraconservative Online Algorithms for Multiclass Problems
Journal of Machine Learning Research, 2001
"... In this paper we study online classification algorithms for multiclass problems in the mistake bound model. The hypotheses we use maintain one prototype vector per class. Given an input instance, a multiclass hypothesis computes a similarity score between each prototype and the input instance ..."
Cited by 320 (21 self)
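The prototype-per-class hypothesis described in this abstract can be sketched in a few lines: keep one weight vector per class, score an instance against every prototype, and predict the top-scoring class. The update below is a plain multiclass-perceptron step for illustration only; the paper's ultraconservative algorithms refine which prototypes are modified on a mistake.

```python
import numpy as np

def predict(prototypes, x):
    """One prototype vector per class: score each class by the inner
    product <w_r, x> and predict the class with the highest score."""
    scores = prototypes @ x          # one similarity score per class
    return int(np.argmax(scores))

def perceptron_update(prototypes, x, y, lr=1.0):
    """Illustrative mistake-driven update (multiclass perceptron style):
    on a mistake, pull the correct class's prototype toward x and push
    the mistakenly predicted prototype away from x."""
    y_hat = predict(prototypes, x)
    if y_hat != y:                   # ultraconservative: change nothing otherwise
        prototypes[y] += lr * x
        prototypes[y_hat] -= lr * x
    return prototypes
```

Starting from zero prototypes, a single update on a mislabeled instance is enough to make that instance predicted correctly afterward.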
Multiclass learnability and the ERM principle
In COLT, volume 19 of JMLR Proceedings, 2011
"... We study the sample complexity of multiclass prediction in several learning settings. For the PAC setting our analysis reveals a surprising phenomenon: in sharp contrast to binary classification, we show that there exist multiclass hypothesis classes for which some Empirical Risk Minimizer ..."
Cited by 10 (3 self)
Online multiclass learning by interclass hypothesis sharing
Proc. 23rd International Conference on Machine Learning, 2006
"... We describe a general framework for online multiclass learning based on the notion of hypothesis sharing. In our framework, sets of classes are associated with hypotheses; thus, all classes within a given set share the same hypothesis. This framework includes as special cases commonly used constructions ..."
Cited by 33 (5 self)
Using Output Codes to Boost Multiclass Learning Problems
Machine Learning: Proceedings of the Fourteenth International Conference (ICML97), 1997
"... This paper describes a new technique for solving multiclass learning problems by combining Freund and Schapire's boosting algorithm with the main ideas of Dietterich and Bakiri's method of error-correcting output codes (ECOC). Boosting is a general method of improving the accuracy of a given ..."
Cited by 113 (8 self)
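The ECOC idea this abstract builds on can be shown with a small sketch: assign each class a binary codeword, train one binary learner per code bit, and decode a new example to the class whose codeword is nearest in Hamming distance to the vector of binary predictions. The code matrix below is an illustrative stand-in, not the paper's boosted construction.

```python
def ecoc_decode(code_matrix, bit_predictions):
    """Decode: return the class whose codeword has the smallest
    Hamming distance to the vector of binary bit predictions."""
    def hamming(codeword):
        return sum(c != b for c, b in zip(codeword, bit_predictions))
    return min(range(len(code_matrix)), key=lambda k: hamming(code_matrix[k]))

# Illustrative 4-class code matrix: one row (codeword) per class, one
# column per binary learning problem. Rows are pairwise at Hamming
# distance >= 3, so any single bit error can still be corrected.
CODES = [
    [0, 0, 1, 1, 0],
    [0, 1, 0, 1, 1],
    [1, 0, 0, 0, 1],
    [1, 1, 1, 0, 0],
]

# Class 1's codeword with its last bit flipped (one binary learner erred):
noisy = [0, 1, 0, 1, 0]
recovered = ecoc_decode(CODES, noisy)   # still decodes to class 1
```

This error-correcting property is what makes the combination with boosting attractive: each column is just a binary problem a weak learner can attack.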
Optimal Learners for Multiclass Problems
"... The fundamental theorem of statistical learning states that for binary classification problems, any Empirical Risk Minimization (ERM) learning rule has close to optimal sample complexity. In this paper we seek a generic optimal learner for multiclass prediction. We start by proving a surprising result: a generic optimal multiclass learner must be improper, namely, it must have the ability to output hypotheses which do not belong to the hypothesis class, even though it knows that all the labels are generated by some hypothesis from the class. In particular, no ERM learner is optimal ..."
Cited by 5 (2 self)
Constraint classification: A new approach to multiclass classification and ranking
In Advances in Neural Information Processing Systems 15, 2002
"... We introduce constraint classification, a framework capturing many flavors of multiclass classification including multilabel classification and ranking, and present a meta-algorithm for learning in this framework. We provide generalization bounds when using a collection of k linear functions to represent each hypothesis. We also present empirical and theoretical evidence that constraint classification is more powerful than existing methods of multiclass classification. ..."
Cited by 86 (6 self)
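The representation mentioned in this abstract, a collection of k linear functions with winner-take-all prediction, can be sketched briefly. The constraint view is that a labeled example (x, y) induces the ranking constraints score_y(x) > score_r(x) for every r != y; the helper names here are illustrative, not from the paper.

```python
def scores(weights, x):
    """One linear function per class: score_r(x) = <w_r, x>."""
    return [sum(wi * xi for wi, xi in zip(w, x)) for w in weights]

def predict_class(weights, x):
    """Winner-take-all prediction over the k linear functions."""
    s = scores(weights, x)
    return max(range(len(s)), key=s.__getitem__)

def violated_constraints(weights, x, y):
    """Treat the multiclass example (x, y) as the set of pairwise
    constraints score_y(x) > score_r(x) for all r != y; return the
    classes whose constraint the current weights violate."""
    s = scores(weights, x)
    return [r for r in range(len(s)) if r != y and s[r] >= s[y]]
```

When `violated_constraints` returns an empty list for every training example, the winner-take-all predictor classifies the training set correctly.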
Multiclass Learnability
24th Annual Conference on Learning Theory, 2011
"... Multiclass learning is an area of growing practical relevance, for which the currently available theory is still far from providing satisfactory understanding. We study the learnability of multiclass prediction, and derive upper and lower bounds on the sample complexity of multiclass hypothesis classes ..."
Multiclass Learning, Boosting, and Error-Correcting Codes
Proceedings of the Twelfth Annual Conference on Computational Learning Theory, 1999
"... We focus on methods to solve multiclass learning problems by using only simple and efficient binary learners. We investigate the approach of Dietterich and Bakiri [2] based on error-correcting codes (which we call ECC). We distill error correlation as one of the key parameters influencing the performance ..."
Cited by 49 (0 self)
Stable Learning in Coding Space for Multi-Class Decoding and Its Extension for Multi-Class Hypothesis Transfer Learning
"... Many prevalent multiclass classification approaches can be unified and generalized by the output coding framework [1, 7], which usually consists of three phases: (1) coding, (2) learning binary classifiers, and (3) decoding. Most of these approaches focus on the first two phases ..."