A Theory of Multiclass Boosting
Abstract

Cited by 14 (0 self)
Boosting combines weak classifiers to form highly accurate predictors. Although the case of binary classification is well understood, in the multiclass setting the “correct” requirements on the weak classifier and the notion of the most efficient boosting algorithm are still missing. In this paper, we create a broad and general framework within which we identify and make precise the optimal requirements on the weak classifier, and design boosting algorithms that are, in a certain sense, the most effective under those requirements.
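The abstract's opening claim, that boosting combines weak classifiers into a highly accurate predictor, can be illustrated in the well-understood binary case with a minimal AdaBoost-style sketch (the toy data, stump learner, and function names below are illustrative, not from the paper):

```python
import math

def adaboost(X, y, T=10):
    """Binary AdaBoost with threshold stumps on 1-D data.
    X: list of floats, y: labels in {-1, +1}.
    Returns a list of (alpha, threshold, polarity) weak classifiers."""
    n = len(X)
    D = [1.0 / n] * n  # example weights, initially uniform
    H = []
    for _ in range(T):
        # Weak learner: exhaustively pick the stump with smallest weighted error.
        best = None
        for thr in sorted(set(X)):
            for pol in (+1, -1):
                err = sum(D[i] for i in range(n)
                          if pol * (1 if X[i] >= thr else -1) != y[i])
                if best is None or err < best[0]:
                    best = (err, thr, pol)
        err, thr, pol = best
        err = max(err, 1e-10)  # avoid log(0) / division by zero
        alpha = 0.5 * math.log((1 - err) / err)
        H.append((alpha, thr, pol))
        # Reweight: up-weight misclassified examples, then normalize.
        D = [D[i] * math.exp(-alpha * y[i] * pol * (1 if X[i] >= thr else -1))
             for i in range(n)]
        Z = sum(D)
        D = [d / Z for d in D]
    return H

def predict(H, x):
    # Weighted vote of the weak classifiers.
    s = sum(a * p * (1 if x >= t else -1) for a, t, p in H)
    return 1 if s >= 0 else -1

H = adaboost([0, 1, 2, 3, 4, 5], [-1, -1, -1, 1, 1, 1], T=3)
print([predict(H, x) for x in [0, 1, 2, 3, 4, 5]])
```

The multiclass question the paper studies is precisely what should replace the binary weak-learning condition (weighted error below 1/2) used by the stump learner above.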
Optimal amortized regret in every interval
Abstract
Consider the classical problem of predicting the next bit in a sequence of bits. A standard performance measure is regret (loss in payoff) with respect to a set of experts. For example, if we measure performance with respect to two constant experts, one that always predicts 0's and another that always predicts 1's, it is well known that one can achieve regret O(√T) with respect to the best expert by using, say, the weighted majority algorithm [LW89]. But this algorithm provides no performance guarantee on arbitrary intervals. Other algorithms (see [BM07, FSSW97, Vov99]) ensure regret O(√(x log T)) on any interval of length x. In this paper we show a randomized algorithm that, in an amortized sense, achieves regret O(√x) on any interval when the sequence is partitioned into intervals arbitrarily. We empirically estimated the constant in the O(·) for T up to 2000 and found it to be small, around 2.1. We also experimentally evaluate the efficacy of this algorithm in predicting high-frequency stock data. ∗This work was done while this author was at Microsoft Research.
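The baseline the abstract refers to, the weighted majority algorithm over the two constant experts, can be sketched as follows (the penalty factor eta and the function name are illustrative choices, not from the paper):

```python
def weighted_majority(bits, eta=0.5):
    """Deterministic weighted majority over two constant experts:
    expert 0 always predicts 0, expert 1 always predicts 1.
    An expert's weight is multiplied by (1 - eta) each time it errs.
    Returns the total number of prediction mistakes."""
    w = [1.0, 1.0]  # weights for the always-0 and always-1 experts
    mistakes = 0
    for b in bits:
        # Predict the bit backed by the larger total weight.
        pred = 0 if w[0] >= w[1] else 1
        if pred != b:
            mistakes += 1
        # Penalize whichever constant expert was wrong this round.
        if b != 0:
            w[0] *= (1 - eta)
        else:
            w[1] *= (1 - eta)
    return mistakes

# On a sequence that is mostly 1's, the algorithm quickly locks
# onto the always-1 expert and incurs small regret against it.
seq = [1] * 80 + [0] * 20
print(weighted_majority(seq))
```

This bounds regret over the whole sequence of length T, but, as the abstract notes, gives no guarantee on individual subintervals; the paper's contribution is an amortized O(√x) regret bound on every interval of length x.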