Results 1 - 3 of 3
Solving multiclass learning problems via error-correcting output codes
 Journal of Artificial Intelligence Research
, 1995
Abstract

Cited by 571 (9 self)
Multiclass learning problems involve finding a definition for an unknown function f(x) whose range is a discrete set containing k > 2 values (i.e., k "classes"). The definition is acquired by studying collections of training examples of the form ⟨x_i, f(x_i)⟩. Existing approaches to multiclass learning problems include direct application of multiclass algorithms such as the decision-tree algorithms C4.5 and CART, application of binary concept learning algorithms to learn individual binary functions for each of the k classes, and application of binary concept learning algorithms with distributed output representations. This paper compares these three approaches to a new technique in which error-correcting codes are employed as a distributed output representation. We show that these output representations improve the generalization performance of both C4.5 and backpropagation on a wide range of multiclass learning tasks. We also demonstrate that this approach is robust with respect to changes in the size of the training sample, the assignment of distributed representations to particular classes, and the application of overfitting avoidance techniques such as decision-tree pruning. Finally, we show that, like the other methods, the error-correcting code technique can provide reliable class probability estimates. Taken together, these results demonstrate that error-correcting output codes provide a general-purpose method for improving the performance of inductive learning programs on multiclass problems.
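The core idea of the abstract — assign each class a codeword, train one binary classifier per bit, and decode the joint output to the nearest codeword — can be sketched as follows. The 5-bit codewords below are illustrative (chosen so any two differ in at least 3 bits, letting a single flipped bit be corrected); they are not the codes used in the paper.

```python
# Minimal sketch of ECOC decoding. Each class owns a codeword; in a
# full system, one binary classifier per bit position would be trained,
# and its outputs assembled into `bits` before decoding.
CODEBOOK = {
    "a": (0, 0, 0, 0, 0),
    "b": (1, 1, 1, 0, 0),
    "c": (0, 0, 1, 1, 1),
}

def hamming(u, v):
    """Number of bit positions where the two codewords disagree."""
    return sum(x != y for x, y in zip(u, v))

def decode(bits):
    """Map the binary classifiers' joint output to the nearest codeword's class."""
    return min(CODEBOOK, key=lambda label: hamming(CODEBOOK[label], bits))

print(decode((0, 0, 0, 0, 0)))  # exact match -> "a"
print(decode((1, 0, 0, 0, 0)))  # one classifier erred; still decodes to "a"
```

Because the minimum pairwise Hamming distance of this codebook is 3, any single binary classifier can be wrong and the decoded class is still correct — the error-correcting property the abstract credits for the improved generalization.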
Combining Stacking With Bagging To Improve A Learning Algorithm
, 1996
Abstract

Cited by 8 (1 self)
In bagging [Bre94a] one uses bootstrap replicates of the training set [Efr79, ET93] to improve a learning algorithm's performance, often by tens of percent. This paper presents several ways that stacking [Wol92b, Bre92] can be used in concert with the bootstrap procedure to achieve a further improvement on the performance of bagging for some regression problems. In particular, in some of the work presented here, one first converts a single underlying learning algorithm into several learning algorithms. This is done by bootstrap resampling the training set, exactly as in bagging. The resultant algorithms are then combined via stacking. This procedure can be viewed as a variant of bagging, where stacking rather than uniform averaging is used to achieve the combining. The stacking improves performance over simple bagging by up to a factor of 2 on the tested problems, and never results in worse performance than simple bagging. In other work presented here, there is no step of converting t...
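The procedure described — bootstrap-resample, fit one base model per replicate, then combine with learned weights instead of a uniform average — can be sketched on a toy regression problem. The base learner (1-nearest-neighbour regression), the data, and the use of plain least squares on training predictions are all illustrative stand-ins, not the paper's experimental setup; a real stacked combiner would be fit on held-out or out-of-bag predictions to avoid overfitting.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = x^2 plus a little noise.
x = np.linspace(0.0, 2.0, 40)
y = x**2 + rng.normal(0.0, 0.05, x.size)

def fit_1nn(xs, ys):
    """Illustrative base learner: 1-nearest-neighbour regression."""
    def predict(q):
        return ys[np.argmin(np.abs(xs - q))]
    return predict

# As in bagging: one base model per bootstrap replicate of the training set.
models = []
for _ in range(10):
    idx = rng.integers(0, x.size, x.size)   # sample with replacement
    models.append(fit_1nn(x[idx], y[idx]))

preds = np.array([[m(q) for q in x] for m in models])  # (n_models, n_points)

# Plain bagging: uniform average of the replicate models.
bagged = preds.mean(axis=0)

# Stacking variant: least-squares weights over the same models
# (fit here on training predictions purely for brevity).
w, *_ = np.linalg.lstsq(preds.T, y, rcond=None)
stacked = preds.T @ w
```

Since the uniform average is one feasible weighting, the least-squares combination can never do worse than bagging on the data it was fit to; the paper's contribution is showing the gain carries over to held-out performance on some regression problems.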
An empirical comparison of hierarchical vs. two-level approaches to multiclass problems
 in Lecture Notes in Computer Science
, 2004
Abstract

Cited by 8 (4 self)
Abstract. The ECOC framework provides a powerful and popular method for solving multiclass problems using a multitude of binary classifiers. We had recently introduced the Binary Hierarchical Classifier (BHC) architecture, which addresses multiclass classification problems using a set of binary classifiers arranged as a tree. Unlike ECOCs, the BHC groups classes according to their natural affinities in order to make each binary problem easier. However, it cannot exploit the powerful error-correcting properties of an ECOC ensemble, which can provide good results even when individual classifiers are weak. Using well-tuned SVMs as the base classifiers, we provide a comparison of these two diverse approaches using a variety of datasets. The results show that while there is no clear advantage to either technique in terms of classification accuracy, BHCs typically achieve this performance using fewer classifiers, and have the added advantage of automatically generating a hierarchy of classes. Such hierarchies often provide a valuable tool for extracting domain knowledge, and achieve better results when coarser granularity of the output space is acceptable.
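The tree-of-binary-classifiers idea behind the BHC can be sketched in a few lines: each internal node holds one binary decision that routes an input toward half of the remaining classes, so k classes need only k - 1 binary classifiers. The classes, features, and hand-written decision rules below are illustrative; in the BHC the splits are learned from class affinities, not specified by hand.

```python
# Minimal sketch of hierarchical classification with a binary tree.
# Internal nodes carry a binary decision; leaves carry a class label.
class Node:
    def __init__(self, decide=None, left=None, right=None, label=None):
        self.decide = decide   # callable x -> bool (internal nodes only)
        self.left = left
        self.right = right
        self.label = label     # class name (leaves only)

def classify(node, x):
    """Walk the tree, applying one binary classifier per internal node."""
    while node.label is None:
        node = node.left if node.decide(x) else node.right
    return node.label

# Three classes, hence two binary decisions:
# first split {cat, dog} vs {fish}, then cat vs dog.
tree = Node(
    decide=lambda x: x["legs"] > 0,            # land animals go left
    left=Node(
        decide=lambda x: x["meows"],           # cats vs dogs
        left=Node(label="cat"),
        right=Node(label="dog"),
    ),
    right=Node(label="fish"),
)

print(classify(tree, {"legs": 4, "meows": True}))   # -> cat
```

The contrast with ECOC is visible in the counts: an ECOC ensemble typically uses many more than k - 1 binary classifiers to buy error correction, while the hierarchy spends its classifiers on (learned) easy splits and yields an interpretable class taxonomy as a side effect.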