Improved Boosting Algorithms Using Confidence-rated Predictions (1999)

Download Links

  • [www.iro.umontreal.ca]
  • [www.cs.huji.ac.il]
  • [sci2s.ugr.es]
  • [www.cse.ucsc.edu]
  • [www-connex.lip6.fr]
  • [www-poleia.lip6.fr]
  • [www-connex.lip6.fr]
  • [www.cs.iastate.edu]

by Robert E. Schapire, Yoram Singer
Venue: Machine Learning
Citations: 938 (26 self)

BibTeX

@MISC{Schapire99improvedboosting,
    author = {Robert E. Schapire and Yoram Singer},
    title = {Improved Boosting Algorithms Using Confidence-rated Predictions},
    year = {1999}
}


Abstract

We describe several improvements to Freund and Schapire’s AdaBoost boosting algorithm, particularly in a setting in which hypotheses may assign confidences to each of their predictions. We give a simplified analysis of AdaBoost in this setting, and we show how this analysis can be used to find improved parameter settings as well as a refined criterion for training weak hypotheses. We give a specific method for assigning confidences to the predictions of decision trees, a method closely related to one used by Quinlan. This method also suggests a technique for growing decision trees which turns out to be identical to one proposed by Kearns and Mansour. We focus next on how to apply the new boosting algorithms to multiclass classification problems, particularly to the multi-label case in which each example may belong to more than one class. We give two boosting methods for this problem, plus a third method based on output coding. One of these leads to a new method for handling the single-label case which is simpler but as effective as techniques suggested by Freund and Schapire. Finally, we give some experimental results comparing a few of the algorithms discussed in this paper.
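
The confidence-rated setting described in the abstract can be illustrated with a short sketch. This is not the authors' implementation: the decision-stump weak learner, the data layout (X as an n-by-d NumPy array, labels y in {-1, +1}), and the function names are assumptions made for illustration. The weight update and the choice alpha_t = 0.5 * ln((1 + r) / (1 - r)), where r = sum_i D_t(i) * y_i * h_t(x_i), correspond to one of the parameter settings the paper analyzes for weak hypotheses with range [-1, +1].

import numpy as np

def train_stump(X, y, D):
    # Illustrative weak learner (an assumption, not from the paper): choose the
    # axis-aligned threshold stump h(x) in {-1, +1} maximizing the weighted
    # correlation r = sum_i D[i] * y[i] * h(x_i).
    n, d = X.shape
    best_r, best_params = 0.0, None
    for j in range(d):
        for thr in np.unique(X[:, j]):
            for sign in (+1.0, -1.0):
                h = sign * np.where(X[:, j] <= thr, 1.0, -1.0)
                r = float(np.sum(D * y * h))
                if best_params is None or abs(r) > abs(best_r):
                    best_r, best_params = r, (j, thr, sign)
    j, thr, sign = best_params
    predict = lambda A: sign * np.where(A[:, j] <= thr, 1.0, -1.0)
    return predict, best_r

def adaboost_confidence_rated(X, y, T=50):
    # Maintain a distribution D over examples, reweight each round by
    # exp(-alpha_t * y_i * h_t(x_i)), and return the real-valued combined
    # hypothesis H(x) = sum_t alpha_t * h_t(x); sign(H) is the predicted
    # label, |H| the confidence.
    n = len(y)
    D = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(T):
        h, r = train_stump(X, y, D)
        r = float(np.clip(r, -1 + 1e-12, 1 - 1e-12))   # keep alpha finite
        alpha = 0.5 * np.log((1.0 + r) / (1.0 - r))
        D = D * np.exp(-alpha * y * h(X))
        D /= D.sum()                                    # renormalize
        ensemble.append((alpha, h))
    return lambda A: sum(a * h(A) for a, h in ensemble)

As a usage note, H = adaboost_confidence_rated(X_train, y_train) followed by np.sign(H(X_test)) would give hard binary predictions, with the magnitude of H(X_test) acting as a confidence score.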

Keyphrases

boosting algorithm using confidence-rated prediction, decision tree, refined criterion, third method, several improvement, single-label case, weak hypothesis, multi-label case, new method, adaboost boosting algorithm, classification problem, simplified analysis, experimental result, specific method, improved parameter setting
