Wrappers for Feature Subset Selection (1997)

by Ron Kohavi, George H. John
Venue: AIJ Special Issue on Relevance
Citations: 1567 (3 self)

BibTeX

@MISC{Kohavi97wrappersfor,
    author = {Ron Kohavi and George H. John},
    title = {Wrappers for Feature Subset Selection},
    year = {1997}
}


Abstract

In the feature subset selection problem, a learning algorithm is faced with the problem of selecting a relevant subset of features upon which to focus its attention, while ignoring the rest. To achieve the best possible performance with a particular learning algorithm on a particular training set, a feature subset selection method should consider how the algorithm and the training set interact. We explore the relation between optimal feature subset selection and relevance. Our wrapper method searches for an optimal feature subset tailored to a particular algorithm and a domain. We study the strengths and weaknesses of the wrapper approach and show a series of improved designs. We compare the wrapper approach to induction without feature subset selection and to Relief, a filter approach to feature subset selection. Significant improvement in accuracy is achieved for some datasets for the two families of induction algorithms used: decision trees and Naive-Bayes.
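The core idea of the wrapper approach described above can be sketched as greedy forward selection: the learning algorithm is treated as a black box, and each candidate feature subset is scored by the cross-validated accuracy of the model it induces. This is an illustrative sketch, not the paper's implementation; the nearest-centroid learner, the toy dataset, and all function names are hypothetical stand-ins.

```python
# Hypothetical sketch of a wrapper for feature subset selection:
# greedy forward (hill-climbing) search over subsets, scoring each
# candidate by k-fold cross-validated accuracy of a black-box learner.
import random

def nearest_centroid_predict(train_X, train_y, x, subset):
    # Stand-in "induction algorithm": classify x by the nearest
    # class centroid, computed only on the selected features.
    centroids = {}
    for label in set(train_y):
        rows = [r for r, lab in zip(train_X, train_y) if lab == label]
        centroids[label] = [sum(r[f] for r in rows) / len(rows) for f in subset]
    def dist(c):
        return sum((c[i] - x[f]) ** 2 for i, f in enumerate(subset))
    return min(centroids, key=lambda lab: dist(centroids[lab]))

def cv_accuracy(X, y, subset, k=5):
    # k-fold cross-validated accuracy of the learner restricted to `subset`.
    if not subset:
        return 0.0
    folds = [list(range(i, len(X), k)) for i in range(k)]
    correct = 0
    for fold in folds:
        train_idx = [i for i in range(len(X)) if i not in fold]
        tX = [X[i] for i in train_idx]
        ty = [y[i] for i in train_idx]
        for i in fold:
            if nearest_centroid_predict(tX, ty, X[i], subset) == y[i]:
                correct += 1
    return correct / len(X)

def wrapper_forward_selection(X, y):
    # Add, at each step, the single feature that most improves CV
    # accuracy; stop when no addition strictly improves (local optimum).
    remaining = set(range(len(X[0])))
    subset, best = [], 0.0
    while remaining:
        score, f = max((cv_accuracy(X, y, subset + [f]), f) for f in remaining)
        if score <= best:
            break
        subset.append(f)
        remaining.remove(f)
        best = score
    return subset, best

# Toy data: feature 0 determines the class; features 1-2 are pure noise.
random.seed(0)
X, y = [], []
for _ in range(60):
    label = random.choice([0, 1])
    X.append([label + random.gauss(0, 0.1), random.random(), random.random()])
    y.append(label)
subset, acc = wrapper_forward_selection(X, y)
print(subset, acc)
```

On this toy dataset the search picks the informative feature first and then stops, since the noise features cannot improve cross-validated accuracy. The same loop works with any learner (decision trees or Naive-Bayes, as in the paper) because the wrapper only observes the learner's estimated accuracy, never its internals.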

Keyphrases

feature subset selection, subset selection, selection problem, learning algorithm, induction algorithm, decision tree, significant improvement, selection method, particular learning algorithm, optimal feature subset, possible performance, relevant subset, wrapper method search, improved design, particular algorithm, optimal feature, particular training, filter approach
