Exploiting feature hierarchy for transfer learning in named entity recognition (2008)

by Andrew Arnold, Ramesh Nallapati, William W. Cohen
Venue: ACL-08: HLT
Citations: 13 (3 self)

BibTeX

@INPROCEEDINGS{Arnold08exploitingfeature,
    author = {Andrew Arnold and Ramesh Nallapati and William W. Cohen},
    title = {Exploiting feature hierarchy for transfer learning in named entity recognition},
    booktitle = {Proceedings of ACL-08: HLT},
    year = {2008}
}


Abstract

We present a novel hierarchical prior structure for supervised transfer learning in named entity recognition, motivated by the common structure of feature spaces for this task across natural language data sets. The problem of transfer learning, where information gained in one learning task is used to improve performance in another related task, is an important new area of research. In the subproblem of domain adaptation, a model trained over a source domain is generalized to perform well on a related target domain, where the two domains’ data are distributed similarly, but not identically. We introduce the concept of groups of closely-related domains, called genres, and show how inter-genre adaptation is related to domain adaptation. We also examine multitask learning, where two domains may be related, but where the concept to be learned in each case is distinct. We show that our prior conveys useful information across domains, genres and tasks, while remaining robust to spurious signals not related to the target domain and concept. We further show that our model generalizes a class of similar hierarchical priors, smoothed to varying degrees, and lay the groundwork for future exploration in this area.
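The core idea, a prior on feature weights that is smoothed along a hierarchy of feature types, can be illustrated with a minimal sketch. This is not the authors' exact model: the dot-separated feature names, the exponential back-off scheme, and the `prior_mean` helper below are all illustrative assumptions, showing only how an unseen target-domain feature could inherit a prior mean from source-domain weights of features that share its ancestors.

```python
# Hypothetical sketch: NER features form a natural hierarchy
# (e.g. "token.identity.word=John" under "token.identity" under "token").
# A target-domain feature's prior mean is computed by backing off along
# its ancestor path, mixing average source-domain weights at each level,
# with coarser ancestors geometrically discounted.

def prior_mean(feature, source_weights, alpha=0.5):
    """Hierarchically smoothed prior mean for `feature`.

    source_weights: dict mapping dot-separated feature names to
    weights learned in the source domain.
    alpha: per-level discount for coarser ancestors (assumed scheme).
    """
    parts = feature.split(".")
    mean, weight_mass = 0.0, 0.0
    # Walk from the most specific prefix down to the root.
    for depth in range(len(parts), 0, -1):
        prefix = ".".join(parts[:depth])
        matches = [w for f, w in source_weights.items()
                   if f == prefix or f.startswith(prefix + ".")]
        if matches:
            level_weight = alpha ** (len(parts) - depth)
            mean += level_weight * sum(matches) / len(matches)
            weight_mass += level_weight
    return mean / weight_mass if weight_mass else 0.0

# Source-domain (e.g. newswire) weights for a person-name tagger.
source = {
    "token.identity.word=John": 2.0,
    "token.identity.word=Inc":  -1.0,
    "token.suffix.ngram=son":   1.5,
}

# A feature never seen in the source domain still gets an informed
# prior from its "token.identity" and "token" ancestors, instead of
# the zero-mean prior a flat model would assign.
m = prior_mean("token.identity.word=Yeast", source)
```

Setting `alpha` closer to 0 trusts only the most specific matching ancestor, while `alpha` near 1 flattens the hierarchy, which mirrors the paper's observation that its model generalizes a class of similar priors smoothed to varying degrees.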

Keyphrases

entity recognition, transfer learning, feature hierarchy, multitask learning, feature space, supervised transfer learning, target domain, related task, natural language data set, similar hierarchical prior, source domain, domain adaptation, novel hierarchical prior structure, inter-genre adaptation, spurious signal, prior conveys useful information, domain data, common structure, future exploration, learning task, related target domain, important new area, closely-related domain
