Recognizing action units for facial expression analysis (2001)

by Ying-li Tian, Takeo Kanade, Jeffrey F. Cohn
Venue: IEEE Transactions on Pattern Analysis and Machine Intelligence
Citations: 398 (38 self)

BibTeX

@ARTICLE{Tian01recognizingaction,
    author = {Ying-li Tian and Takeo Kanade and Jeffrey F. Cohn},
    title = {Recognizing action units for facial expression analysis},
    journal = {IEEE Transactions on Pattern Analysis and Machine Intelligence},
    year = {2001},
    volume = {23},
    pages = {97--115}
}


Abstract

Most automatic expression analysis systems attempt to recognize a small set of prototypic expressions (e.g., happiness and anger). Such prototypic expressions, however, occur infrequently; human emotions and intentions are more often communicated by changes in one or two discrete facial features. We develop an automatic system to analyze subtle changes in facial expressions based on both permanent facial features (brows, eyes, mouth) and transient facial features (deepening of facial furrows) in a nearly frontal image sequence. Unlike most existing systems, ours recognizes fine-grained changes in facial expression in terms of Facial Action Coding System (FACS) action units (AUs) rather than six basic expressions (e.g., happiness and anger). Multi-state face and facial component models are proposed for tracking and modeling different facial features, including lips, eyes, brows, cheeks, and their related wrinkles and facial furrows. The tracking results are then converted into detailed parametric descriptions of the facial features. With these features as inputs, 11 lower face AUs and 7 upper face AUs are recognized by a neural network algorithm, with recognition rates of 96.7% for lower face AUs and 95% for upper face AUs, respectively. The recognition results indicate that our system can identify action units regardless of whether they occur singly or in combinations.
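The pipeline the abstract describes (parametric feature descriptions fed to a neural network with one output per action unit, so that single AUs and AU combinations fall out of the same forward pass) might be sketched roughly as follows. All sizes, feature names, thresholds, and the random untrained weights are placeholder assumptions for illustration, not the paper's actual architecture:

```python
import numpy as np

# Hypothetical parametric facial-feature inputs (dimensionality is
# illustrative: e.g. lip height/width, eye opening, brow distance,
# furrow-presence flags). These names are assumptions, not the paper's.
N_FEATURES = 15   # assumed size of the parametric description vector
N_HIDDEN = 6      # assumed hidden-layer width
N_AUS = 11        # e.g. the 11 lower-face action units

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Randomly initialized weights stand in for a trained network.
W1 = rng.normal(scale=0.1, size=(N_FEATURES, N_HIDDEN))
W2 = rng.normal(scale=0.1, size=(N_HIDDEN, N_AUS))

def recognize_aus(features, threshold=0.5):
    """Map a parametric feature vector to the set of active AU indices.

    Using one independent sigmoid output per AU (rather than a single
    softmax over expressions) is what lets the classifier report an AU
    whether it occurs alone or as part of a combination.
    """
    hidden = sigmoid(features @ W1)
    scores = sigmoid(hidden @ W2)
    return [i for i, s in enumerate(scores) if s > threshold]

features = rng.normal(size=N_FEATURES)
active = recognize_aus(features)
```

The design choice worth noting is the multi-label output layer: a six-way basic-expression classifier would be forced to pick one label, whereas per-AU outputs let arbitrary subsets of the 11 lower-face (or 7 upper-face) units fire together.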

Keyphrases

action unit, facial expression analysis, upper face AU, facial furrow, facial expression, prototypic expression, discrete facial feature, subtle change, human emotion, recognition result, permanent facial feature, fine-grained change, basic expression, recognition rate, small set, frontal image sequence, face AU, transient facial feature, related wrinkle, multi-state face, facial component model, different facial feature, neural network algorithm, automatic expression analysis system, automatic system, face action unit, facial feature, facial action coding system, parametric description
