Spontaneous emotional facial expression detection

Zhihong Zeng, Yun Fu, Glenn I. Roisman, Zhen Wen, Yuxiao Hu, Thomas S. Huang

Research output: Contribution to journal › Article › peer-review

71 Scopus citations

Abstract

Change in a speaker's emotion is a fundamental component of human communication. Automatic recognition of spontaneous emotion would significantly impact human-computer interaction and emotion-related studies in education, psychology, and psychiatry. In this paper, we explore methods for detecting emotional facial expressions occurring in a realistic human conversation setting, the Adult Attachment Interview (AAI). Because non-emotional facial expressions have no distinct description and are expensive to model, we treat emotional facial expression detection as a one-class classification problem: the goal is to describe the target objects (emotional facial expressions) and distinguish them from outliers (non-emotional ones). Our preliminary experiments on AAI data suggest that one-class classification methods can reach a good balance between cost (labeling and computing) and recognition performance by avoiding the labeling and modeling of non-emotional expressions.
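
The one-class formulation described above can be illustrated with a small sketch. The snippet below is not the authors' implementation; it assumes facial-feature vectors have already been extracted per frame (a hypothetical 16-dimensional feature here) and uses scikit-learn's OneClassSVM as a stand-in one-class classifier, trained only on emotional-expression examples and then used to flag non-emotional frames as outliers.

    # Minimal sketch of one-class emotional-expression detection.
    # Assumptions (not from the paper): per-frame facial-feature vectors are
    # already available, and OneClassSVM stands in for the one-class method.
    import numpy as np
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(0)

    # Hypothetical training data: feature vectors from frames labeled as
    # emotional expressions (the target class). Non-emotional frames are
    # deliberately left unlabeled and unmodeled.
    X_emotional = rng.normal(loc=1.0, scale=0.3, size=(200, 16))

    # Fit the one-class model on target-class data only.
    detector = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1)
    detector.fit(X_emotional)

    # New frames from a conversation: some emotional, some not.
    X_new = np.vstack([
        rng.normal(loc=1.0, scale=0.3, size=(5, 16)),   # emotional-like
        rng.normal(loc=-1.0, scale=0.3, size=(5, 16)),  # non-emotional-like
    ])

    # +1 = inside the target class (emotional), -1 = outlier (non-emotional).
    print(detector.predict(X_new))

The design point mirrored here is the one emphasized in the abstract: only the target class is described, so no labeling or modeling effort is spent on the open-ended set of non-emotional expressions.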

Original language: English (US)
Pages (from-to): 1-8
Number of pages: 8
Journal: Journal of Multimedia
Volume: 1
Issue number: 5
DOIs
State: Published - Jan 1 2006
Externally published: Yes

Keywords

  • Affective computing
  • Emotion recognition
  • Facial expression
  • One-class classification
