A multi-layer fusion-based facial expression recognition approach with optimal weighted AUs

Xibin Jia, Shuangqiao Liu, David Powers, Barry Cardiff

Research output: Contribution to journal › Article

3 Citations (Scopus)
2 Downloads (Pure)

Abstract

Affective computing is an increasingly important outgrowth of artificial intelligence, intended to deal with rich and subjective human communication. Given the complexity of affective expression, extracting discriminative features and selecting a correspondingly high-performance classifier remain a major challenge. Specific features and classifiers perform differently on different datasets, and there is currently no consensus in the literature that any single expression feature or classifier is best in all cases. Although recent deep learning methods, which learn deep features instead of constructing them manually, have appeared in expression recognition research, the limited availability of training samples remains an obstacle to practical application. In this paper, we aim to find an effective solution based on a fusion and association learning strategy built on typical hand-crafted features and classifiers. Taking these typical features and classifiers from the facial expression literature as a basis, we fully analyse their fusion performance. Meanwhile, to emphasize the major attributes of affective computing, we select expression-related Action Units (AUs) as basic components, and we employ association rules to mine the relationships between AUs and facial expressions. Based on a comprehensive analysis from different perspectives, we propose a novel facial expression recognition approach that embeds multiple features and multiple classifiers into an AU-based stacking framework. Extensive experiments on two public datasets show that the proposed multi-layer fusion system with optimal AU weighting achieves substantial improvements in facial expression recognition over individual features/classifiers and several state-of-the-art methods, including a recent deep-learning-based expression recognition approach.
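To make the stacking idea in the abstract concrete, the following is a minimal sketch of a two-layer fusion: diverse base classifiers in the first layer, with a meta-learner that fuses their predictions in the second. The synthetic "AU intensity" features, the uniform AU weights, and the particular base learners are illustrative assumptions only — the paper learns optimal AU weights and uses its own feature/classifier pool, which this sketch does not reproduce.

```python
# Sketch of a multi-classifier stacking framework over (synthetic) AU features.
# Assumptions: 17 stand-in "AU intensity" features, 6 expression classes,
# uniform AU weights (the paper instead learns optimal weights).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for AU-derived features and expression labels.
X, y = make_classification(n_samples=600, n_features=17, n_informative=10,
                           n_classes=6, random_state=0)

# Hypothetical per-AU weights emphasising expression-relevant AUs;
# uniform here, whereas the paper optimises these weights.
au_weights = np.ones(X.shape[1])
Xw = X * au_weights

X_train, X_test, y_train, y_test = train_test_split(Xw, y, random_state=0)

# Layer 1: diverse base classifiers; layer 2: a meta-learner that
# fuses their probability outputs -- the "stacking framework" above.
stack = StackingClassifier(
    estimators=[("svm", SVC(probability=True, random_state=0)),
                ("knn", KNeighborsClassifier(n_neighbors=5))],
    final_estimator=LogisticRegression(max_iter=1000),
)
stack.fit(X_train, y_train)
acc = stack.score(X_test, y_test)
```

The design point the abstract makes is that no single feature/classifier wins everywhere, so the meta-learner is left to decide, per input, how much to trust each base classifier.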

Original language: English
Article number: 112
Number of pages: 23
Journal: Applied Sciences (Switzerland)
Volume: 7
Issue number: 2
DOIs
Publication status: Published - 24 Jan 2017

Bibliographical note

'This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited'

Keywords

  • Action units (AUs)
  • Association rules
  • Facial expression recognition
  • Feature fusion
  • Multi-layer ensemble

