Computer science
Artificial intelligence
Pattern recognition
Facial expression
Knowledge distillation
Domain knowledge
Similarity
Machine learning
Generalization
Discriminative representation
Authors
Yante Li,Wei Peng,Guoying Zhao
Identifier
DOI:10.1109/fg52635.2021.9666975
Abstract
Encoding facial expressions via action units (AUs) has proven effective in resolving the ambiguity among different expressions, so AU detection plays an important role in emotion analysis. While many AU detection methods have been proposed for common facial expressions, micro-expression AU detection has received very limited study. It is challenging because micro-expression appearance changes are subtle and their spontaneous nature makes data collection difficult, leaving only small-scale datasets. In this paper, we focus on micro-expression AU detection and aim to contribute to the community. To address these issues, we propose a novel dual-view attentive similarity-preserving distillation method for robust micro-expression AU detection that leverages massive facial-expression data in the wild. This attentive similarity-preserving distillation mitigates the domain-shift problem and efficiently distills essential AU knowledge from common facial AUs. Furthermore, since the generalization ability of the teacher network is important for knowledge distillation, we develop a semi-supervised co-training approach to construct a generalized teacher network that learns discriminative AU representations. Extensive experiments demonstrate that our proposed knowledge distillation method can effectively distill and transfer cross-domain knowledge for robust micro-expression AU detection.
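Similarity-preserving distillation of the kind the abstract describes is commonly implemented by matching batch-level pairwise similarity (Gram) matrices between teacher and student activations, so the student mimics the teacher's relational structure rather than its raw features. A minimal NumPy sketch of that loss follows; the function names are hypothetical illustrations, not the authors' implementation, and the paper's dual-view attention weighting is omitted.

```python
import numpy as np

def similarity_matrix(feats):
    """Row-normalized pairwise similarity matrix of a feature batch (B, D)."""
    g = feats @ feats.T                          # (B, B) pairwise dot products
    norms = np.linalg.norm(g, axis=1, keepdims=True)
    return g / np.maximum(norms, 1e-12)          # L2-normalize each row

def sp_distillation_loss(teacher_feats, student_feats):
    """Mean squared Frobenius distance between teacher and student
    similarity matrices; feature dimensions may differ, batch size must match."""
    gt = similarity_matrix(teacher_feats)
    gs = similarity_matrix(student_feats)
    b = teacher_feats.shape[0]
    return float(np.sum((gt - gs) ** 2) / (b * b))
```

Because only the B-by-B similarity matrices are compared, the teacher and student may have different feature widths, which is what allows knowledge to transfer across the macro- and micro-expression domains without forcing the two networks to share an embedding space.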