Computer science
Facial expression
Visualization
Tracing
Artificial intelligence
Facial recognition system
Classifier (UML)
Face (sociological concept)
Face detection
Emotion recognition
Pattern recognition (psychology)
Speech recognition
Human-computer interaction
Machine learning
Sociology
Operating system
Social science
Authors
Tashreef Abdullah Araf, Hisham Alabduljabbar, Sadullah Karimi, Md. Golam Rabiul Alam
Identifier
DOI:10.1109/icaect54875.2022.9807868
Abstract
Facial expressions are among the most indicative modes of communication, and facial attributes are a principal channel for conveying human emotion. Facial emotion recognition is therefore essential for human-machine interaction systems. Modern AI can interpret emotions from facial movements and cues much as the human brain does, but tracing how it reaches its decisions is challenging because most AI methods are treated as a "black box". The field of Explainable AI has emerged to provide insight into these algorithms; it is needed to build proper, fair, and responsible models that remain practical for large-scale production use. In this paper, a Cascade Classifier is employed for emotion recognition and Grad-CAM for visualizing what the model detects. The region of interest of the face is located to extract features, which are categorized into 7 classes. The results obtained are encouraging and can be applied in work relating to human expression detection.
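The abstract names Grad-CAM as the visualization step but the paper's code is not given here. As a rough illustration of the underlying computation only, the sketch below implements the standard Grad-CAM formula in NumPy: channel weights are the global-average-pooled gradients of the class score with respect to the last convolutional layer's activations, and the heatmap is the ReLU of the weighted sum of feature maps. The function name and the array shapes are assumptions for this sketch, not the authors' implementation.

```python
import numpy as np

def grad_cam(activations, gradients):
    """Hypothetical Grad-CAM sketch (not the paper's code).

    activations: (H, W, K) feature maps of the last conv layer
    gradients:   (H, W, K) gradients of the target class score
                 with respect to those feature maps
    Returns an (H, W) heatmap normalized to [0, 1].
    """
    # Channel importance weights alpha_k: global-average-pool the gradients
    weights = gradients.mean(axis=(0, 1))                  # shape (K,)
    # Weighted combination of feature maps, then ReLU
    cam = np.maximum((activations * weights).sum(axis=-1), 0.0)
    # Normalize for overlaying on the input face image
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam
```

In a full pipeline of the kind the abstract describes, the face region would first be located (e.g. by a cascade classifier), the cropped face classified into one of the 7 emotion classes, and this heatmap upsampled and overlaid on the face to show which regions drove the prediction.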