Multimodal Emotion Recognition based on Facial Expressions, Speech, and EEG

Keywords: Computer science, Overfitting, Artificial intelligence, Convolutional neural network, Deep learning, Feature extraction, Speech recognition, Facial expression, Electroencephalography (EEG), Discriminative model, Pattern recognition (psychology), Artificial neural network, Emotion classification, Affective computing, Robustness (evolution), Psychology, Gene, Psychiatry, Biochemistry, Chemistry
Authors
Jiahui Pan, Weijie Fang, Zhihang Zhang, Bingzhi Chen, Zhao Zhang, Shuihua Wang
Source
Journal: IEEE Open Journal of Engineering in Medicine and Biology (Institute of Electrical and Electronics Engineers)
Pages: 1-8; Cited by: 1
Identifier
DOI: 10.1109/ojemb.2023.3240280
Abstract

Goal: As an essential human-machine interaction task, emotion recognition has become an active research area over the past decades. Although previous attempts at emotion classification have achieved high performance, several challenges remain open: 1) how to effectively recognize emotions using different modalities, and 2) given the increasing amount of computing power required by deep learning, how to provide real-time detection and improve the robustness of deep neural networks. Method: In this paper, we propose a deep learning-based multimodal emotion recognition (MER) framework called Deep-Emotion, which can adaptively integrate the most discriminative features from facial expressions, speech, and electroencephalogram (EEG) signals to improve recognition performance. Specifically, the proposed Deep-Emotion framework consists of three branches: a facial branch, a speech branch, and an EEG branch. The facial branch uses the improved GhostNet neural network proposed in this paper for feature extraction, which effectively alleviates overfitting during training and improves classification accuracy compared with the original GhostNet. For the speech branch, we propose a lightweight fully convolutional neural network (LFCNN) for efficient extraction of speech emotion features. For the EEG branch, we propose a tree-like LSTM (tLSTM) model capable of fusing multi-stage features for EEG emotion feature extraction. Finally, we adopt a decision-level fusion strategy to integrate the recognition results of the three modalities, yielding more comprehensive and accurate performance. Results and Conclusions: Extensive experiments on the CK+, EMO-DB, and MAHNOB-HCI datasets demonstrate the effectiveness of the proposed Deep-Emotion method, as well as the feasibility and superiority of the MER approach.
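The abstract describes decision-level fusion of the facial, speech, and EEG branches. Below is a minimal sketch of such a fusion step, assuming each branch already outputs a class-probability vector; the weighted-average rule, the weights, and the function name fuse_decisions are illustrative assumptions and not the paper's exact fusion strategy, which the abstract does not detail.

import numpy as np

def fuse_decisions(face_probs, speech_probs, eeg_probs, weights=(1.0, 1.0, 1.0)):
    """Weighted average of per-branch class probabilities; returns the winning class index.

    The averaging rule and the default weights are illustrative assumptions,
    not the paper's exact decision-level fusion scheme.
    """
    stacked = np.stack([np.asarray(face_probs, dtype=float),
                        np.asarray(speech_probs, dtype=float),
                        np.asarray(eeg_probs, dtype=float)])   # shape (3, n_classes)
    w = np.asarray(weights, dtype=float)[:, None]              # shape (3, 1)
    fused = (w * stacked).sum(axis=0) / w.sum()                # weighted average over branches
    return int(np.argmax(fused))

# Toy usage with hypothetical 4-class outputs (e.g., happy / sad / angry / neutral).
face   = np.array([0.70, 0.10, 0.10, 0.10])
speech = np.array([0.40, 0.30, 0.20, 0.10])
eeg    = np.array([0.25, 0.25, 0.25, 0.25])
print(fuse_decisions(face, speech, eeg))   # -> 0

A decision-level scheme of this kind keeps each branch independently trainable and allows a weaker modality (e.g., noisy EEG) to be down-weighted without retraining the other branches.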