
Attention-based multimodal sentiment analysis and emotion recognition using deep neural networks

Keywords
Computer science, Discriminative, Mode, Sentiment analysis, Artificial intelligence, Modality (human-computer interaction), Feature (linguistics), Visualization, Feature extraction, Machine learning, Deep learning, Pattern recognition (psychology), Social science, Linguistics, Philosophy, Sociology
Authors
Ajwa Aslam, Allah Bux Sargano, Zulfiqar Habib
Source
Journal: Applied Soft Computing [Elsevier BV]
Volume/issue: 144, Article 110494 · Cited by: 20
Identifier
DOI: 10.1016/j.asoc.2023.110494
Abstract

There has been a growing interest in multimodal sentiment analysis and emotion recognition in recent years due to its wide range of practical applications. Multiple modalities allow for the integration of complementary information, improving the accuracy and precision of sentiment and emotion recognition tasks. However, working with multiple modalities presents several challenges, including handling data source heterogeneity, fusing information, aligning and synchronizing modalities, and designing effective feature extraction techniques that capture discriminative information from each modality. This paper introduces a novel framework called "Attention-based Multimodal Sentiment Analysis and Emotion Recognition (AMSAER)" to address these challenges. This framework leverages intra-modality discriminative features and inter-modality correlations in visual, audio, and textual modalities. It incorporates an attention mechanism to facilitate sentiment and emotion classification based on visual, textual, and acoustic inputs by emphasizing relevant aspects of the task. The proposed approach employs separate models for each modality to automatically extract discriminative semantic words, image regions, and audio features. A deep hierarchical model is then developed, incorporating intermediate fusion to learn hierarchical correlations between the modalities at bimodal and trimodal levels. Finally, the framework combines four distinct models through decision-level fusion to enable multimodal sentiment analysis and emotion recognition. The effectiveness of the proposed framework is demonstrated through extensive experiments conducted on the publicly available Interactive Emotional Dyadic Motion Capture (IEMOCAP) dataset. The results confirm a notable performance improvement compared to state-of-the-art methods, attaining 85% and 93% accuracy for sentiment analysis and emotion classification, respectively. Additionally, when considering class-wise accuracy, the results indicate that the "angry" emotion and "positive" sentiment are classified more effectively than the other emotions and sentiments, achieving 96.80% and 93.14% accuracy, respectively.
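To make the fusion idea in the abstract concrete, below is a minimal, hypothetical sketch of attention-weighted fusion over per-modality feature vectors followed by decision-level averaging of sub-model outputs. The feature dimension, number of classes, layer sizes, and the simple softmax averaging are illustrative assumptions for exposition, not the authors' exact AMSAER architecture.

```python
# Hedged sketch: attention-based fusion of visual/audio/text features plus
# decision-level averaging. All shapes and layer choices are assumptions.
import torch
import torch.nn as nn


class AttentionFusion(nn.Module):
    """Weights per-modality feature vectors with learned attention, then classifies."""

    def __init__(self, feat_dim: int = 128, num_classes: int = 4):
        super().__init__()
        self.attn = nn.Linear(feat_dim, 1)          # one attention score per modality
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, modality_feats: torch.Tensor) -> torch.Tensor:
        # modality_feats: (batch, num_modalities, feat_dim)
        scores = self.attn(modality_feats)             # (batch, M, 1)
        weights = torch.softmax(scores, dim=1)         # attention over modalities
        fused = (weights * modality_feats).sum(dim=1)  # (batch, feat_dim)
        return self.classifier(fused)                  # class logits


if __name__ == "__main__":
    batch, feat_dim = 8, 128
    # Placeholder unimodal features; in the paper these come from separate
    # per-modality extractors for image regions, audio, and semantic words.
    visual = torch.randn(batch, feat_dim)
    audio = torch.randn(batch, feat_dim)
    text = torch.randn(batch, feat_dim)

    trimodal = AttentionFusion()
    logits_tri = trimodal(torch.stack([visual, audio, text], dim=1))

    # Decision-level fusion: average softmax outputs of sub-models
    # (here a trimodal model and one bimodal visual+text model, as an example).
    bimodal = AttentionFusion()
    logits_bi = bimodal(torch.stack([visual, text], dim=1))
    final_probs = (torch.softmax(logits_tri, dim=-1) +
                   torch.softmax(logits_bi, dim=-1)) / 2
    print(final_probs.shape)  # torch.Size([8, 4])
```

The attention weights play the role of emphasizing the more informative modality for a given sample, while the final averaging stands in for the framework's combination of separate unimodal, bimodal, and trimodal models at the decision level.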