AdaFN-AG: Enhancing multimodal interaction with Adaptive Feature Normalization for multimodal sentiment analysis

Authors
Weilong Liu, Hua Xu, Yu Hua, Yunxian Chi, Kai Gao
Source
Journal: Intelligent Systems with Applications [Elsevier]
Volume/Issue: 23: 200410 · Cited by: 1
Identifier
DOI: 10.1016/j.iswa.2024.200410
Abstract

In multimodal sentiment analysis, achieving effective fusion among text, acoustic, and visual modalities for enhanced sentiment prediction is a crucial research topic. Recent studies typically employ tensor-based or attention-based mechanisms for multimodal fusion. However, the former fails to achieve satisfactory prediction performance, and the latter complicates the computation of fusion between non-textual modalities. Therefore, this paper proposes AdaFN-AG, a multimodal sentiment analysis model based on Adaptive Feature Normalization and an Attention Gating mechanism. First, for the highly synchronized non-textual modalities, we design the Adaptive Feature Normalization (AdaFN) method, which focuses on sentiment-feature interaction rather than temporal association. In AdaFN, acoustic and visual modality features achieve cross-modal interaction through normalization, inverse normalization, and mix-up operations, with learned weights adaptively regulating the strength of the cross-modal interaction. Meanwhile, we design an Attention Gating mechanism that enables cross-modal interaction between textual and non-textual modalities through cross-attention and captures temporal associations, while a gating module concurrently regulates the intensity of these interactions. Additionally, we employ self-attention to capture the intrinsic correlations within single-modal features. We conduct experiments on three benchmark datasets for multimodal sentiment analysis, and the results indicate that AdaFN-AG outperforms the baselines on the majority of evaluation metrics. Our experiments show that AdaFN-AG improves performance by applying the appropriate method to each type of cross-modal interaction while conserving computational resources, and they also confirm the generalization capability of the AdaFN method.
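The two mechanisms described above can be sketched in code. This is a minimal illustration, not the authors' implementation: `alpha` stands in for the paper's learned adaptive weights, the per-sample statistics are computed over the feature dimension (an assumption; the paper does not specify the axis here), and `gated_cross_attention` uses a single-head dot-product attention with a sigmoid gate as a stand-in for the Attention Gating module.

```python
import numpy as np

def adafn(x_a, x_v, alpha=0.5, eps=1e-5):
    """Sketch of Adaptive Feature Normalization (AdaFN).

    Cross-modal interaction via normalization, inverse normalization
    (re-scaling one modality with the other's statistics), and a
    mix-up step. `alpha` is a hypothetical fixed scalar standing in
    for the learned adaptive interaction weights."""
    # per-timestep statistics over the feature dimension
    mu_a, sd_a = x_a.mean(-1, keepdims=True), x_a.std(-1, keepdims=True) + eps
    mu_v, sd_v = x_v.mean(-1, keepdims=True), x_v.std(-1, keepdims=True) + eps
    # normalize each modality, then inverse-normalize with the other's stats
    a_to_v = (x_a - mu_a) / sd_a * sd_v + mu_v
    v_to_a = (x_v - mu_v) / sd_v * sd_a + mu_a
    # mix-up: blend each modality with its cross-normalized counterpart
    out_a = alpha * x_a + (1 - alpha) * v_to_a
    out_v = alpha * x_v + (1 - alpha) * a_to_v
    return out_a, out_v

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def gated_cross_attention(q_text, kv, w_gate):
    """Sketch of the Attention Gating idea: textual queries attend
    over non-textual keys/values; a sigmoid gate (parameterized by
    the hypothetical weight matrix `w_gate`) regulates how strongly
    the cross-modal signal is injected back into the text stream."""
    # scaled dot-product cross-attention
    scores = q_text @ kv.T / np.sqrt(q_text.shape[-1])
    attended = softmax(scores) @ kv
    # gate computed from the query and the attended features
    gate = 1.0 / (1.0 + np.exp(-(np.concatenate([q_text, attended], -1) @ w_gate)))
    # residual update, scaled elementwise by the gate
    return q_text + gate * attended
```

With `alpha = 1.0` the mix-up step leaves each modality unchanged, so the interaction strength smoothly interpolates between no fusion and full cross-normalization; the gate plays the analogous role on the text side.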
