Two-Stage Dynamic Fusion Framework for Multimodal Classification Tasks

Subject tags: Computer Science, Fusion, Stage (Stratigraphy), Artificial Intelligence, Machine Learning, Data Mining, Paleontology, Philosophy, Linguistics, Biology
Authors
Shoumeng Ge, Ying Chen
Source
Journal: INFORMS Journal on Computing
Identifier
DOI: 10.1287/ijoc.2023.0448
Abstract

Multimodal learning has provided an opportunity to better analyze a system or phenomenon. Numerous classification studies have developed advanced dynamic fusion methods to fuse information from different modalities. However, few works have considered a reliable design of dynamic fusion methods based on theoretical insights. In this context, we address the research gaps as follows. From a theoretical perspective, we first establish the performance range for the accuracy of a multimodal classifier. We then derive a condition based on the upper limit of this range that indicates how to improve the accuracy of the model. From a technical perspective, we propose a two-stage dynamic fusion framework according to this condition. In the first stage, we design an uncertainty-aware dynamic fusion method. In the second stage, we propose a regression-based method to adaptively generate the learned fusion weight for each modality. In the experiments, we use seven existing models for comparison and four public data sets to examine the effectiveness of the two-stage framework. The results indicate that our proposed framework generally outperforms existing methods in terms of accuracy and robustness. Additionally, we conduct a comprehensive discussion from several aspects to further illustrate the merits of the proposed framework.

History: Accepted by Ram Ramesh, Area Editor for Data Science and Machine Learning.

Funding: This study was supported by the China National Key R&D Program [Grant 2022YFB3305500], the National Natural Science Foundation of China [Grants 72121001, 72101066, and 72131005], the Heilongjiang Natural Science Excellent Youth Fund [Grant YQ2022G004], and the Key Research and Development Projects of Heilongjiang Province [Grant JD22A003].

Supplemental Material: The software that supports the findings of this study is available within the paper and its Supplemental Information (https://pubsonline.informs.org/doi/suppl/10.1287/ijoc.2023.0448) as well as from the IJOC GitHub software repository (https://github.com/INFORMSJoC/2023.0448). The complete IJOC Software and Data Repository is available at https://informsjoc.github.io/.
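
To make the two-stage idea concrete, the sketch below shows one way uncertainty-aware fusion of per-modality predictions could look in code. It is only an illustration under assumed choices, not the authors' implementation: the entropy-based uncertainty measure, the multiplicative blending with a learned weight vector, and all names (entropy, stage1_uncertainty_weights, stage2_adjust, fuse) are assumptions standing in for the paper's actual formulation, and a fixed hypothetical weight vector stands in for the stage-2 regressor.

import numpy as np

def entropy(p, eps=1e-12):
    # Shannon entropy of a probability vector; higher means more uncertain.
    p = np.clip(p, eps, 1.0)
    return -np.sum(p * np.log(p))

def stage1_uncertainty_weights(prob_list):
    # Stage 1 (illustrative): weight each modality by the inverse of its
    # predictive entropy, so more confident modalities contribute more.
    inv_unc = np.array([1.0 / (entropy(p) + 1e-12) for p in prob_list])
    return inv_unc / inv_unc.sum()

def stage2_adjust(weights, learned_weights):
    # Stage 2 (illustrative): blend the uncertainty-based weights with
    # weights produced by a learned regressor, then renormalize.
    adjusted = weights * np.asarray(learned_weights)
    return adjusted / adjusted.sum()

def fuse(prob_list, learned_weights):
    # Combine per-modality class probabilities into a single prediction.
    w = stage2_adjust(stage1_uncertainty_weights(prob_list), learned_weights)
    return sum(wi * pi for wi, pi in zip(w, prob_list))

# Example: two modalities, three classes; the second modality is more
# confident, while a hypothetical stage-2 regressor favors the first.
p_text = np.array([0.40, 0.35, 0.25])
p_image = np.array([0.80, 0.15, 0.05])
fused = fuse([p_text, p_image], learned_weights=[0.6, 0.4])
print(fused, int(fused.argmax()))

In this reading, stage 1 discounts modalities whose predictions are spread out (high entropy), and stage 2 lets a learned, data-driven weight correct the purely uncertainty-driven allocation before the normalized weights are applied to the class probabilities.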