
UniGraph2: Learning a Unified Embedding Space to Bind Multimodal Graphs

Keywords: Embedding · Computer Science · Mathematics · Theoretical Computer Science · Artificial Intelligence
Authors
Yufei He, Yuan Sui, Xiaoxin He, Zhaoyu Li, Yifei Sun, Bryan Hooi
Source
Journal: Cornell University - arXiv
Identifier
DOI: 10.48550/arxiv.2502.00806
Abstract

Existing foundation models, such as CLIP, aim to learn a unified embedding space for multimodal data, enabling a wide range of downstream web-based applications like search, recommendation, and content classification. However, these models often overlook the inherent graph structures in multimodal datasets, where entities and their relationships are crucial. Multimodal graphs (MMGs) represent such graphs where each node is associated with features from different modalities, while the edges capture the relationships between these entities. On the other hand, existing graph foundation models primarily focus on text-attributed graphs (TAGs) and are not designed to handle the complexities of MMGs. To address these limitations, we propose UniGraph2, a novel cross-domain graph foundation model that enables general representation learning on MMGs, providing a unified embedding space. UniGraph2 employs modality-specific encoders alongside a graph neural network (GNN) to learn a unified low-dimensional embedding space that captures both the multimodal information and the underlying graph structure. We propose a new cross-domain multi-graph pre-training algorithm at scale to ensure effective transfer learning across diverse graph domains and modalities. Additionally, we adopt a Mixture of Experts (MoE) component to align features from different domains and modalities, ensuring coherent and robust embeddings that unify the information across modalities. Extensive experiments on a variety of multimodal graph tasks demonstrate that UniGraph2 significantly outperforms state-of-the-art models in tasks such as representation learning, transfer learning, and multimodal generative tasks, offering a scalable and flexible solution for learning on MMGs.
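The abstract outlines a three-stage pipeline: modality-specific encoders project each node's features into a shared space, a GNN propagates information over the graph structure, and a Mixture of Experts (MoE) aligns embeddings across domains and modalities. The following is a minimal NumPy sketch of that data flow, not the paper's implementation: the dimensions, random linear encoders (the paper uses pretrained modality encoders), mean-aggregation GNN layer, and two-expert softmax gate are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def linear(in_dim, out_dim):
    # Toy stand-in for a learned projection matrix.
    return rng.standard_normal((in_dim, out_dim)) * 0.1

# Hypothetical dimensions; real modality encoders would be pretrained
# models (e.g. a vision or text transformer), not random projections.
TEXT_DIM, IMG_DIM, HID = 16, 32, 8

W_text = linear(TEXT_DIM, HID)  # modality-specific encoder (text)
W_img = linear(IMG_DIM, HID)    # modality-specific encoder (image)

def encode(node_feats):
    """Project each node's modality features into the shared space."""
    out = []
    for modality, x in node_feats:
        W = W_text if modality == "text" else W_img
        out.append(x @ W)
    return np.stack(out)

def gnn_layer(H, A):
    """One mean-aggregation message-passing step over adjacency A."""
    deg = A.sum(axis=1, keepdims=True).clip(min=1)
    return np.tanh((A @ H) / deg + H)

def moe(H, n_experts=2):
    """Toy MoE: a softmax gate mixes per-expert projections per node."""
    experts = [linear(HID, HID) for _ in range(n_experts)]
    W_gate = linear(HID, n_experts)
    gate = np.exp(H @ W_gate)
    gate /= gate.sum(axis=1, keepdims=True)         # (N, E) mixing weights
    outs = np.stack([H @ E for E in experts], axis=1)  # (N, E, HID)
    return (gate[:, :, None] * outs).sum(axis=1)    # (N, HID)

# A tiny 3-node multimodal graph: two text nodes linked to one image node.
nodes = [("text", rng.standard_normal(TEXT_DIM)),
         ("image", rng.standard_normal(IMG_DIM)),
         ("text", rng.standard_normal(TEXT_DIM))]
A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)

Z = moe(gnn_layer(encode(nodes), A))
print(Z.shape)  # one unified HID-dimensional embedding per node
```

The point of the sketch is the interface, not the weights: every node, whatever its input modality or dimension, exits the pipeline as a vector in the same low-dimensional space, which is what makes cross-domain transfer and downstream tasks (search, classification) operate uniformly over the graph.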