Ricci Curvature-Based Graph Sparsification for Continual Graph Representation Learning

Computer science, Theoretical computer science, Graph, Curvature, Computation, Artificial intelligence, Topology (circuits), Machine learning, Algorithm, Mathematics, Combinatorics, Geometry
Authors
Xikun Zhang, Dongjin Song, Dacheng Tao
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems (Institute of Electrical and Electronics Engineers)
Pages: 1-13 · Cited by: 1
Identifier
DOI: 10.1109/tnnls.2023.3303454
Abstract

Memory replay, which stores a subset of historical data from previous tasks to replay while learning new tasks, exhibits state-of-the-art performance for various continual learning applications on Euclidean data. While topological information plays a critical role in characterizing graph data, existing memory replay-based graph learning techniques only store individual nodes for replay and do not consider their associated edge information. To this end, based on the message-passing mechanism in graph neural networks (GNNs), we present a Ricci curvature-based graph sparsification technique to perform continual graph representation learning. Specifically, we first develop the subgraph episodic memory (SEM) to store topological information in the form of computation subgraphs. Next, we sparsify the subgraphs so that they only contain the most informative structures (nodes and edges). The informativeness is evaluated with the Ricci curvature, a theoretically justified metric for estimating the contribution of neighbors to representing a target node. In this way, we can reduce the memory consumption of a computation subgraph from O(d^L) to O(1) and enable GNNs to fully utilize the most informative topological information for memory replay. Besides, to ensure applicability to large graphs, we also provide a theoretically justified surrogate for the Ricci curvature in the sparsification process, which greatly facilitates the computation. Finally, our empirical studies show that SEM significantly outperforms state-of-the-art approaches on four different public datasets. Unlike existing methods, which mainly focus on the task incremental learning (task-IL) setting, SEM also succeeds in the challenging class incremental learning (class-IL) setting, in which the model is required to distinguish all learned classes without task indicators, and even achieves performance comparable to joint training, the upper bound for continual learning.
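As a rough illustration of the curvature-guided sparsification idea (not the authors' implementation), the sketch below scores each edge of a node's 1-hop computation subgraph with the simple Forman curvature F(u, v) = 4 - deg(u) - deg(v) for unweighted graphs, used here as a hypothetical stand-in for the Ricci curvature estimate and surrogate described in the abstract, and keeps only the top-ranked neighbors under a fixed memory budget.

# Illustrative sketch only: curvature-guided sparsification of a 1-hop
# computation subgraph. The scoring function is the simple Forman curvature
# F(u, v) = 4 - deg(u) - deg(v) for unweighted graphs, a hypothetical
# stand-in for the paper's Ricci-curvature estimate and its surrogate.
from collections import defaultdict
from typing import Dict, List, Set


def forman_curvature(adj: Dict[int, Set[int]], u: int, v: int) -> int:
    """Forman curvature of edge (u, v) on an unweighted, undirected graph."""
    return 4 - len(adj[u]) - len(adj[v])


def sparsify_neighbors(adj: Dict[int, Set[int]], target: int, budget: int) -> List[int]:
    """Keep at most `budget` neighbors of `target`, ranked by edge curvature.

    Higher curvature is treated here as "more informative"; the actual
    ranking criterion in the paper may differ.
    """
    scored = [(forman_curvature(adj, target, nbr), nbr) for nbr in adj[target]]
    scored.sort(reverse=True)  # highest-curvature edges first
    return [nbr for _, nbr in scored[:budget]]


if __name__ == "__main__":
    # Tiny example graph stored as an adjacency map.
    edges = [(0, 1), (0, 2), (0, 3), (1, 2), (2, 3), (3, 4), (4, 5)]
    adj: Dict[int, Set[int]] = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)

    # Sparsified 1-hop computation subgraph of node 0 with a budget of 2 neighbors.
    print(sparsify_neighbors(adj, target=0, budget=2))

In the full method the selection would be applied over the entire L-hop computation subgraph (whose size grows roughly as O(d^L) for average degree d and L GNN layers), which is what motivates storing only a constant-size sparsified subgraph; the stand-in above only shows the budgeted, curvature-ranked selection step.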