Computer science
Node (physics)
Embedding
Kernel (algebra)
Feature learning
Representation (politics)
Convolution (computer science)
Theoretical computer science
Feature (linguistics)
Feature vector
Artificial intelligence
Vertex (graph theory)
Graph kernel
Pattern recognition (psychology)
Kernel method
Graph
Artificial neural network
Mathematics
Support vector machine
Kernel embedding of distributions
Philosophy
Structural engineering
Combinatorics
Politics
Law
Political science
Linguistics
Engineering
Authors
Bo Zhang,Xiaoming Zhang,Feiran Huang,Ming Lu,Shuai Ma
Identifier
DOI:10.1109/tkde.2022.3153053
Abstract
This paper concerns the problem of network embedding (NE), whose aim is to learn a low-dimensional representation for each node in a network. We provide a new way to address the sparsity problem, in which most nodes, including newly arrived nodes, carry little information about the network. A novel paradigm is proposed to integrate multiple sources of information from the subgraph covering the target node, rather than from the target node alone. In particular, to express distinctive features over the vertex domain, a probability distribution over the subgraph space is constructed for each node. Compared to latent representation vectors, this distribution expresses distinctive characteristics and features in a higher dimension more effectively. One of the primary goals of this paradigm is to define a convolution operation over these distributions that is efficient to evaluate and learn. Experiments on four real-world network datasets demonstrate that our approach significantly outperforms state-of-the-art methods, especially for representation learning on nodes newly joining the network.
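The abstract only outlines the idea of representing each node by a distribution over its surrounding subgraphs and then convolving those distributions. The snippet below is a minimal, hypothetical sketch of that general idea, not the authors' algorithm: the random-walk subgraph sampler, the degree-sequence signature, and the neighbor-averaging "convolution" are all assumptions introduced for illustration.

```python
# Hypothetical sketch: approximate a per-node distribution over local subgraph
# "signatures" via short random walks, then mix a node's distribution with its
# neighbors' as a crude stand-in for a convolution over distributions.
import random
from collections import Counter


def sample_subgraph_signature(adj, node, walk_len=3):
    """Return a hashable signature of a short random walk rooted at `node`."""
    walk = [node]
    for _ in range(walk_len):
        nbrs = adj.get(walk[-1], [])
        if not nbrs:
            break
        walk.append(random.choice(nbrs))
    # Toy structural signature: sorted degrees of the visited nodes.
    return tuple(sorted(len(adj.get(v, [])) for v in set(walk)))


def node_distribution(adj, node, n_samples=200):
    """Empirical distribution over sampled subgraph signatures around `node`."""
    counts = Counter(sample_subgraph_signature(adj, node) for _ in range(n_samples))
    total = sum(counts.values())
    return {sig: c / total for sig, c in counts.items()}


def convolve_with_neighbors(adj, dists, node):
    """Average a node's distribution with its neighbors' distributions."""
    group = [node] + adj.get(node, [])
    mixed = Counter()
    for v in group:
        for sig, p in dists[v].items():
            mixed[sig] += p / len(group)
    return dict(mixed)


if __name__ == "__main__":
    # Tiny example graph as an adjacency-list dict (hypothetical data).
    adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
    dists = {v: node_distribution(adj, v) for v in adj}
    # Even the sparsely connected node 3 gets a richer representation
    # once its neighbors' distributions are mixed in.
    print(convolve_with_neighbors(adj, dists, 3))
```

The point of the sketch is only to show why a distribution over local structure can say more about a sparsely connected or newly arrived node than its own features alone; the paper's actual construction of the subgraph space and its convolution operator are defined in the full text.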