Computer science
Scalability
Graph
Sentiment analysis
Graph embedding
Embedding
Theoretical computer science
Smoothing
Artificial intelligence
Node (physics)
Machine learning
Data mining
Computer vision
Structural engineering
Database
Engineering
Authors
Youkai Jin, Anping Zhao
Identifier
DOI:10.1007/s40747-023-01289-9
Abstract
Numerous graph neural network (GNN) models have been used for sentiment analysis in recent years. Nevertheless, addressing the issue of over-smoothing in GNNs for node representation and finding more effective ways to learn both global and local information within the graph structure, while improving model efficiency for scalability to large text sentiment corpora, remains a challenge. To tackle these issues, we propose a novel Bert-based unlinked graph embedding (BUGE) model for sentiment analysis. Initially, the model constructs a comprehensive text sentiment heterogeneous graph that more effectively captures global co-occurrence information between words. Next, by using specific sampling strategies, it efficiently preserves both global and local information within the graph structure, enabling nodes to receive more feature information. During the representation learning process, BUGE relies solely on attention mechanisms, without using graph convolutions or aggregation operators, thus avoiding the over-smoothing problem associated with node aggregation. This enhances model training efficiency and reduces memory storage requirements. Extensive experimental results and evaluations demonstrate that the adopted Bert-based unlinked graph embedding method is highly effective for sentiment analysis, especially when applied to large text sentiment corpora.
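The abstract states that the model builds a text sentiment heterogeneous graph capturing global word co-occurrence. A common way to obtain such word-word edges is sliding-window co-occurrence scored with pointwise mutual information (PMI), as popularized by TextGCN-style graphs. The sketch below illustrates only that generic construction; the window size, threshold, and the exact graph-building procedure used by BUGE are assumptions, not details taken from the paper.

```python
# Hypothetical sketch: PMI-scored word co-occurrence edges for a text graph.
# This is a generic TextGCN-style construction, not BUGE's exact procedure.
import math
from collections import Counter
from itertools import combinations


def build_cooccurrence_edges(docs, window_size=10, pmi_threshold=0.0):
    """Return (word_i, word_j, pmi) edges from sliding-window co-occurrence."""
    window_count = 0
    word_windows = Counter()   # number of windows containing each word
    pair_windows = Counter()   # number of windows containing each word pair

    for doc in docs:
        tokens = doc.split()
        # Slide a fixed-size window over the token sequence of each document.
        windows = [tokens[i:i + window_size]
                   for i in range(max(1, len(tokens) - window_size + 1))]
        for win in windows:
            window_count += 1
            uniq = set(win)
            word_windows.update(uniq)
            pair_windows.update(frozenset(p) for p in combinations(sorted(uniq), 2))

    edges = []
    for pair, n_ij in pair_windows.items():
        w_i, w_j = tuple(pair)
        # PMI(i, j) = log( p(i, j) / (p(i) * p(j)) ), estimated over windows.
        pmi = math.log((n_ij / window_count) /
                       ((word_windows[w_i] / window_count) *
                        (word_windows[w_j] / window_count)))
        if pmi > pmi_threshold:
            # Keep only positively associated word pairs as graph edges.
            edges.append((w_i, w_j, pmi))
    return edges


if __name__ == "__main__":
    corpus = [
        "the movie was great and the acting was great",
        "the plot was dull and the pacing was slow",
    ]
    for edge in build_cooccurrence_edges(corpus, window_size=5)[:5]:
        print(edge)
```

Document-word edges (e.g., TF-IDF weighted) would typically be added alongside these word-word edges to complete the heterogeneous graph, but that step, like the rest of BUGE's sampling and attention-only encoding, is not specified in the abstract.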