Computer science
Encoding
Dependency (UML)
Dependency graph
Artificial intelligence
Sentence
Merge (version control)
Graph
Natural language processing
Theoretical computer science
Information retrieval
Biochemistry
Chemistry
Gene
Authors
Donghao Zhang, Zhenyu Liu, Weiqiang Jia, Fei Wu, Hui Liu, Jianrong Tan
Identifier
DOI: 10.1109/tkde.2023.3289879
Abstract
Dependency-based models are widely used to extract semantic relations from text. Most existing dependency-based models build stacked structures to merge contextual and dependency information, encoding the contextual information first and the dependency information afterwards. However, this unidirectional information flow weakens the word representations in a sentence, which in turn restricts the performance of existing models. To establish a bidirectional information flow, a dual attention graph convolutional network (DAGCN) with a parallel structure is proposed. Most importantly, DAGCN builds multi-turn interactions between contextual and dependency information to imitate the multi-turn looking-back behavior of human readers. In addition, a multi-layer adjacency matrix-aware multi-head attention (AMAtt) mechanism, comprising context-to-dependency attention and dependency-to-context attention, is carefully designed as the merge mechanism in the parallel structure to preserve the structural information of sentences and dependency trees during the interactions. Furthermore, DAGCN is evaluated on the popular PubMed, TACRED and SemEval 2010 Task 8 datasets to demonstrate its validity. Experimental results show that our model outperforms the existing dependency-based models.
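The abstract describes the architecture only at a high level. Below is a minimal PyTorch sketch, not the authors' code, of the general ideas it names: a parallel contextual/dependency structure, attention scores restricted by the dependency adjacency matrix, and multi-turn interaction between the two branches. The class names (AdjacencyAwareAttention, GCNLayer, ParallelInteractionLayer), the residual merge, the degree normalization, and all dimensions are illustrative assumptions rather than the paper's actual AMAtt implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdjacencyAwareAttention(nn.Module):
    """Multi-head attention whose scores are masked by a dependency adjacency matrix."""

    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        assert dim % heads == 0
        self.heads, self.head_dim = heads, dim // heads
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        self.out_proj = nn.Linear(dim, dim)

    def forward(self, query, key_value, adj):
        # query:     (batch, n, dim)  states of one branch (e.g. contextual encoder)
        # key_value: (batch, n, dim)  states of the other branch (e.g. GCN over the tree)
        # adj:       (batch, n, n)    adjacency matrix with self-loops (1 = edge)
        b, n, _ = query.shape
        q = self.q_proj(query).view(b, n, self.heads, self.head_dim).transpose(1, 2)
        k = self.k_proj(key_value).view(b, n, self.heads, self.head_dim).transpose(1, 2)
        v = self.v_proj(key_value).view(b, n, self.heads, self.head_dim).transpose(1, 2)
        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5      # (b, heads, n, n)
        scores = scores.masked_fill(adj.unsqueeze(1) == 0, float("-inf"))
        out = (torch.softmax(scores, dim=-1) @ v).transpose(1, 2).reshape(b, n, -1)
        return self.out_proj(out)


class GCNLayer(nn.Module):
    """One graph convolution over the dependency tree: degree-normalized ReLU(A H W)."""

    def __init__(self, dim: int):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, h, adj):
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        return F.relu(self.linear(adj @ h)) / deg


class ParallelInteractionLayer(nn.Module):
    """One interaction 'turn' between the contextual and dependency branches."""

    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.gcn = GCNLayer(dim)
        self.ctx2dep = AdjacencyAwareAttention(dim, heads)  # context attends to dependency states
        self.dep2ctx = AdjacencyAwareAttention(dim, heads)  # dependency attends to contextual states

    def forward(self, ctx, dep, adj):
        dep = self.gcn(dep, adj)
        new_ctx = ctx + self.ctx2dep(ctx, dep, adj)   # residual merge of dependency information
        new_dep = dep + self.dep2ctx(dep, ctx, adj)   # residual merge of contextual information
        return new_ctx, new_dep


if __name__ == "__main__":
    b, n, dim = 2, 7, 64
    tokens = torch.randn(b, n, dim)                 # stand-in for BiLSTM/BERT token states
    adj = torch.eye(n).expand(b, n, n).clone()      # self-loops keep every softmax row defined
    adj[:, 0, 1] = adj[:, 1, 0] = 1.0               # one toy dependency edge
    ctx = dep = tokens
    layer = ParallelInteractionLayer(dim)
    for _ in range(3):                              # "multi-turn" interaction
        ctx, dep = layer(ctx, dep, adj)
    print(ctx.shape, dep.shape)
```

The self-loops in the adjacency matrix keep every attention row defined, and the residual additions are one simple way to let each branch accumulate information from the other across turns; the paper's actual merge mechanism may differ.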