Computer science
Relation extraction
Dependency (UML)
Relation (database)
Entity linking
Natural language processing
Sentence (linguistics)
Artificial intelligence
Information extraction
Information retrieval
Component (thermodynamics)
Semantic relation
Data mining
Knowledge base
Neuroscience
Cognition
Physics
Thermodynamics
Biology
Authors
Yangsheng Xu,Jiaxin Tian,Mingwei Tang,Linping Tao,Liuxuan Wang
Identifier
DOI:10.1016/j.csl.2023.101574
Abstract
Document-level Relation Extraction (DocRE) aims to extract relations between entities from documents. In contrast to sentence-level relation extraction, it requires extracting semantic relations that span multiple sentences, and the performance of existing algorithms on this task still needs to be improved. DocRE algorithms must handle more complex structural relationships between entities and must combine semantic information from different sentences when reasoning about relations between entities. Existing algorithms fail to infer relations between entities when dealing with such complex entity structures. In this paper, we propose an entity-mention deep attention framework that efficiently infers entity relations from entity structure and contextual information. First, a structural dependency module is designed to enable interaction between the different mentions of an entity. Second, a deep contextual attention component is proposed to enrich the semantic information between entities using entity-related contexts. Finally, a distance mapping component is used to handle entity pairs whose mentions are far apart in the document. Experimental results show that our model outperforms state-of-the-art models on three public datasets: DocRED, DGA, and CDR.
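To make the components described in the abstract more concrete, the following is a minimal, hypothetical PyTorch-style sketch of a mention-conditioned context attention layer combined with a distance embedding. It is not the authors' implementation: all names (MentionContextAttention, num_distance_buckets, the toy tensors) are illustrative assumptions, and the structural dependency module is omitted.

```python
# Minimal sketch of mention-conditioned context attention with a distance
# embedding, loosely mirroring the abstract's "deep contextual attention" and
# "distance mapping" ideas. NOT the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MentionContextAttention(nn.Module):
    """Aggregate document context for an entity pair via attention,
    using the pair's pooled mention representations as the query."""

    def __init__(self, hidden_dim: int, num_distance_buckets: int = 32):
        super().__init__()
        self.query_proj = nn.Linear(2 * hidden_dim, hidden_dim)
        # Distance embedding: stand-in for a distance mapping component that
        # injects the bucketed head-tail mention distance into the pair.
        self.distance_emb = nn.Embedding(num_distance_buckets, hidden_dim)
        self.classifier = nn.Linear(3 * hidden_dim, 1)  # relation score

    def forward(self, token_reps, head_mention, tail_mention, distance_bucket):
        # token_reps:      (seq_len, hidden_dim)  contextual token encodings
        # head_mention:    (hidden_dim,)          pooled head-entity mentions
        # tail_mention:    (hidden_dim,)          pooled tail-entity mentions
        # distance_bucket: scalar tensor          bucketed mention distance
        query = self.query_proj(torch.cat([head_mention, tail_mention], dim=-1))
        attn = F.softmax(token_reps @ query, dim=0)    # (seq_len,)
        context = attn @ token_reps                    # (hidden_dim,)
        dist = self.distance_emb(distance_bucket)      # (hidden_dim,)
        pair = torch.cat([head_mention + dist, tail_mention + dist, context], dim=-1)
        return self.classifier(pair)                   # relation logit


# Toy usage with random tensors standing in for an encoder's output.
hidden = 64
layer = MentionContextAttention(hidden)
tokens = torch.randn(128, hidden)
score = layer(tokens, torch.randn(hidden), torch.randn(hidden), torch.tensor(5))
print(score.shape)  # torch.Size([1])
```

In this sketch, the pooled head and tail mention representations form the attention query over the document tokens, and a bucketed distance embedding is added to both mention representations before scoring, so that context aggregation and entity-pair distance both influence the relation logit.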