Relation extraction
Computer science
Transformer
Relation (database)
Artificial intelligence
Autoregressive model
Encoding
Exploit
Dependency (UML)
Encoder
Pattern recognition (psychology)
Data mining
Machine learning
Natural language processing
Identifier
DOI:10.1109/ijcnn55064.2022.9892140
Abstract
Named entity recognition and relation extraction are two important tasks in information extraction. Many recent works model the two tasks jointly and achieve great success. However, these methods still suffer from insufficient relation semantics, head-entity dependency, and difficulty in detecting nested entities. To address these challenges, we propose a relation-aware span-level transformer network (RSTN), which contains a span-level encoder for entity recognition and a non-autoregressive decoder for relation extraction. Specifically, we generate explicit representations for all candidate spans to extract overlapping entities in our span-level encoder. In addition, we encode relation semantics in our non-autoregressive decoder and exploit a copy mechanism to extract head entities and tail entities simultaneously by modifying the causal attention mask. Through a span-level multi-head attention mechanism, we enhance the interaction between entity recognition and relation extraction in our model. We evaluate our model on three public datasets: ACE05, ADE and SciERC. Experimental results show that the proposed model outperforms previous strong baseline methods.
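Two ideas in the abstract can be made concrete with a minimal sketch (this is not the authors' implementation; function names and the `max_width` parameter are illustrative assumptions): a span-level encoder scores every contiguous span up to a maximum width, so nested entities can be detected independently; and a non-autoregressive decoder replaces the lower-triangular causal attention mask with a full mask, letting head and tail entities be predicted simultaneously rather than left-to-right.

```python
def enumerate_spans(tokens, max_width=4):
    """Return all (start, end) spans (end exclusive) up to max_width tokens.

    Scoring every span independently lets nested entities such as
    "New York" inside "New York University" both be candidates.
    """
    spans = []
    for start in range(len(tokens)):
        for end in range(start + 1, min(start + max_width, len(tokens)) + 1):
            spans.append((start, end))
    return spans


def attention_mask(n, causal=True):
    """Build an n x n attention mask (1 = may attend, 0 = blocked).

    causal=True:  position i attends only to positions j <= i
                  (standard autoregressive decoding).
    causal=False: full mask, every position attends everywhere
                  (non-autoregressive decoding, as the abstract's
                  modified causal mask allows).
    """
    return [[1 if (not causal or j <= i) else 0 for j in range(n)]
            for i in range(n)]


tokens = ["New", "York", "University"]
spans = enumerate_spans(tokens, max_width=3)
# Both the nested span (0, 2) -> "New York" and the outer
# span (0, 3) -> "New York University" are candidates.
assert (0, 2) in spans and (0, 3) in spans

# Causal: the first position sees only itself; non-autoregressive:
# every position sees the whole sequence at once.
assert attention_mask(3)[0] == [1, 0, 0]
assert attention_mask(3, causal=False)[0] == [1, 1, 1]
```

The quadratic number of candidate spans is why span-level models usually cap the span width; the full attention mask is what makes one-shot (head, tail) extraction possible in a single decoder pass.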