Computer science
Artificial intelligence
Relation extraction
Training
Natural language processing
Extraction
Task
Process
Source
Journal: Communications in Computer and Information Science
Date: 2021-09-17
Volume/Issue: 116-125
Identifiers
DOI: 10.1007/978-981-16-5943-0_10
Abstract
Entity relation extraction (ERE) is an important task in the field of information extraction. With the wide application of pre-trained language models (PLMs) in natural language processing (NLP), using PLMs has become a new research direction for ERE. In this paper, BERT is used to extract entity relations, and a separated pipeline architecture is proposed: ERE is decomposed into an entity-relation classification sub-task and an entity-pair annotation sub-task, each of which conducts pre-training and fine-tuning independently. Combining dynamic and static masking, new Verb-MLM and Entity-MLM BERT pre-training tasks are put forward to strengthen the correlation between BERT pre-training and the targeted downstream NLP task, ERE. An inter-layer attention-sharing mechanism is added to the model, sharing attention parameters between layers according to the similarity of their attention matrices. Contrast experiments on the SemEval-2010 Task 8 dataset demonstrate that the new MLM tasks and the inter-layer attention-sharing mechanism effectively improve the performance of BERT on entity relation extraction.
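The abstract does not come with code, but the Entity-MLM idea can be illustrated with a minimal Python sketch, assuming a whitespace-tokenized sentence with known entity spans: the static part always masks entity tokens, while the dynamic part re-samples random positions on every call (i.e., per epoch). All names here (`entity_mlm_mask`, `dynamic_rate`) are illustrative, not taken from the paper; a Verb-MLM variant would mask verb positions instead.

```python
import random

MASK = "[MASK]"

def entity_mlm_mask(tokens, entity_spans, dynamic_rate=0.15, seed=None):
    """Hypothetical Entity-MLM masking step combining static and dynamic masking.

    Static part: tokens inside known entity spans are always masked,
    biasing the MLM objective toward entity mentions.
    Dynamic part: a fresh random sample of the remaining tokens is
    masked on each call (i.e., per epoch).
    """
    rng = random.Random(seed)
    entity_positions = {i for start, end in entity_spans for i in range(start, end)}
    masked = list(tokens)
    labels = [None] * len(tokens)  # original token where masked, else None

    for i in entity_positions:  # static: always mask entity tokens
        labels[i] = masked[i]
        masked[i] = MASK

    for i in range(len(tokens)):  # dynamic: random masking elsewhere
        if i not in entity_positions and rng.random() < dynamic_rate:
            labels[i] = masked[i]
            masked[i] = MASK
    return masked, labels

if __name__ == "__main__":
    sent = "the burst has been caused by water hammer pressure".split()
    masked, labels = entity_mlm_mask(sent, entity_spans=[(1, 2), (8, 9)], seed=0)
    print(masked)  # entity tokens "burst" and "pressure" are always [MASK]
```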
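Likewise, a hypothetical reading of the inter-layer attention-sharing mechanism: compare the attention matrices produced by different layers and let a layer reuse an earlier layer's attention parameters when the two maps are similar enough. The cosine-similarity measure, the greedy grouping, and the threshold below are assumptions for illustration, not the authors' exact procedure.

```python
import torch
import torch.nn.functional as F

def attention_similarity(attn_a: torch.Tensor, attn_b: torch.Tensor) -> float:
    """Cosine similarity between two attention maps of shape
    (heads, seq_len, seq_len), averaged over heads."""
    a = attn_a.flatten(start_dim=1)
    b = attn_b.flatten(start_dim=1)
    return F.cosine_similarity(a, b, dim=-1).mean().item()

def plan_shared_layers(attn_maps, threshold=0.9):
    """Greedy sharing plan: each layer whose attention map stays within
    `threshold` of the last retained layer reuses that layer's attention
    parameters; otherwise it keeps its own and becomes the new anchor.
    Returns (layer, source_layer) pairs describing the parameter tying."""
    plan, source = [], 0
    for layer in range(1, len(attn_maps)):
        if attention_similarity(attn_maps[source], attn_maps[layer]) >= threshold:
            plan.append((layer, source))  # tie this layer's attention to `source`
        else:
            source = layer  # dissimilar: keep independent parameters
    return plan

if __name__ == "__main__":
    torch.manual_seed(0)
    # random stand-ins for 12 layers of 8-head attention over 16 tokens
    maps = [torch.softmax(torch.randn(8, 16, 16), dim=-1) for _ in range(12)]
    print(plan_shared_layers(maps, threshold=0.8))
```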