Authors
Xiaoni Yang,Yuelei Xiao
Identifier
DOI:10.1109/icnlp55136.2022.00035
Abstract
In Chinese named entity recognition, traditional algorithm models suffer from the ambiguity of expressive words and the singleness of word vectors, so their training results are poor. To address this problem, a BERT-MBiGRU-CRF model was proposed to increase the accuracy of Named Entity Recognition (NER). This model replaces the Bidirectional Long Short-Term Memory (BiLSTM) network in the BERT-BiLSTM-CRF model with a Multilayer Bidirectional Gated Recurrent Unit (MBiGRU) network, which extracts global context semantic features more effectively. Building on this model, a BERT-MBiGRU-MS-CRF model was then proposed, which adds a Multi-Head Self-attention (MS) layer after the MBiGRU layer to efficiently extract multiple semantic features and to compensate for MBiGRU's weakness in capturing local features. Finally, experimental results on the MSRA dataset show that the training time of the two models is significantly lower than that of the BERT-BiLSTM-CRF model, while their accuracy, recall and F1 values are significantly higher, reaching 97.26%, 97.53% and 97.39% respectively.
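The architecture described above can be sketched in PyTorch. This is a minimal illustration, not the authors' implementation: it assumes BERT token embeddings (dimension 768) are precomputed, the hidden size, layer count, head count, and tag count are hypothetical placeholders, and the final CRF decoding layer is replaced by a plain linear emission layer for brevity.

```python
import torch
import torch.nn as nn

class MBiGRUAttentionTagger(nn.Module):
    """Sketch of the MBiGRU + multi-head self-attention stack.

    Assumes inputs are precomputed BERT embeddings of shape
    (batch, seq_len, 768); all hyperparameters are illustrative.
    """

    def __init__(self, emb_dim=768, hidden=128, num_layers=2, heads=4, num_tags=7):
        super().__init__()
        # Multilayer bidirectional GRU: extracts global context features
        self.bigru = nn.GRU(emb_dim, hidden, num_layers=num_layers,
                            bidirectional=True, batch_first=True)
        # Multi-head self-attention over the BiGRU outputs (the "MS" layer)
        self.attn = nn.MultiheadAttention(2 * hidden, heads, batch_first=True)
        # Per-token tag scores; a CRF layer would normally decode these
        self.emission = nn.Linear(2 * hidden, num_tags)

    def forward(self, x):
        h, _ = self.bigru(x)        # (batch, seq_len, 2 * hidden)
        a, _ = self.attn(h, h, h)   # self-attention: query = key = value = h
        return self.emission(a)     # (batch, seq_len, num_tags)

model = MBiGRUAttentionTagger()
x = torch.randn(2, 10, 768)        # 2 sentences, 10 tokens, BERT-sized vectors
scores = model(x)
print(scores.shape)                # torch.Size([2, 10, 7])
```

In the paper's full model, the emission scores would feed a CRF layer that enforces valid tag transitions during decoding.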