Named entity recognition
Computer science
Preprocessor
Artificial intelligence
Natural language processing
Decoding
Word (group theory)
Adaptability
Feature (linguistics)
Domain (mathematical analysis)
Natural language understanding
Conditional random field
Semantic role labeling
Natural language
Decoding methods
Linguistics
Verdict
Ecology
Task (project management)
Management
Economy
Philosophy
Mathematical analysis
Biology
Telecommunications
Mathematics
Authors
Shulin Hu, Huajun Zhang, Xuesong Hu, Jinfu Du
Identifier
DOI:10.1109/icis54925.2022.9882514
Abstract
Named entity recognition (NER) is an important research direction in natural language processing (NLP). Traditional machine learning algorithms for NER suffer from low accuracy, heavy dependence on hand-crafted feature design, poor domain adaptability, and an inability to handle polysemous words across different contexts when recognizing Chinese entities. To address these problems, this paper adopts a BERT-CRF model for Chinese NER. The pre-trained BERT language model generates word vectors that encode contextual semantic information and automatically extracts rich word-level and semantic features from the text, while the CRF layer decodes these representations into entity tag sequences. The BERT model is fine-tuned to improve its performance on the NER task, and experimental verification on the People's Daily dataset shows that the method reaches an F1 score of 94.5%.
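The abstract only names the architecture, so the following is a minimal sketch of a BERT-CRF tagger of the kind described: BERT encodes each token into a contextual vector, a linear layer maps it to per-tag emission scores, and a CRF layer decodes the most likely tag sequence. This is not the authors' implementation; it assumes the Hugging Face `transformers` and `pytorch-crf` packages, and the model name, BIO tag set, and hyperparameters are illustrative assumptions.

```python
# Sketch of a BERT-CRF model for Chinese NER (illustrative, not the paper's code).
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast
from torchcrf import CRF

# Example BIO tag set; the paper's actual label scheme is not given in the abstract.
TAGS = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC", "B-ORG", "I-ORG"]

class BertCrfNer(nn.Module):
    def __init__(self, model_name: str = "bert-base-chinese", num_tags: int = len(TAGS)):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)   # pre-trained contextual encoder
        self.dropout = nn.Dropout(0.1)
        self.emission = nn.Linear(self.bert.config.hidden_size, num_tags)  # per-tag emission scores
        self.crf = CRF(num_tags, batch_first=True)           # sequence-level decoder

    def forward(self, input_ids, attention_mask, labels=None):
        hidden = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        emissions = self.emission(self.dropout(hidden))       # (batch, seq_len, num_tags)
        mask = attention_mask.bool()
        if labels is not None:
            # Training: negative log-likelihood of the gold tag sequence under the CRF.
            return -self.crf(emissions, labels, mask=mask, reduction="mean")
        # Inference: Viterbi decoding returns the best tag-id sequence per sentence.
        return self.crf.decode(emissions, mask=mask)

# Usage sketch (predictions also cover [CLS]/[SEP]; real code would strip special tokens).
tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")
model = BertCrfNer()
batch = tokenizer(["人民日报发表了一篇文章。"], return_tensors="pt", padding=True)
with torch.no_grad():
    pred_tag_ids = model(batch["input_ids"], batch["attention_mask"])
print([[TAGS[i] for i in seq] for seq in pred_tag_ids])
```

Fine-tuning, as mentioned in the abstract, would train the BERT weights jointly with the emission and CRF layers on the labeled NER corpus rather than freezing the encoder.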