Authors
WU Zhi-min, Shiwen Wang
Identifier
DOI:10.1109/icaica50127.2020.9181913
Abstract
Convolutional neural networks (CNNs) and recurrent neural networks (RNNs) are widely used in natural language processing. Natural language, however, exhibits structural dependencies: a CNN alone tends to ignore the semantic and grammatical relations between words, while traditional RNNs suffer from vanishing or exploding gradients. To address this, the paper designs a CNN-BLSTM network through network-level combination, introduces an attention mechanism, and proposes a CNN-BLSTM+Attention model. The fused model handles the position invariance of local features and extracts effective local feature information; the attention mechanism then automatically weights the output sequence at each time step, reducing the loss of key features that occurs when RNNs model sequential features. Feature extraction in both time and space is thus achieved. Experimental results show that the accuracy of the proposed model is 3 to 4 percentage points higher than that of the compared models. When processing alarm text, the model not only preserves the local correlation of the data but also strengthens its ability to combine sequence features effectively.
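The attention step the abstract describes — automatically weighting the recurrent network's output at each time step and pooling it into a summary vector — can be sketched as follows. This is a minimal NumPy illustration of additive attention over a sequence of hidden states, not the authors' implementation; `W` and `v` stand in for parameters that would be learned in the actual model.

```python
import numpy as np

def attention_pool(H, W, v):
    """Weight each time step of a sequence of hidden states and pool them.

    H: (T, d) per-time-step features (e.g. BLSTM outputs over T tokens).
    W: (d, d) projection and v: (d,) scoring vector -- hypothetical
       stand-ins for learned attention parameters.
    Returns the attention-weighted summary vector and the weights.
    """
    scores = np.tanh(H @ W) @ v                    # one score per time step
    scores = scores - scores.max()                 # numerical stability
    alpha = np.exp(scores) / np.exp(scores).sum()  # softmax over time steps
    context = alpha @ H                            # weighted sum: shape (d,)
    return context, alpha

# Toy example: 5 time steps of 8-dimensional features.
rng = np.random.default_rng(0)
T, d = 5, 8
H = rng.standard_normal((T, d))
W = rng.standard_normal((d, d)) * 0.1
v = rng.standard_normal(d)
context, alpha = attention_pool(H, W, v)
```

Because `alpha` is a softmax over time steps, the weights are positive and sum to one, so `context` is a convex combination of the hidden states; time steps carrying key features receive larger weights instead of being averaged away.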