Computer science
Sentiment analysis
Transformer
Artificial intelligence
Security token
Machine learning
Natural language processing
Computer security
Quantum mechanics
Physics
Voltage
Authors
Pengfei Li, Peixiang Zhong, Jiaheng Zhang, Kezhi Mao
Identifier
DOI:10.1109/ijcnn48605.2020.9206796
Abstract
Given certain data available for training, the keys to improving a sentiment analysis system lie in developing a good model that is capable of capturing both local and global features of texts, as well as incorporating external knowledge into the model effectively. In this paper, we propose a multi-window Convolutional Transformer (ConvTransformer) that takes advantage of both Transformer and CNN for sentiment analysis. The proposed ConvTransformer is able to capture important local n-gram features effectively while preserving the sequential information of texts. Furthermore, we propose a sentiment-aware attention mechanism that incorporates the sentiment intensity information of each word by utilizing an external knowledge base, SentiWordNet. The sentiment-aware attention mechanism takes both the sentiment and position information of each token into consideration when computing attention weights, resulting in a global feature for final classification. Compared with CNN, RNN, and attention-based baseline models, our model achieves the best performance on multiple sentiment analysis datasets.
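The abstract does not give the exact formulation, but one plausible reading of a sentiment-aware attention mechanism is to bias the usual scaled dot-product attention logits with a per-token sentiment intensity (e.g. looked up from SentiWordNet) and a position term before the softmax. The sketch below is only an illustration under that assumption; the function name, the additive combination, and the lam_sent / lam_pos weights are hypothetical and not taken from the paper.

import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def sentiment_aware_attention(query, keys, values, sentiment, positions,
                              lam_sent=1.0, lam_pos=0.1):
    """Hypothetical sketch: bias scaled dot-product attention logits with a
    per-token sentiment intensity and a position term before the softmax.

    query:     (d,)   pooled query vector (e.g. a learned classification query)
    keys:      (T, d) token key vectors from the encoder
    values:    (T, d) token value vectors from the encoder
    sentiment: (T,)   per-token intensity, e.g. |pos_score - neg_score| from SentiWordNet
    positions: (T,)   normalized token positions in [0, 1]
    """
    d = keys.shape[-1]
    logits = keys @ query / np.sqrt(d)                            # standard scaled dot-product scores
    logits = logits + lam_sent * sentiment + lam_pos * positions  # assumed additive sentiment/position bias
    weights = softmax(logits)                                     # attention distribution over tokens
    return weights @ values, weights                              # weighted sum serves as the global feature

# Toy usage with random encoder outputs
T, d = 6, 8
rng = np.random.default_rng(0)
global_feat, attn = sentiment_aware_attention(
    rng.normal(size=d),
    rng.normal(size=(T, d)),
    rng.normal(size=(T, d)),
    sentiment=np.array([0.0, 0.9, 0.1, 0.0, 0.7, 0.0]),  # e.g. intensities looked up from SentiWordNet
    positions=np.linspace(0.0, 1.0, T),
)
print(attn)  # attention tends to shift toward sentiment-bearing tokens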