Computer science
Artificial intelligence
Convolutional neural network
Softmax function
Benchmark (surveying)
Sentiment analysis
Deep learning
Feature (linguistics)
Transfer learning
Layer (electronics)
Representation (politics)
Pattern recognition (psychology)
Machine learning
Natural language processing
Geodesy
Organic chemistry
Chemistry
Philosophy
Law
Geography
Politics
Linguistics
Political science
Author
Eniafe Festus Ayetiran
Identifier
DOI:10.1016/j.knosys.2022.109409
Abstract
Deep neural network (DNN) techniques for aspect-based sentiment classification have been widely studied. The success of these methods depends largely on training data, which are often inadequate because of the rigor involved in manually tagging large collections of opinionated texts. Attempts have been made to transfer knowledge from document-level to aspect-level sentiment tasks. However, the success of this approach also depends on the model, because aspect sentiment data, like other types of text, contain complex semantic features. In this paper, we present an attention-based deep learning technique which jointly learns on document- and aspect-level sentiment data and which also transfers learning from the document-level data to aspect-level sentiment classification. It consists essentially of a convolutional layer and a bidirectional long short-term memory (BiLSTM) layer. The first variant of our technique uses a convolutional neural network (CNN) to extract high-level semantic features; the output of the feature extraction is then fed into the BiLSTM layer, which captures the contextual feature representation of the texts. The second variant applies the BiLSTM layer directly to the input data. In both variants, the output hidden representation is passed to an output layer with a softmax activation function for sentiment polarity classification. We evaluate our model on four standard benchmark datasets, and the results show the effectiveness of our approach, with improvements over baselines. We also conduct ablation studies to show the effect of different document-level weights on the learning techniques.
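The pipeline described for the first variant (CNN feature extraction, then a BiLSTM, then a softmax output layer) can be sketched as follows. This is a minimal illustrative skeleton, not the authors' implementation; the vocabulary size, embedding dimension, kernel width, hidden size, and the choice of the final time step's hidden state for classification are all assumptions made for the example.

```python
import torch
import torch.nn as nn

class CnnBiLstmClassifier(nn.Module):
    """Hypothetical sketch of the first variant:
    CNN feature extraction -> BiLSTM -> softmax output layer.
    All layer sizes are illustrative assumptions."""

    def __init__(self, vocab_size=1000, emb_dim=50, conv_channels=64,
                 hidden_dim=32, num_classes=3):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        # 1-D convolution over the token sequence extracts high-level
        # semantic features (kernel width 3; padding keeps length fixed)
        self.conv = nn.Conv1d(emb_dim, conv_channels, kernel_size=3, padding=1)
        # BiLSTM captures the contextual feature representation
        self.bilstm = nn.LSTM(conv_channels, hidden_dim,
                              batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):
        x = self.embedding(token_ids)                 # (B, T, emb_dim)
        x = torch.relu(self.conv(x.transpose(1, 2)))  # (B, C, T)
        x, _ = self.bilstm(x.transpose(1, 2))         # (B, T, 2*hidden_dim)
        # classify from the hidden representation at the final time step
        logits = self.out(x[:, -1, :])
        return torch.softmax(logits, dim=-1)          # polarity probabilities

model = CnnBiLstmClassifier()
probs = model(torch.randint(0, 1000, (2, 12)))  # batch of 2 texts, length 12
```

The second variant would simply drop the convolutional layer and feed the embedded tokens straight into the BiLSTM.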