Computer science
Dependency (UML)
Parsing
Context (archaeology)
Artificial intelligence
Sentence
Benchmark (surveying)
Syntax
Natural language processing
Exploit
Encoding
Semantics (computer science)
Relation (database)
Sentiment analysis
Dependency grammar
Data mining
Programming language
Paleontology
Biochemistry
Chemistry
Computer security
Geodesy
Gene
Biology
Geography
Authors
Pengfei Chen, Biqing Zeng, Yuwu Lu, Yun Xue, Fan Fei, Mayi Xu, Lingcong Feng
Identifier
DOI:10.1016/j.csl.2023.101616
Abstract
Aspect-level sentiment analysis (ALSA) aims to extract the polarity of different aspect terms in a sentence. Previous works leveraging traditional dependency syntax parsing trees (DSPT) to encode contextual syntactic information have obtained state-of-the-art results. However, these works may not learn fine-grained syntactic knowledge efficiently, making it difficult for them to take advantage of local context. Furthermore, they fail to exploit the dependency relations in the DSPT sufficiently. To solve these problems, we propose a novel method, named LCSA, that enhances local knowledge through two extensions: a Local Context Network based on Proximity Values (LCPV) and Syntax-clusters Attention (SCA). LCPV first obtains induced trees from pre-trained models and generates syntactic proximity values between each context word and the aspect to adaptively determine the extent of the local context. Our improved SCA further extracts fine-grained knowledge: it not only focuses on the clusters essential for the target aspect term but also guides the model to learn the essential words inside each cluster of the DSPT. Extensive experimental results on multiple benchmark datasets demonstrate that LCSA is highly robust and achieves state-of-the-art performance for ALSA.
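The abstract describes deriving syntactic proximity values between context words and the aspect term to delimit local context. The paper's exact formulation is not given here, so the sketch below is only a minimal illustration of the general idea: measure tree distance from each word to the aspect in a dependency tree, map distance to a weight, and zero out words beyond a distance threshold. All names (`tree_distances`, `proximity_values`, `threshold`) and the 1/(1+d) weighting are assumptions, not the authors' method.

```python
from collections import defaultdict, deque

def tree_distances(edges, n, source):
    """BFS distances from `source` over an undirected dependency tree.

    `edges` is a list of (head, dependent) token-index pairs; `n` is the
    number of tokens. Unreachable tokens keep distance None.
    """
    adj = defaultdict(list)
    for head, dep in edges:
        adj[head].append(dep)
        adj[dep].append(head)
    dist = [None] * n
    dist[source] = 0
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if dist[v] is None:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def proximity_values(edges, n, aspect_idx, threshold=3):
    """Map each token's tree distance d to a weight 1/(1+d) in (0, 1];
    tokens farther than `threshold` hops (or disconnected) get weight 0,
    i.e. they fall outside the local context of the aspect."""
    dist = tree_distances(edges, n, aspect_idx)
    return [1.0 / (1 + d) if d is not None and d <= threshold else 0.0
            for d in dist]

# Toy parse of "The food was great but service slow"
# indices: 0:The 1:food 2:was 3:great 4:but 5:service 6:slow
edges = [(3, 1), (1, 0), (3, 2), (3, 4), (6, 5), (3, 6)]
weights = proximity_values(edges, 7, aspect_idx=1, threshold=2)
```

In this toy example "service" sits three hops from the aspect "food", so it is excluded from the local context, while "great" (one hop away) receives a high weight.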