Computer Science
Graph
Artificial Intelligence
Theoretical Computer Science
Knowledge Graph
Complex Network
Machine Learning
Authors
Peng Yan,Linjing Li,Daniel Zeng
Identifiers
DOI:10.1016/j.knosys.2021.107557
Abstract
Inspired by quantum-like phenomena in human language understanding, recent studies propose quantum probability-inspired neural networks that model natural language by treating words as superposition states and a sentence as a mixed state. However, many complex natural language processing tasks (e.g., emotion–cause pair extraction or joint dialog act recognition and sentiment classification) require modeling the complex, graphical interaction of multiple text pieces (e.g., multiple clauses in a document or multiple utterances in a dialog). Existing quantum probability-inspired neural networks only encode the sequential interaction of a sequence of words and cannot model such interactions between text pieces. To generalize the quantum framework from modeling word sequences to modeling complex, graphical text interaction, we propose a Quantum Probability-inspired Graph Attention NeTwork (QPGAT) that combines quantum probability and the graph attention mechanism in a unified framework. Specifically, a text interaction graph is first constructed to describe the interaction of text pieces. QPGAT then models each text node as a particle in a superposition state and each node's neighborhood in the graph as a mixed system in a mixed state, learning interaction-aware text node representations. We apply QPGAT to two important and complex NLP tasks, emotion–cause pair extraction and joint dialog act recognition and sentiment classification. Experimental results show that QPGAT is competitive with state-of-the-art methods on both tasks, demonstrating its effectiveness. Moreover, QPGAT provides a reasonable post-hoc explanation of the model's decision process for emotion–cause pair extraction.
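The abstract does not give the model's exact formulation, so the following is only a minimal NumPy sketch of the general idea it describes: each text node is represented as a unit-norm superposition vector, each node's neighborhood as a density matrix (mixed state) formed from an attention-weighted sum of neighbor outer products, and node representations are read out via measurement-style projections. The function name qp_graph_attention, the dot-product attention, and the learned projectors V are illustrative assumptions, not the authors' implementation or API.

```python
# Minimal sketch (not the authors' code) of one QPGAT-style aggregation step.
# Assumptions: real-valued node features, dot-product attention over graph
# neighbors, and rank-1 projectors standing in for a learned measurement.
import numpy as np

def qp_graph_attention(H, adj, n_projectors=4, rng=np.random.default_rng(0)):
    """H: (n_nodes, d) node features; adj: (n_nodes, n_nodes) 0/1 adjacency."""
    n, d = H.shape

    # 1. Superposition states: normalize each node feature to a unit vector.
    psi = H / (np.linalg.norm(H, axis=1, keepdims=True) + 1e-12)

    # 2. Attention weights over each node's neighborhood (masked softmax of
    #    dot products between node states).
    scores = psi @ psi.T
    scores = np.where(adj > 0, scores, -np.inf)
    alpha = np.exp(scores - scores.max(axis=1, keepdims=True))
    alpha = np.where(adj > 0, alpha, 0.0)
    alpha = alpha / (alpha.sum(axis=1, keepdims=True) + 1e-12)

    # 3. Mixed state: density matrix rho_i = sum_j alpha_ij |psi_j><psi_j|.
    rho = np.einsum('ij,jk,jl->ikl', alpha, psi, psi)          # (n, d, d)

    # 4. Measurement: p_ik = <v_k| rho_i |v_k> with unit projector vectors v_k;
    #    these probabilities form the interaction-aware node representation.
    V = rng.standard_normal((n_projectors, d))
    V = V / np.linalg.norm(V, axis=1, keepdims=True)
    probs = np.einsum('kd,ide,ke->ik', V, rho, V)               # (n, n_projectors)
    return probs

# Toy usage: 3 text nodes (e.g., clauses), 5-dim features, self-loops in adj.
H = np.random.default_rng(1).standard_normal((3, 5))
adj = np.array([[1, 1, 0], [1, 1, 1], [0, 1, 1]])
print(qp_graph_attention(H, adj).shape)  # (3, 4)
```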