Selection (genetic algorithm)
Turn (biochemistry)
Linguistic turn
Commonsense reasoning
Computer science
Language model
Artificial intelligence
Linguistics
Philosophy
Chemistry
Biochemistry
Authors
Yuandong Wang, Xuhui Ren, Tong Chen, Yuxiao Dong, Quoc Viet Hung Nguyen, Jie Tang
Source
Journal: Cornell University - arXiv
Date: 2024-07-25
Citations: 1
Identifier
DOI: 10.48550/arxiv.2407.18479
Abstract
As a branch of advanced artificial intelligence, dialogue systems are prospering. Multi-turn response selection is a general research problem in dialogue systems. With the assistance of background information and pre-trained language models, state-of-the-art methods on this problem have achieved impressive improvements. However, existing studies neglect the importance of external commonsense knowledge. Hence, we design SinLG, a Siamese network in which a pre-trained Language model merges with a Graph neural network. SinLG takes advantage of Pre-trained Language Models (PLMs) to capture the word correlations in the context and response candidates and utilizes a Graph Neural Network (GNN) to reason over helpful commonsense knowledge from an external knowledge graph. The GNN aims to assist the PLM during fine-tuning by arousing its related memories to attain better performance. Specifically, we first extract related concepts as nodes from an external knowledge graph to construct a subgraph, with the context-response pair as a super node, for each sample. Next, we learn two representations for the context-response pair via both the PLM and the GNN. A similarity loss between the two representations is utilized to transfer the commonsense knowledge from the GNN to the PLM. At inference time, only the PLM is used online, so efficiency is guaranteed. Finally, we conduct extensive experiments on two variants of the PERSONA-CHAT dataset, which prove that our solution not only improves the performance of the PLM but also achieves efficient inference.
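The knowledge-transfer step described in the abstract, where a similarity loss between the PLM's and the GNN's representations of the same context-response pair moves commonsense knowledge into the PLM, can be sketched as follows. This is a minimal NumPy sketch under stated assumptions: the abstract does not specify the similarity measure or how the loss terms are combined, so the cosine-based distance and the weighting factor `alpha` here are illustrative choices, not the paper's actual formulation.

```python
import numpy as np


def similarity_loss(z_plm: np.ndarray, z_gnn: np.ndarray) -> float:
    """Mean (1 - cosine similarity) over a batch of paired representations.

    z_plm, z_gnn: arrays of shape (batch, dim), one row per
    context-response pair, produced by the PLM and GNN branches.
    """
    dot = np.sum(z_plm * z_gnn, axis=1)
    norms = np.linalg.norm(z_plm, axis=1) * np.linalg.norm(z_gnn, axis=1)
    return float(np.mean(1.0 - dot / norms))


def total_loss(task_loss: float, z_plm: np.ndarray, z_gnn: np.ndarray,
               alpha: float = 0.5) -> float:
    """Combine the response-selection loss with the similarity term.

    `alpha` is a hypothetical weighting hyperparameter; the paper's
    actual combination is not given in the abstract.
    """
    return task_loss + alpha * similarity_loss(z_plm, z_gnn)
```

Because the similarity term only appears in training, the GNN branch can be dropped at inference: once the PLM's representations have been pulled toward the GNN's, the PLM alone carries the transferred commonsense signal, which is what makes online inference efficient.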