Computer science
Graph
Utterance
Encoder
Transformer
Artificial neural network
Joint
Construct (Python library)
Artificial intelligence
Theoretical computer science
Computer network
Operating system
Physics
Engineering
Voltage
Construction engineering
Quantum mechanics
Authors
Pengfei Wei, Bi Zeng, Wenxiong Liao
Abstract
Intent detection and slot filling are recognized as two very important tasks in a spoken language understanding (SLU) system. To model these two tasks jointly, many joint models based on deep neural networks have been proposed recently and have achieved excellent results. In addition, graph neural networks have achieved strong results in the field of computer vision. We therefore combine these two advantages and propose a new joint model with a wheel-graph attention network (Wheel-GAT), which is able to model the interrelated connections between intent detection and slot filling directly. To construct a graph structure over utterances, we create intent nodes, slot nodes, and directed edges. The intent node can provide utterance-level semantic information for slot filling, while slot nodes can provide local keyword information for intent detection. The two tasks promote each other and are trained end-to-end simultaneously. Experiments show that our proposed approach outperforms multiple baselines on the ATIS and SNIPS datasets. Moreover, we demonstrate that using the Bidirectional Encoder Representations from Transformers (BERT) model further boosts the performance of the SLU task.
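The graph construction described above (an intent node linked to every slot node, plus edges between neighboring slot nodes) can be sketched as an adjacency matrix. This is a minimal illustrative sketch, assuming a hub-and-rim wheel layout with self-loops; the paper's exact edge set and implementation are not shown here.

```python
import numpy as np

def wheel_graph_adjacency(n_tokens: int) -> np.ndarray:
    """Build the adjacency matrix of a wheel-style utterance graph.

    Node 0 is the intent (hub) node; nodes 1..n_tokens are slot
    (token) nodes. Hub-spoke edges connect the intent node to every
    slot node in both directions, and rim edges connect adjacent slot
    nodes, so utterance-level and local keyword information can flow
    between the two tasks. (This node layout and edge set are an
    assumption for illustration, not the paper's exact construction.)
    """
    n = n_tokens + 1
    adj = np.zeros((n, n), dtype=int)
    for i in range(1, n):
        adj[0, i] = adj[i, 0] = 1          # intent <-> slot spokes
    for i in range(1, n - 1):
        adj[i, i + 1] = adj[i + 1, i] = 1  # adjacent slot rim edges
    np.fill_diagonal(adj, 1)               # self-loops for attention
    return adj

# For a 4-token utterance: 1 intent node + 4 slot nodes.
adj = wheel_graph_adjacency(4)
print(adj)
```

A graph attention layer would then restrict each node's attention to the neighbors marked in this matrix, masking out all other positions.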