Computer science
Graph
Theoretical computer science
Vertex (graph theory)
Autoencoder
Feature learning
Encoder
Artificial intelligence
Machine learning
Artificial neural network
Operating system
Authors
Juntao Zhang, Nanzhou Lin, Zhang Xue-long, Wei Song, Xiandi Yang, Zhiyong Peng
Identifiers
DOI: 10.1145/3488560.3498434
Abstract
Recently, the topic of learning concept prerequisite relations has gained the attention of many researchers; such relations are crucial for a learner deciding an optimal study order. However, existing work still ignores three key factors. (1) People's cognitive differences can affect how the prerequisite relation between resources (e.g., courses, textbooks) or concepts (e.g., binary tree) is annotated. (2) A vertex (a resource or a concept) can be affected by the features of its neighboring vertices in the resource or concept graph. (3) The feature information of the resource graph may affect the concept graph. To integrate the above factors, we propose an end-to-end graph network-based model called Multi-Head Attention Variational Graph Auto-Encoders (MHAVGAE) to learn the prerequisite relation between concepts via a resource-concept graph. To address the first two factors, we introduce a multi-head attention mechanism to compute the hidden representation of each vertex over the resource-concept graph. Then, we design a gated fusion mechanism that integrates the feature information of the resource and concept graphs to enrich concept content features. Finally, we conduct extensive experiments to demonstrate the effectiveness of the MHAVGAE across multiple widely used metrics compared with state-of-the-art methods. The experimental results show that MHAVGAE outperforms almost all of the baseline methods.
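The abstract names two building blocks: multi-head attention that computes each vertex's hidden representation from its neighbors over the graph, and a gated fusion that mixes resource-graph features into concept features. The sketch below is not the authors' implementation; it is a minimal PyTorch illustration of those two generic components, and every class name, layer size, and design detail here (GAT-style attention scoring, averaging of heads, a sigmoid gate) is an assumption for illustration only.

```python
# Minimal sketch (not the paper's code) of multi-head graph attention and
# gated feature fusion. All names and dimensions are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiHeadGraphAttention(nn.Module):
    """GAT-style multi-head attention: each vertex's hidden representation is
    a weighted combination of its neighbors' features; heads are averaged."""

    def __init__(self, in_dim: int, out_dim: int, num_heads: int = 4):
        super().__init__()
        self.num_heads = num_heads
        self.out_dim = out_dim
        self.proj = nn.Linear(in_dim, num_heads * out_dim, bias=False)
        # One attention vector per head, scoring concatenated [h_i || h_j].
        self.attn = nn.Parameter(torch.empty(num_heads, 2 * out_dim))
        nn.init.xavier_uniform_(self.attn)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (N, in_dim) vertex features; adj: (N, N) adjacency with self-loops.
        n = x.size(0)
        h = self.proj(x).view(n, self.num_heads, self.out_dim)   # (N, H, D)
        src = (h * self.attn[:, : self.out_dim]).sum(-1)         # (N, H)
        dst = (h * self.attn[:, self.out_dim :]).sum(-1)         # (N, H)
        e = F.leaky_relu(src.unsqueeze(1) + dst.unsqueeze(0))    # (N, N, H)
        e = e.masked_fill(adj.unsqueeze(-1) == 0, float("-inf"))
        alpha = torch.softmax(e, dim=1)            # normalize over neighbors j
        out = torch.einsum("ijh,jhd->ihd", alpha, h)  # aggregate neighbor feats
        return out.mean(dim=1)                        # average the heads


class GatedFusion(nn.Module):
    """Gated fusion: a learned sigmoid gate decides, per dimension, how much
    resource-graph feature to mix into the concept feature."""

    def __init__(self, dim: int):
        super().__init__()
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, concept_h: torch.Tensor, resource_h: torch.Tensor) -> torch.Tensor:
        g = torch.sigmoid(self.gate(torch.cat([concept_h, resource_h], dim=-1)))
        return g * concept_h + (1.0 - g) * resource_h
```

In the model described by the abstract, the fused concept representations would then feed a variational graph auto-encoder that scores prerequisite links between concepts; that stage is omitted from the sketch.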