Computer Science
Tracing
Artificial Intelligence
Machine Learning
Feature Learning
Human-Computer Interaction
Operating Systems
Authors
Liqing Qiu, Menglin Zhu, Jingcheng Zhou
Identifier
DOI: 10.1109/tlt.2023.3336240
Abstract
Knowledge tracing (KT) is essential in intelligent tutoring systems for tracking learners' knowledge states and predicting their future performance. Many prevailing KT methods prioritize modeling learners' behavioral patterns in acquiring knowledge and the relationships among interactions. However, due to the sparsity problem, they frequently struggle to uncover the latent contextual features embedded within learning sequences, which constrains their predictive performance. To address this concern, this article focuses on extracting latent features from learning sequences to enhance the assessment of knowledge states. We design optimized pretraining mechanisms and introduce an enhanced deep KT method, optimized pretraining deep KT (OPKT). In the pretraining phase, a self-supervised learning approach is employed to train comprehensive contextual encodings of the learning sequences. During fine-tuning, these contextual encodings are transferred to the downstream KT model, which then generates the knowledge states and makes predictions. Experiments demonstrate the superiority of our method over six existing KT models on five publicly available datasets. Furthermore, extensive ablation studies and visualized analysis validate the rationality and effectiveness of every component of the OPKT architecture.
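The two-phase pipeline the abstract describes (self-supervised pretraining of contextual sequence encodings, then transfer to a downstream KT predictor) can be sketched schematically. This is not the paper's OPKT architecture: the encoder, the masked-interaction objective, the skill count, and the prediction head below are all illustrative assumptions standing in for the trained components.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup (not from the paper): NUM_SKILLS skills; an
# interaction (skill_id, correct) is encoded as skill_id + correct * NUM_SKILLS.
NUM_SKILLS, DIM = 50, 16
embed = rng.normal(0, 0.1, size=(2 * NUM_SKILLS, DIM))  # interaction embeddings

def encode(seq):
    """Contextual encoding of a learning sequence. A causal running mean
    over embeddings stands in for the pretrained self-supervised encoder."""
    vecs = embed[seq]                                    # (T, DIM)
    return np.cumsum(vecs, axis=0) / np.arange(1, len(seq) + 1)[:, None]

def pretrain_loss(seq, mask_prob=0.15):
    """Masked-interaction reconstruction, a common self-supervised proxy:
    predict the embedding of masked positions from the context."""
    masked = rng.random(len(seq)) < mask_prob
    if not masked.any():
        masked[0] = True                                 # mask at least one step
    diff = encode(seq)[masked] - embed[seq][masked]
    return float((diff ** 2).mean())

def finetune_predict(seq, skill):
    """Downstream KT head: probability of answering `skill` correctly next,
    computed from the transferred contextual encoding (the knowledge state)."""
    state = encode(seq)[-1]
    logit = state @ (embed[skill + NUM_SKILLS] - embed[skill])
    return 1.0 / (1.0 + np.exp(-logit))

# Toy history: skill 3 wrong, skill 3 right, skill 7 wrong, skill 12 right.
seq = np.array([3, 3 + NUM_SKILLS, 7, 12 + NUM_SKILLS])
p = finetune_predict(seq, skill=7)
```

In the actual method, `encode` would be a learned network whose weights are fit by minimizing the pretraining objective and then transferred (and fine-tuned) for the prediction step, rather than a fixed running mean.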