Topics
Dual (grammatical number)
Base (topology)
Computer science
Artificial intelligence
Machine learning
Mathematics
Art
Mathematical analysis
Literature
Authors
Jia Mai, Wei Fan, Wei He, Hai Huang, Hailong Zhu
Source
Journal: Electronics
[Multidisciplinary Digital Publishing Institute]
Date: 2024-11-06
Volume/Issue: 13(22): 4358
Identifier
DOI:10.3390/electronics13224358
Abstract
Explainable artificial intelligence (XAI) is crucial in education for making educational technologies more transparent and trustworthy. In the domain of student performance prediction, both the results and the processes need to be recognized by experts, making the requirement for explainability very high. The belief rule base (BRB) is a hybrid-driven method for modeling complex systems that integrates expert knowledge with transparent reasoning processes, thus providing good explainability. However, class imbalances in student grades often lead models to ignore minority samples, resulting in inaccurate assessments. Additionally, BRB models face the challenge of losing explainability during the optimization process. Therefore, an explainable student performance prediction method based on dual-level progressive classification BRB (DLBRB-i) has been proposed. Principal component regression (PCR) is used to select key features, and models are constructed based on selected metrics. The BRB’s first layer classifies data broadly, while the second layer refines these classifications for accuracy. By incorporating explainability constraints into the population-based covariance matrix adaptation evolution strategy (P-CMA-ES) optimization process, the explainability of the model is ensured effectively. Finally, empirical analysis using real datasets validates the diagnostic accuracy and explainability of the DLBRB-i model.
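The abstract states that principal component regression (PCR) is used to select key features before the BRB layers are built. The following is a minimal, illustrative sketch of that step only, not the authors' DLBRB-i implementation: it fits a regression on the top principal components of synthetic data, maps the component coefficients back to the original feature space, and ranks features by coefficient magnitude. The data, the number of retained components, and the ranking criterion are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for student-performance data: 100 students, 6 numeric
# features (hypothetical metrics such as attendance or quiz averages).
X = rng.normal(size=(100, 6))
true_beta = np.array([2.0, 0.0, 1.5, 0.0, 0.0, 0.5])
y = X @ true_beta + rng.normal(scale=0.1, size=100)

# --- Principal component regression (PCR) ---
Xc = X - X.mean(axis=0)                      # center the features
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 3                                        # retained components (assumed)
Z = Xc @ Vt[:k].T                            # scores on the top-k components
beta_z, *_ = np.linalg.lstsq(Z, y - y.mean(), rcond=None)

# Map component-space coefficients back to the original feature space,
# then rank features by the magnitude of their effective coefficient.
beta_x = Vt[:k].T @ beta_z
ranking = np.argsort(-np.abs(beta_x))
print("feature ranking (most to least influential):", ranking)
```

In a pipeline like the one the abstract describes, the top-ranked features would then serve as the antecedent attributes of the first-layer BRB, with the second layer refining the coarse classification.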