An explainable knowledge distillation method with XGBoost for ICU mortality prediction

Authors
Mucan Liu, Chonghui Guo, Sijia Guo
Source
Journal: Computers in Biology and Medicine [Elsevier]
Volume: 152, Article 106466. Citations: 24
Identifier
DOI: 10.1016/j.compbiomed.2022.106466
Abstract

Mortality prediction is an important task in the intensive care unit (ICU) for quantifying the severity of a patient's physiological condition. Scoring systems are currently the most widely applied tools for mortality prediction, but their performance is unsatisfactory in many clinical settings because the underlying models are non-specific and linear. With the availability of large volumes of data recorded in electronic health records (EHRs), deep learning models have achieved state-of-the-art predictive performance. However, deep learning models struggle to meet the explainability requirements of clinical practice. Hence, an explainable Knowledge Distillation method with XGBoost (XGB-KD) is proposed to improve the predictive performance of XGBoost while supporting better explainability. In this method, we first use high-performing deep learning teacher models to learn the complex patterns hidden in high-dimensional multivariate time series data. We then distill knowledge from soft labels generated by the ensemble of teacher models to guide the training of an XGBoost student model, whose inputs are meaningful features obtained through feature engineering. Finally, we conduct model calibration so that the predicted probabilities reflect the true posterior probabilities, and we use SHapley Additive exPlanations (SHAP) to obtain insights into the trained model. We conduct comprehensive experiments on the MIMIC-III dataset to evaluate our method. The results demonstrate that our method achieves better predictive performance than vanilla XGBoost, deep learning models, and several state-of-the-art baselines from related works, and that it provides intuitive explanations. Our method is useful for improving the predictive performance of XGBoost by distilling knowledge from deep learning models, and it can provide meaningful explanations for predictions.
