An explainable knowledge distillation method with XGBoost for ICU mortality prediction

Computer science · Artificial intelligence · Feature engineering · Machine learning · Deep learning · Predictive modeling · Multivariate statistics · Task (project management) · Feature (linguistics) · Data mining · Linguistics · Philosophy · Management · Economics
Authors
Mucan Liu, Chonghui Guo, Sijia Guo
Source
Journal: Computers in Biology and Medicine [Elsevier]
Volume 152, Article 106466 · Cited by: 24
Identifier
DOI: 10.1016/j.compbiomed.2022.106466
Abstract

Mortality prediction is an important task in the intensive care unit (ICU) for quantifying the severity of a patient's physiological condition. Scoring systems are currently the most widely used approach, but their performance is unsatisfactory in many clinical settings because the underlying models are non-specific and linear. With the growing availability of large volumes of data recorded in electronic health records (EHRs), deep learning models have achieved state-of-the-art predictive performance. However, deep learning models rarely meet the explainability requirements of clinical practice. Hence, an explainable Knowledge Distillation method with XGBoost (XGB-KD) is proposed to improve the predictive performance of XGBoost while supporting better explainability. In this method, we first use high-performing deep learning teacher models to learn the complex patterns hidden in high-dimensional multivariate time-series data. We then distill knowledge from the soft labels generated by the ensemble of teacher models to guide the training of the XGBoost student model, whose inputs are meaningful features obtained through feature engineering. Finally, we calibrate the model so that its predicted probabilities reflect the true posterior probabilities, and use SHapley Additive exPlanations (SHAP) to gain insights into the trained model. We conduct comprehensive experiments on the MIMIC-III dataset to evaluate our method. The results demonstrate that our method achieves better predictive performance than vanilla XGBoost, deep learning models, and several state-of-the-art baselines from related work. Our method also provides intuitive explanations. In summary, our method improves the predictive performance of XGBoost by distilling knowledge from deep learning models while producing meaningful explanations for its predictions.
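The teacher–student pipeline the abstract describes can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' implementation: a scikit-learn MLP stands in for the deep teacher, a GradientBoostingRegressor stands in for XGBoost, and the soft/hard blending weight `lam` is a hypothetical hyperparameter (the paper's actual loss formulation may differ).

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.ensemble import GradientBoostingRegressor  # stand-in for XGBoost
from sklearn.metrics import roc_auc_score

# Synthetic binary-outcome data standing in for engineered ICU features.
X, y = make_classification(n_samples=2000, n_features=20, n_informative=10,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# 1. Teacher: a neural network that learns complex patterns in the data.
teacher = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500,
                        random_state=0)
teacher.fit(X_tr, y_tr)
soft_labels = teacher.predict_proba(X_tr)[:, 1]  # teacher's soft labels

# 2. Student: boosted trees regressed onto a blend of soft and hard labels,
#    so the student inherits the teacher's "dark knowledge".
lam = 0.5  # blending weight between soft and hard targets (illustrative)
targets = lam * soft_labels + (1 - lam) * y_tr
student = GradientBoostingRegressor(random_state=0)
student.fit(X_tr, targets)

# 3. Student predictions, clipped to [0, 1] to act as probabilities.
pred = np.clip(student.predict(X_te), 0.0, 1.0)
print(f"student AUC: {roc_auc_score(y_te, pred):.3f}")
```

In a full XGB-KD-style workflow, this step would be followed by probability calibration (e.g. Platt scaling or isotonic regression) and a SHAP analysis of the trained student to attribute predictions to individual features.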
