Computer science
Time series
Bayesian probability
Machine learning
Series (stratigraphy)
Artificial intelligence
Data mining
Biology
Paleontology
Authors
Zhen Cao, Jeanette Poh Wen Jun, Yang Guo, Chandan Gautam, Mila Nambiar, Sing Yi Chia, Nur Nasyitah Mohamed Salim, Lee Sheldon, Hong Choon Oh, Yong Mong Bee, Pavitra Krishnaswamy, Savitha Ramasamy
Identifiers
DOI:10.1109/jbhi.2025.3598718
Abstract
Deep learning models are increasingly used for making predictions based on clinical time series data, but model generalization remains a challenge. Continual learning approaches, which preserve representations while learning new distributions, are suitable for addressing this challenge. We propose Continual Bayesian Long Short Term Memory (C-BLSTM), a continual learning algorithm based on the Bayesian LSTM model for domain incremental learning. C-BLSTM continually learns a sequence of tasks by combining architectural pruning, variational inference-based regularization, and coreset replay strategies. In extensive experiments on two public electronic medical record datasets for mortality prediction, we show that C-BLSTM outperforms many state-of-the-art continual learning approaches. Further, we apply the C-BLSTM to two real-world clinical time series datasets for prediction of readmission risk in patients with heart failure and glycated haemoglobin outcomes in patients with type 2 diabetes. First, we show that these datasets exhibit domain incremental characteristics with significant drifts in their marginal distributions and moderate drifts in their conditional distributions. Then, we demonstrate that the C-BLSTM improves generalization in five diverse real-world scenarios spanning temporal, site, device, case mix, and ethnicity shifts, both in terms of performance and reliability of predictions.
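The abstract names coreset replay as one ingredient of C-BLSTM but gives no implementation details. Below is a minimal, hypothetical sketch of the coreset-replay idea in plain Python, assuming random coreset selection and a fixed number of replayed examples per batch; the function and parameter names (`build_coreset`, `continual_training_batches`, `replay_per_batch`) are illustrative inventions, not the paper's API, and the sketch omits the pruning and variational-inference components entirely.

```python
import random

def build_coreset(task_examples, coreset_size, rng):
    # Randomly select a small, fixed-size coreset from one task's examples.
    # (The paper does not specify its selection rule; random sampling is a
    # common baseline choice.)
    return rng.sample(task_examples, min(coreset_size, len(task_examples)))

def continual_training_batches(task_streams, coreset_size=2,
                               replay_per_batch=1, seed=0):
    """Yield (task_id, batch) pairs for a sequence of tasks, mixing a few
    replayed coreset examples from previously seen tasks into each batch,
    so earlier distributions keep contributing to the training signal."""
    rng = random.Random(seed)
    coreset = []  # replay buffer accumulated across tasks
    for task_id, batches in enumerate(task_streams):
        for batch in batches:  # each batch is a list of training examples
            replay = rng.sample(coreset, min(replay_per_batch, len(coreset)))
            yield task_id, batch + replay
        # After finishing a task, retain a small coreset of its examples.
        all_examples = [x for b in batches for x in b]
        coreset.extend(build_coreset(all_examples, coreset_size, rng))
```

In a domain-incremental setting like the one described, each `task_stream` would correspond to one data distribution (e.g. one site or one time period), and batches from later tasks are augmented with stored examples from earlier ones.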