ODE (ordinary differential equation)
Artificial intelligence
Transformer (model)
Machine learning
Artificial neural network
Computer science
Language model
Training set
Longitudinal data
Natural language processing
Data mining
Engineering
Mathematics
Applied mathematics
Authors
Yiming Cheng, Hongxiang Hu, Xin Dong, Xiaoran Hao, Yan Li
Identifier
DOI: 10.1016/j.xphs.2024.02.008
Abstract
There remains a substantial need for a comprehensive assessment of natural language processing (NLP) algorithms in longitudinal pharmacokinetic/pharmacodynamic (PK/PD) modeling, despite recent advances in machine learning within quantitative pharmacology. We herein investigated the application of the transformer model and compared its performance against several other NLP models, including long short-term memory (LSTM) and neural-ODE (ordinary differential equation) networks, in analyzing longitudinal PK/PD data using virtual data containing three different regimens. Results suggested that LSTM and neural-ODE, along with their respective variants, performed strongly when predicting training-included (seen) regimens, albeit with slight information loss for training-excluded (unseen) regimens. Like neural-ODE, the transformer exhibited superior performance in describing time-series PK/PD data. Nonetheless, when extrapolating to unseen regimens, it outlined the general data trends but had difficulty precisely capturing data fluctuations. Remarkably, integrating a small amount of unseen data into the training dataset significantly bolstered predictive performance for both seen and unseen regimens. Our study marks a pioneering effort in deploying the transformer model for time-series PK/PD analysis and provides a systematic exploration of the currently available NLP models in this field.
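The abstract does not specify the architectures the authors used. As a rough illustration of the general approach it describes, the following is a minimal PyTorch sketch of a transformer encoder regressing drug concentration over time from a dosing covariate; the class name PKTransformer, the (time, dose) input features, and all dimensions are hypothetical, not taken from the paper.

```python
# Minimal sketch (not the authors' code): a transformer encoder for
# longitudinal PK prediction. Each time step carries (time, dose) as
# input features; time doubles as an implicit positional signal, so no
# separate positional encoding is added in this toy example.
import torch
import torch.nn as nn

class PKTransformer(nn.Module):
    def __init__(self, d_model=32, nhead=4, num_layers=2):
        super().__init__()
        self.input_proj = nn.Linear(2, d_model)   # (time, dose) -> embedding
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, 1)         # concentration per time step

    def forward(self, x):                         # x: (batch, seq_len, 2)
        h = self.encoder(self.input_proj(x))
        return self.head(h).squeeze(-1)           # (batch, seq_len)

# Toy usage: 8 virtual subjects observed at 24 time points under one regimen.
t = torch.linspace(0, 24, 24).repeat(8, 1)        # hours post-dose
dose = torch.full_like(t, 100.0)                  # hypothetical 100 mg regimen
x = torch.stack([t, dose], dim=-1)                # (8, 24, 2)
model = PKTransformer()
pred = model(x)                                   # (8, 24) predicted concentrations
print(pred.shape)
```

In this framing, extrapolating to an unseen regimen simply means feeding a dose column the model never saw during training, which mirrors the seen/unseen regimen comparison the abstract reports.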