Prediction of Chromatographic Retention Time of a Small Molecule from SMILES Representation Using a Hybrid Transformer-LSTM Model

Keywords: Retention time, Transformer, Representation, Chromatography, Computer science, Artificial intelligence, Chemistry, Engineering
Authors
Sargol Mazraedoost, Hadi Sedigh Malekroodi, Petar Žuvela, Myunggi Yi, Jay Liu
Source
Journal: Journal of Chemical Information and Modeling [American Chemical Society]
Cited by: 2
Identifier
DOI:10.1021/acs.jcim.5c00167
Abstract

Accurate retention time (RT) prediction in liquid chromatography remains a significant challenge in molecular analysis. In this study, we explore the use of a transformer-based language model to predict RTs by treating simplified molecular input line entry system (SMILES) sequences as textual input, an approach that has not been previously utilized in this field. Our architecture combines a pretrained RoBERTa (robustly optimized BERT approach, a variant of BERT) with bidirectional long short-term memory (BiLSTM) networks to predict retention times in reversed-phase high-performance liquid chromatography (RP-HPLC). The METLIN small molecule retention time (SMRT) data set, comprising 77,980 small molecules after preprocessing, was encoded using SMILES notation and processed through a tokenizer to enable molecular representation as sequential data. The proposed transformer-LSTM architecture incorporates layer fusion from multiple transformer layers and bidirectional sequence processing, achieving superior performance compared to existing methods with a mean absolute error (MAE) of 26.23 s, a mean absolute percentage error (MAPE) of 3.25%, and an R-squared (R²) value of 0.91. The model's explainability was demonstrated through attention visualization, revealing its focus on key molecular features that can influence RT. Furthermore, we evaluated the model's transfer learning capabilities across ten data sets from the PredRet database, demonstrating robust performance across different chromatographic conditions with consistent improvement over previous approaches. Our results suggest that the hybrid model presents a valuable approach for predicting RT in liquid chromatography, with potential applications in metabolomics and small molecule analysis.
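The abstract describes feeding SMILES strings to a language model as sequential text. The paper uses a pretrained RoBERTa tokenizer, which is not reproduced here; the sketch below is only an illustrative stand-in showing the general idea of splitting SMILES into chemically meaningful tokens, using a regex pattern common in the SMILES-as-text literature. The pattern and function name are this sketch's own, not the authors'.

```python
import re

# Illustrative regex-based SMILES tokenizer (an assumption, not the paper's
# actual RoBERTa tokenizer). Multi-character atoms (Br, Cl, Si, Se) and
# bracketed atoms must be tried before single characters.
SMILES_PATTERN = re.compile(
    r"(\[[^\]]+\]|Br|Cl|Si|Se|se|@@|@|%\d{2}|\d|[A-Za-z()=#+\-\\/.])"
)

def tokenize_smiles(smiles: str) -> list[str]:
    """Split a SMILES string into tokens suitable for a sequence model."""
    tokens = SMILES_PATTERN.findall(smiles)
    # Sanity check: the tokens must reassemble the original string.
    assert "".join(tokens) == smiles, "tokenizer failed to cover the string"
    return tokens

# Aspirin as an example molecule
print(tokenize_smiles("CC(=O)Oc1ccccc1C(=O)O"))
```

A pretrained tokenizer additionally maps each token to a vocabulary index and adds special tokens, but the segmentation step above is the core of treating SMILES as text.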
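The three evaluation metrics reported in the abstract (MAE, MAPE, R²) have standard definitions; a minimal sketch of how they are computed, using invented toy retention times rather than any data from the paper:

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """Compute MAE (same units as y), MAPE (%), and R-squared."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mae = np.mean(np.abs(y_true - y_pred))
    mape = np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0
    ss_res = np.sum((y_true - y_pred) ** 2)          # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)   # total sum of squares
    r2 = 1.0 - ss_res / ss_tot
    return mae, mape, r2

# Toy retention times in seconds (illustrative only, not from the SMRT data set)
rt_true = [600.0, 720.0, 810.0, 950.0]
rt_pred = [590.0, 740.0, 800.0, 960.0]
mae, mape, r2 = regression_metrics(rt_true, rt_pred)
print(f"MAE={mae:.2f} s  MAPE={mape:.2f}%  R2={r2:.3f}")
```

MAE is in the same units as the target (seconds here, matching the paper's 26.23 s), MAPE normalizes errors by the true value, and R² measures variance explained relative to a mean-only baseline.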