Large-scale chemical language representations capture molecular structure and properties

Computer science · Chemical space · Artificial intelligence · Language model · Machine learning · Transformer · Molecular graph · Feature learning · Encoder · Natural language processing · Natural language · Graph · Drug discovery · Theoretical computer science · Chemistry · Operating system · Physics · Quantum mechanics · Voltage · Biochemistry
Authors
Jerret Ross, Brian Belgodere, Vijil Chenthamarakshan, Inkit Padhi, Youssef Mroueh, Payel Das
Source
Journal: Nature Machine Intelligence [Springer Nature]
Volume/Issue: 4 (12): 1256-1264  Citations: 332
Identifiers
DOI: 10.1038/s42256-022-00580-7
Abstract

Models based on machine learning can enable accurate and fast molecular property predictions, which is of interest in drug discovery and material design. Various supervised machine learning models have demonstrated promising performance, but the vast chemical space and the limited availability of property labels make supervised learning challenging. Recently, unsupervised transformer-based language models pretrained on a large unlabelled corpus have produced state-of-the-art results in many downstream natural language processing tasks. Inspired by this development, we present molecular embeddings obtained by training an efficient transformer encoder model, MoLFormer, which uses rotary positional embeddings. This model employs a linear attention mechanism, coupled with highly distributed training, on SMILES sequences of 1.1 billion unlabelled molecules from the PubChem and ZINC datasets. We show that the learned molecular representation outperforms existing baselines, including supervised and self-supervised graph neural networks and language models, on several downstream tasks from ten benchmark datasets, and performs competitively on two others. Further analyses, specifically through the lens of attention, demonstrate that MoLFormer trained on chemical SMILES indeed learns the spatial relationships between atoms within a molecule. These results provide encouraging evidence that large-scale molecular language models can capture sufficient chemical and structural information to predict various distinct molecular properties, including quantum-chemical properties.

Large language models have recently emerged with extraordinary capabilities, and these methods can be applied to model other kinds of sequence, such as string representations of molecules. Ross and colleagues have created a transformer-based model, trained on a large dataset of molecules, which provides good results on property prediction tasks.
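To illustrate the rotary positional embedding idea mentioned in the abstract, the minimal NumPy sketch below rotates pairs of channels in SMILES token embeddings by a position-dependent angle before computing attention scores. The function name rotary_embed, the toy vocabulary and the random embedding table are illustrative assumptions; this is not the authors' MoLFormer code, and the linear attention mechanism used in the paper is not shown.

import numpy as np

def rotary_embed(x, base=10000.0):
    # x: (seq_len, dim) token embeddings; channels i and i + dim//2 are
    # rotated together as a 2-D pair by an angle that grows with position.
    seq_len, dim = x.shape
    half = dim // 2
    inv_freq = base ** (-np.arange(half) / half)          # per-pair frequencies
    angles = np.outer(np.arange(seq_len), inv_freq)       # (seq_len, half)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, :half], x[:, half:]
    return np.concatenate([x1 * cos - x2 * sin,
                           x1 * sin + x2 * cos], axis=-1)

# Toy usage: tokenize a short SMILES string character by character and
# rotate query/key vectors before taking dot products.
smiles = "CCO"                                            # ethanol
vocab = {ch: i for i, ch in enumerate(sorted(set(smiles)))}
dim = 8
rng = np.random.default_rng(0)
embedding_table = rng.normal(size=(len(vocab), dim))      # hypothetical embeddings
tokens = np.array([vocab[ch] for ch in smiles])
q = rotary_embed(embedding_table[tokens])
k = rotary_embed(embedding_table[tokens])
attn_scores = q @ k.T                                     # position-aware similarities
print(attn_scores.shape)                                  # (3, 3)

Because the rotation is applied to both queries and keys, the resulting dot products depend only on the relative distance between token positions, which is the usual motivation for rotary embeddings in transformer encoders of this kind.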