Large-scale chemical language representations capture molecular structure and properties

Keywords: Computer science, Chemical space, Artificial intelligence, Language models, Machine learning, Transformers, Molecular graphs, Feature learning, Encoders, Natural language processing, Natural language, Graphs, Drug discovery, Theoretical computer science, Chemistry, Operating systems, Physics, Quantum mechanics, Voltage, Biochemistry
Authors
Jerret Ross, Brian Belgodere, Vijil Chenthamarakshan, Inkit Padhi, Youssef Mroueh, Payel Das
Source
Journal: Nature Machine Intelligence [Nature Portfolio]
Volume/Issue: 4 (12): 1256-1264 · Citations: 211
Identifier
DOI: 10.1038/s42256-022-00580-7
Abstract

Models based on machine learning can enable accurate and fast molecular property predictions, which is of interest in drug discovery and material design. Various supervised machine learning models have demonstrated promising performance, but the vast chemical space and the limited availability of property labels make supervised learning challenging. Recently, unsupervised transformer-based language models pretrained on a large unlabelled corpus have produced state-of-the-art results in many downstream natural language processing tasks. Inspired by this development, we present molecular embeddings obtained by training an efficient transformer encoder model, MoLFormer, which uses rotary positional embeddings. This model employs a linear attention mechanism, coupled with highly distributed training, on SMILES sequences of 1.1 billion unlabelled molecules from the PubChem and ZINC datasets. We show that the learned molecular representation outperforms existing baselines, including supervised and self-supervised graph neural networks and language models, on several downstream tasks from ten benchmark datasets, while performing competitively on two others. Further analyses, specifically through the lens of attention, demonstrate that MoLFormer trained on chemical SMILES indeed learns the spatial relationships between atoms within a molecule. These results provide encouraging evidence that large-scale molecular language models can capture sufficient chemical and structural information to predict various distinct molecular properties, including quantum-chemical properties.

Large language models have recently emerged with extraordinary capabilities, and these methods can be applied to model other kinds of sequence, such as string representations of molecules. Ross and colleagues have created a transformer-based model, trained on a large dataset of molecules, which provides good results on property prediction tasks.
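The abstract mentions that MoLFormer applies rotary positional embeddings (RoPE) to SMILES token sequences. The sketch below illustrates the general RoPE idea in NumPy: each pair of embedding dimensions is rotated by a position-dependent angle, so that dot products between rotated vectors depend on relative position. This is a minimal illustration of the technique, not MoLFormer's implementation; the character-level SMILES tokenization, embedding size, and the half-split pairing variant shown here are all simplifying assumptions.

```python
import numpy as np

def rotary_embed(x, base=10000.0):
    """Apply rotary positional embeddings to token embeddings.

    x: array of shape (seq_len, dim), dim must be even.
    Dimension i of the first half is paired with dimension i of the
    second half, and each pair is rotated by an angle proportional to
    the token position (the "rotate-half" RoPE variant).
    """
    seq_len, dim = x.shape
    half = dim // 2
    # Per-pair rotation frequencies: fast rotation for low dims, slow for high
    freqs = base ** (-2.0 * np.arange(half) / dim)
    angles = np.outer(np.arange(seq_len), freqs)   # (seq_len, half)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, :half], x[:, half:]
    return np.concatenate([x1 * cos - x2 * sin,
                           x1 * sin + x2 * cos], axis=-1)

# Toy character-level SMILES tokenization (illustrative only):
smiles = "CCO"  # ethanol
vocab = {ch: i for i, ch in enumerate(sorted(set(smiles)))}
rng = np.random.default_rng(0)
embed_table = rng.standard_normal((len(vocab), 8))
tokens = np.stack([embed_table[vocab[ch]] for ch in smiles])

rotated = rotary_embed(tokens)
print(rotated.shape)  # (3, 8)
```

Because each pair is rotated (an orthogonal transform), RoPE preserves the norm of every token embedding, and the token at position 0 is left unchanged; only the relative phase between positions carries positional information into the attention scores.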