Chemistry
Pharmaceuticals
Computational biology
Stereochemistry
Pharmacology
Medicine
Biology
Authors
Feisheng Zhong, Rongcai Yue, Jin-xing Chen, Dingyan Wang, Schork Ma, Shiming Chen
Identifier
DOI:10.1021/acs.jmedchem.5c00271
Abstract
In the post-GPT era, Llama-Gram represents a promising advance in AI-driven chemical drug discovery, grounded in the chemical principle that molecular structure determines properties. This folding-based, end-to-end framework seeks to address the hallucination issues of traditional large language models by integrating protein folding embeddings, graph-based molecular representations, and uncertainty estimation to better capture the structural complexity of protein–ligand interactions. By leveraging the frozen-gradient ESMFold model and a Graph Transformer variant, Llama-Gram aims to enhance predictive accuracy and reliability through grouped-query attention and a Gram layer inspired by support points theory. By incorporating protein folding information, the model demonstrates competitive performance against state-of-the-art approaches such as TransformerCPI 2.0 and GraphDTA, offering improvements in compound–target interaction prediction. Llama-Gram thus provides a scalable and innovative framework that could help accelerate the chemical drug discovery process.
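To make the described architecture concrete, the minimal PyTorch sketch below illustrates the two ingredients named in the abstract: grouped-query attention fusing ligand features with protein embeddings, and a Gram-style readout against learned reference points that also yields a rough uncertainty signal. All module names, dimensions, and the entropy-based uncertainty proxy are assumptions for illustration, not the authors' implementation; the random tensors stand in for frozen ESMFold residue embeddings and Graph Transformer atom embeddings.

```python
# Illustrative sketch only: grouped-query attention over (assumed) frozen
# protein-folding embeddings and graph-derived ligand embeddings, followed by
# a Gram-style similarity readout. Not the Llama-Gram authors' code.
import torch
import torch.nn as nn


class GroupedQueryAttention(nn.Module):
    """Grouped-query attention: several query heads share one key/value head."""

    def __init__(self, dim: int, n_q_heads: int = 8, n_kv_heads: int = 2):
        super().__init__()
        assert n_q_heads % n_kv_heads == 0
        self.head_dim = dim // n_q_heads
        self.n_q, self.n_kv = n_q_heads, n_kv_heads
        self.q_proj = nn.Linear(dim, n_q_heads * self.head_dim)
        self.kv_proj = nn.Linear(dim, 2 * n_kv_heads * self.head_dim)
        self.out_proj = nn.Linear(n_q_heads * self.head_dim, dim)

    def forward(self, ligand: torch.Tensor, protein: torch.Tensor) -> torch.Tensor:
        B, Lq, _ = ligand.shape
        Lk = protein.shape[1]
        q = self.q_proj(ligand).view(B, Lq, self.n_q, self.head_dim).transpose(1, 2)
        k, v = self.kv_proj(protein).chunk(2, dim=-1)
        k = k.view(B, Lk, self.n_kv, self.head_dim).transpose(1, 2)
        v = v.view(B, Lk, self.n_kv, self.head_dim).transpose(1, 2)
        # Repeat the shared key/value heads so each query-head group can attend to them.
        rep = self.n_q // self.n_kv
        k = k.repeat_interleave(rep, dim=1)
        v = v.repeat_interleave(rep, dim=1)
        attn = torch.softmax(q @ k.transpose(-2, -1) / self.head_dim**0.5, dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(B, Lq, -1)
        return self.out_proj(out)


class GramReadout(nn.Module):
    """Assumed Gram-style head: compares pooled features against learned
    reference ("support") points and maps the similarity vector to an
    interaction probability plus a crude uncertainty proxy."""

    def __init__(self, dim: int, n_support: int = 16):
        super().__init__()
        self.support = nn.Parameter(torch.randn(n_support, dim))
        self.cls = nn.Linear(n_support, 2)  # logits: no interaction vs. interaction

    def forward(self, pooled: torch.Tensor):
        gram = pooled @ self.support.t()                # (B, n_support) similarities
        probs = torch.softmax(self.cls(gram), dim=-1)
        entropy = -(probs * probs.clamp_min(1e-9).log()).sum(-1)  # uncertainty proxy
        return probs[:, 1], entropy


if __name__ == "__main__":
    dim = 64
    protein_emb = torch.randn(2, 300, dim)  # stand-in for frozen ESMFold residue embeddings
    ligand_emb = torch.randn(2, 40, dim)    # stand-in for Graph Transformer atom embeddings
    fused = GroupedQueryAttention(dim)(ligand_emb, protein_emb)
    p_interact, uncertainty = GramReadout(dim)(fused.mean(dim=1))
    print(p_interact.shape, uncertainty.shape)  # torch.Size([2]) torch.Size([2])
```

The design choice sketched here, fewer key/value heads than query heads, is what grouped-query attention trades for memory and speed; the Gram readout is one plausible way a support-points-inspired layer could expose both a prediction and a confidence signal.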