Chemistry
Generative grammar
Artificial intelligence
Natural language processing
Computer science
Authors
Jieyu Lü, Zhengshuai Song, Qiyuan Zhao, Yuanqi Du, Yirui Cao, Haojun Jia, Chenru Duan
Abstract
Large language models (LLMs) have shown promise in science, such as structure-property prediction and acting as AI agents, yet their intrinsic knowledge and reasoning capability for scientific discovery remain underexplored. We introduce LLM-EO, an integration of LLMs into evolutionary optimization, and demonstrate its success in optimizing transition metal complexes (TMCs). LLM-EO excels at few-sample learning because it draws on the intrinsic chemical knowledge embedded within LLMs and can leverage the entire history of data collected during optimization. Through natural-language instructions, LLM-EO makes multiobjective optimization more accessible, potentially lowering barriers for experimental chemists without extensive programming expertise. As a generative approach, LLM-EO can propose novel ligands and TMCs with unique chemical properties by amalgamating internal knowledge and external chemistry data, thus combining the benefits of efficient optimization and generation. With advances in LLMs, both as pretrained foundation models and through new post-training and inference strategies, we anticipate broad applications of LLM-EO in chemistry and materials design.
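The abstract describes LLM-EO as placing an LLM inside an evolutionary optimization loop, where the model sees the entire optimization history and proposes new candidates. A minimal sketch of such a loop is shown below; the LLM call is mocked by a placeholder `llm_propose` function and the TMC property evaluation is replaced by a toy numeric objective — all names and the objective are illustrative assumptions, not the paper's actual implementation:

```python
def llm_propose(history):
    """Placeholder for the LLM call: in an LLM-EO-style loop, the model
    would receive the full history of (candidate, fitness) pairs plus
    natural-language instructions, and reply with new candidates.
    Mocked here as deterministic neighborhood moves around the best
    candidate found so far."""
    best, _ = max(history, key=lambda pair: pair[1])
    return [best - 1, best + 1]

def fitness(x):
    """Toy single-objective stand-in for evaluating a TMC property;
    maximized at x == 7."""
    return -(x - 7) ** 2

def llm_eo(seed_pool, generations=20):
    """Minimal LLM-EO-style optimization: unlike a classic genetic
    algorithm, the proposer is conditioned on the *entire* history,
    not just the current population."""
    history = [(x, fitness(x)) for x in seed_pool]
    for _ in range(generations):
        for cand in llm_propose(history):
            history.append((cand, fitness(cand)))
    return max(history, key=lambda pair: pair[1])

best, score = llm_eo([0, 20])
```

In a real setting, `llm_propose` would serialize the history into a prompt, and the natural-language instructions mentioned in the abstract would encode the (possibly multiobjective) optimization target.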