BatGPT-Chem: A Foundation Large Model For Chemical Engineering

LLMs have showcased remarkable capabilities in the realm of AI for Science (AI4Sci), and chemistry has greatly benefited from the advancement of AI tools. With a strong capacity for learning sequential data such as natural language, LLMs offer immense potential. Notably, common representations in chemistry, such as SMILES, also take the form of sequences. Hence, we propose leveraging LLMs to comprehensively model both chemical sequences and natural language sequences, aiming to tackle diverse chemical tasks. To fulfill this objective, we introduce BatGPT-Chem, a foundational large-scale model with 15B parameters tailored for chemical engineering. First, we unify diverse tasks in chemistry by modeling them through a combination of natural language and SMILES. Next, leveraging this unified modeling approach, we craft prompt templates and generate instruction-tuning data from a substantial volume of chemical data. Subsequently, we train BatGPT-15B on over one hundred million instances of instruction-tuning data, empowering it to address tasks such as Molecule Description, Molecule Design, Retro-synthesis Prediction, Product Inference, and Yield Prediction. We release our trial platform at https://www.batgpt.net/dapp/chem.
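The record does not reproduce the paper's actual prompt templates, so the following minimal Python sketch is an illustration only: one plausible way an instruction-tuning instance could be assembled by pairing a natural-language template with SMILES strings for a retro-synthesis prediction task. The template wording, the function name make_retro_example, and the aspirin example are assumptions, not taken from BatGPT-Chem.

# Hypothetical sketch (not from the paper): build a (prompt, target) pair
# that combines a natural-language instruction with SMILES strings for
# retro-synthesis prediction.

RETRO_TEMPLATE = (
    "Instruction: Propose a set of reactant molecules that could be used "
    "to synthesize the following product.\n"
    "Product (SMILES): {product}\n"
    "Answer:"
)

def make_retro_example(product_smiles, reactant_smiles_list):
    """Turn one reaction record into a (prompt, target) training pair."""
    prompt = RETRO_TEMPLATE.format(product=product_smiles)
    # Reactants are joined with the SMILES dot notation for mixtures.
    target = " . ".join(reactant_smiles_list)
    return {"prompt": prompt, "target": target}

# Example: aspirin from salicylic acid and acetic anhydride.
example = make_retro_example(
    "CC(=O)Oc1ccccc1C(=O)O",
    ["O=C(O)c1ccccc1O", "CC(=O)OC(C)=O"],
)
print(example["prompt"])
print(example["target"])

In such a scheme, each chemical task (description, design, retro-synthesis, product inference, yield prediction) would get its own template, and the resulting pairs would form the instruction-tuning corpus described in the abstract.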

Media type:

Preprint

Year of publication:

2024

Published:

2024

Contained in:

chemRxiv.org (2024), 17 Apr. 2024

Language:

English

Contributors:

Yang, Yifei [Author]
Shi, Runhan [Author]
Li, Zuchao [Author]
Jiang, Shu [Author]
Yang, Yang [Author]
Lu, Bao-Liang [Author]
Zhao, Hai [Author]

Links:

Full text [free of charge]

Subjects:

540 (Chemistry)

DOI:

10.26434/chemrxiv-2024-1p4xt

Funding institution / project title:
PPN (catalog ID):

XCH043304273