Artistic representation of a computer server transformed by MOFs. Image: Kevin Jablonka (EPFL).

The technology behind the predictive text used by smartphones, which is also at the core of many artificial intelligence (AI) applications, is called a transformer, a deep-learning algorithm that detects patterns in datasets.

Now, researchers at the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland and the Korea Advanced Institute of Science and Technology (KAIST) have created a transformer for metal-organic frameworks (MOFs), a class of porous crystalline materials. By simply combining organic linkers with metal nodes, chemists can synthesize millions of different MOFs with potential applications in energy storage and gas separation. The researchers report the new transformer in a paper in Nature Machine Intelligence.

The ‘MOFTransformer’ is designed to be a kind of ChatGPT for researchers who study MOFs. Its architecture is based on the transformer, an AI model first developed at Google Brain that can process natural language and forms the core of popular language models such as GPT-3, a predecessor of ChatGPT. The central idea behind these models is that they are pre-trained on a large amount of text, so when we start typing on a smartphone, for example, models like this can predict the most likely next word.

“We wanted to explore this idea for MOFs, but instead of giving a word suggestion, we wanted to have it suggest a property,” says Berend Smit, who led the EPFL side of the project. “We pre-trained the MOFTransformer with a million hypothetical MOFs to learn their essential characteristics, which we represented as a sentence. The model was then trained to complete these sentences to give the MOF's correct characteristics.”

The researchers then fine-tuned the MOFTransformer for properties related to hydrogen storage, including the storage capacity of hydrogen, its diffusion coefficient and the band gap of the MOF (an ‘energy barrier’ that determines how easily electrons can move through a material).

This showed that the MOFTransformer could obtain results using far less data than conventional machine-learning methods. “Because of the pre-training, the MOFTransformer already knows many of the general properties of MOFs; and because of this knowledge, we need less data to train for another property,” says Smit. Moreover, the same model could be used for all properties, unlike in conventional machine learning, where a separate model must be developed for each application.
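The transfer-learning idea described above, pre-train a shared encoder on a data-rich task, then fine-tune a small task-specific head on far fewer examples, can be illustrated with a toy sketch. This is not the actual MOFTransformer (which uses a transformer architecture and MOF-specific inputs); it is a minimal NumPy stand-in with synthetic data, where a tiny two-layer network plays the role of the pre-trained model:

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(X, W1, w2):
    """Two-layer net: a shared 'encoder' layer plus a linear property head."""
    H = np.tanh(X @ W1)   # shared encoder representation
    return H, H @ w2      # one predicted property value per row

def mse(pred, y):
    return float(np.mean((pred - y) ** 2))

def train(X, y, W1, w2, lr=0.1, steps=500, freeze_encoder=False):
    """Plain gradient descent on mean-squared error.

    With freeze_encoder=True only the head w2 is updated, mimicking
    fine-tuning on top of a pre-trained representation.
    """
    n = len(X)
    for _ in range(steps):
        H, pred = forward(X, W1, w2)
        err = (pred - y) / n
        if not freeze_encoder:
            grad_H = np.outer(err, w2) * (1.0 - H ** 2)  # backprop through tanh
            W1 -= lr * (X.T @ grad_H)
        w2 -= lr * (H.T @ err)
    return W1, w2

# Synthetic stand-ins for MOF descriptors and two related properties
# (both depend on the same hidden features, as assumed in this sketch).
X = rng.normal(size=(200, 6))
hidden = np.tanh(X @ rng.normal(size=(6, 4)))
y_pretrain = hidden @ np.array([1.0, -0.5, 0.3, 0.8])   # data-rich property
y_target = hidden @ np.array([0.4, 0.9, -0.2, 0.1])     # data-poor property

W1 = rng.normal(scale=0.3, size=(6, 4))
w2 = np.zeros(4)

# "Pre-train" the encoder on the data-rich property...
W1, w2 = train(X, y_pretrain, W1, w2)

# ...then fine-tune a fresh head on only 30 examples of the second
# property, reusing the learned encoder.
w2_new = np.zeros(4)
_, w2_new = train(X[:30], y_target[:30], W1, w2_new, freeze_encoder=True)

_, pred = forward(X, W1, w2_new)
print(f"fine-tuned MSE on target property: {mse(pred, y_target):.3f}")
```

The point of the sketch is structural: the expensive encoder is trained once, and each new property only requires fitting a small head, which is why far fewer labelled examples suffice.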

The MOFTransformer is a game-changer for the study of MOFs, providing faster results with less data and a more comprehensive understanding of the material. The researchers hope that it will pave the way for the development of new MOFs with improved properties for hydrogen storage and other applications.

This story is adapted from material from EPFL, with editorial changes made by Materials Today. The views expressed in this article do not necessarily represent those of Elsevier.