Macs in Chemistry

Insanely Great Science

Molecular Transformer


When the paper "Molecular Transformer: A Model for Uncertainty-Calibrated Chemical Reaction Prediction" first appeared on the arXiv preprint server, it generated considerable interest.

From the abstract:

Organic synthesis is one of the key stumbling blocks in medicinal chemistry. A necessary yet unsolved step in planning synthesis is solving the forward problem: given reactants and reagents, predict the products. Similar to other work, we treat reaction prediction as a machine translation problem between SMILES strings of reactants-reagents and the products. We show that a multi-head attention Molecular Transformer model outperforms all algorithms in the literature, achieving a top-1 accuracy above 90% on a common benchmark dataset. Our algorithm requires no handcrafted rules, and accurately predicts subtle chemical transformations. Crucially, our model can accurately estimate its own uncertainty, with an uncertainty score that is 89% accurate in terms of classifying whether a prediction is correct. Furthermore, we show that the model is able to handle inputs without reactant-reagent split and including stereochemistry, which makes our method universally applicable.
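The "machine translation between SMILES strings" framing depends on first splitting each SMILES string into atom-level tokens, which the translation model then consumes as a sequence. As an illustration only, here is a minimal Python sketch of this preprocessing step; the regex follows the atom-level tokenisation pattern published with the Molecular Transformer code, and the `tokenize` function name is my own:

```python
import re

# Atom-level SMILES tokenisation regex, following the pattern
# published with the Molecular Transformer code.
SMILES_REGEX = re.compile(
    r"(\[[^\]]+]|Br?|Cl?|N|O|S|P|F|I|b|c|n|o|s|p|\(|\)|\."
    r"|=|#|-|\+|\\|\/|:|~|@|\?|>|\*|\$|\%[0-9]{2}|[0-9])"
)

def tokenize(smiles: str) -> list[str]:
    """Split a SMILES string into atom-level tokens."""
    tokens = SMILES_REGEX.findall(smiles)
    # Sanity check: the tokens must reassemble into the input exactly.
    assert "".join(tokens) == smiles
    return tokens

# Reactant/reagent and product SMILES become space-separated token
# sequences, the "source" and "target" sentences for translation.
print(" ".join(tokenize("CC(=O)Oc1ccccc1")))
```

Note that multi-character tokens such as `Cl`, `Br`, and bracketed atoms like `[Na+]` are kept whole, so the model never sees a chemically meaningless split.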

I just noticed that it has recently been updated.

If you are interested in this exciting area of chemistry, you might like to know that the code is available on GitHub and the trained model is available online.

One of the authors, Alpha Lee, is speaking at the 2nd Artificial Intelligence in Chemistry Meeting #AIChem19, 2nd to 3rd September 2019, Fitzwilliam College, Cambridge, UK. You can register for the meeting here if you would like to hear first-hand about this technology.


The full lineup of speakers is here. Also remember that bursaries are available for the meeting.
