Identifying transition states—saddle points on the potential energy surface connecting reactant and product minima—is central to predicting kinetic barriers and understanding chemical reaction mechanisms. In this work, we train a fully differentiable equivariant neural network potential, NewtonNet, on thousands of organic reactions and derive the analytical Hessians. By reducing the computational cost by several orders of magnitude relative to the density functional theory (DFT) ab initio source, we can afford to use the learned Hessians at every step of the saddle point optimizations. We show that the full machine-learned (ML) Hessian robustly finds the transition states of 240 unseen organic reactions, even when the quality of the initial guess structures is degraded, while reducing the number of optimization steps to convergence by 2–3× compared to quasi-Newton DFT and ML methods. All data generation, the NewtonNet model, and the ML transition state finding methods are available in an automated workflow.
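A minimal sketch of the idea behind using the full Hessian at every optimization step: Newton's method converges to the nearest stationary point, including first-order saddle points, when the exact Hessian is available. The toy 2-D potential, function names, and tolerances below are illustrative assumptions, not the paper's NewtonNet code or workflow.

```python
# Toy illustration (NOT the authors' NewtonNet implementation): full-Hessian
# Newton steps converge to a first-order saddle point from a rough guess.
import numpy as np

def energy(r):
    # Double well in x, harmonic in y: minima at (+-1, 0), saddle at (0, 0).
    x, y = r
    return x**4 - 2.0 * x**2 + y**2

def gradient(r):
    x, y = r
    return np.array([4.0 * x**3 - 4.0 * x, 2.0 * y])

def hessian(r):
    # Analytical Hessian of the toy surface (diagonal for this potential).
    x, _ = r
    return np.array([[12.0 * x**2 - 4.0, 0.0],
                     [0.0, 2.0]])

r = np.array([0.3, 0.2])  # rough initial guess near the saddle
for _ in range(20):
    g, H = gradient(r), hessian(r)
    if np.linalg.norm(g) < 1e-10:
        break
    r = r - np.linalg.solve(H, g)  # full Newton step, no quasi-Newton update

eigvals = np.linalg.eigvalsh(hessian(r))
print(r)        # converges to approximately (0, 0)
print(eigvals)  # exactly one negative eigenvalue: a first-order saddle
```

A quasi-Newton method would instead build up an approximate Hessian from gradient differences over many steps; having the exact (here, machine-learned) Hessian at every step is what allows the faster, more robust convergence the abstract reports.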
Yuan, Eric C.-Y., et al. "Analytical ab initio hessian from a deep learning potential for transition state optimization." Nature Communications, vol. 15, no. 1, Oct. 2024. https://doi.org/10.1038/s41467-024-52481-5
Yuan, Eric C.-Y., Kumar, Anup, Guan, Xingyi, Hermes, Eric D., Rosen, Andrew S., Zádor, Judit, Head-Gordon, Teresa, & Blau, Samuel M. (2024). Analytical ab initio hessian from a deep learning potential for transition state optimization. Nature Communications, 15(1). https://doi.org/10.1038/s41467-024-52481-5
Yuan, Eric C.-Y., Kumar, Anup, Guan, Xingyi, et al., "Analytical ab initio hessian from a deep learning potential for transition state optimization," Nature Communications 15, no. 1 (2024), https://doi.org/10.1038/s41467-024-52481-5
@article{osti_2466264,
author = {Yuan, Eric C.-Y. and Kumar, Anup and Guan, Xingyi and Hermes, Eric D. and Rosen, Andrew S. and Zádor, Judit and Head-Gordon, Teresa and Blau, Samuel M.},
title = {Analytical ab initio hessian from a deep learning potential for transition state optimization},
annote = {Identifying transition states—saddle points on the potential energy surface connecting reactant and product minima—is central to predicting kinetic barriers and understanding chemical reaction mechanisms. In this work, we train a fully differentiable equivariant neural network potential, NewtonNet, on thousands of organic reactions and derive the analytical Hessians. By reducing the computational cost by several orders of magnitude relative to the density functional theory (DFT) ab initio source, we can afford to use the learned Hessians at every step of the saddle point optimizations. We show that the full machine-learned (ML) Hessian robustly finds the transition states of 240 unseen organic reactions, even when the quality of the initial guess structures is degraded, while reducing the number of optimization steps to convergence by 2–3× compared to quasi-Newton DFT and ML methods. All data generation, the NewtonNet model, and the ML transition state finding methods are available in an automated workflow.},
doi = {10.1038/s41467-024-52481-5},
url = {https://www.osti.gov/biblio/2466264},
journal = {Nature Communications},
issn = {2041-1723},
number = {1},
volume = {15},
place = {United States},
publisher = {Nature Publishing Group},
year = {2024},
month = oct}
Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States)
Sponsoring Organization:
USDOE; USDOE Laboratory Directed Research and Development (LDRD) Program; USDOE Office of Science (SC), Basic Energy Sciences (BES). Chemical Sciences, Geosciences & Biosciences Division (CSGB); USDOE Office of Science (SC), Basic Energy Sciences (BES). Scientific User Facilities (SUF)