Enhancing Machine Learning Potentials through Transfer Learning across Chemical Elements.

Journal: Journal of Chemical Information and Modeling

Abstract

Machine learning potentials (MLPs) can enable simulations of ab initio accuracy at orders of magnitude lower computational cost. However, their effectiveness hinges on the availability of sufficiently large data sets to ensure robust generalization across chemical space and thermodynamic conditions. Generating such data sets can be labor-intensive, highlighting the need for innovative methods to train MLPs in data-scarce scenarios. Here, we introduce transfer learning of potential energy surfaces between chemically similar elements. Specifically, we leverage the trained MLP for silicon to initialize and expedite the training of an MLP for germanium. Using classical force field and ab initio data sets, we demonstrate that transfer learning surpasses traditional training from scratch in force prediction, leading to more stable simulations and improved temperature transferability. These advantages become even more pronounced as the training data set size decreases. We also observe positive transfer learning effects for most out-of-target properties. Our findings demonstrate that transfer learning across chemical elements is a promising technique for developing accurate and numerically stable MLPs, particularly in data-scarce regimes.
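
The paper does not include code; the listing below is only a minimal sketch of the core idea described in the abstract, namely initializing a target-element model with the weights of a pretrained source-element model and then fine-tuning on a small target data set. The toy MLPPotential architecture, the checkpoint name si_mlp.pt, and the random stand-in data are illustrative assumptions, not the authors' implementation.

    # Minimal sketch of cross-element transfer learning for an MLP potential,
    # assuming a simple fully connected network on fixed-size descriptors.
    # Architecture, file names, and hyperparameters are illustrative only.
    import torch
    import torch.nn as nn

    class MLPPotential(nn.Module):
        """Toy MLP mapping a per-structure descriptor vector to an energy."""
        def __init__(self, n_descriptors: int = 64, hidden: int = 128):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(n_descriptors, hidden),
                nn.Tanh(),
                nn.Linear(hidden, hidden),
                nn.Tanh(),
                nn.Linear(hidden, 1),  # predicted potential energy
            )

        def forward(self, descriptors: torch.Tensor) -> torch.Tensor:
            return self.net(descriptors).squeeze(-1)

    # 1) Pretrain (or load) a model for the source element, e.g. silicon.
    si_model = MLPPotential()
    # si_model.load_state_dict(torch.load("si_mlp.pt"))  # hypothetical checkpoint

    # 2) Initialize the germanium model with the silicon weights instead of
    #    a random initialization (the transfer-learning step).
    ge_model = MLPPotential()
    ge_model.load_state_dict(si_model.state_dict())

    # 3) Fine-tune on the (small) germanium data set; in a full MLP, forces
    #    would be obtained via autograd of the energy w.r.t. atomic positions.
    optimizer = torch.optim.Adam(ge_model.parameters(), lr=1e-4)
    loss_fn = nn.MSELoss()

    def fine_tune_step(descriptors: torch.Tensor, ref_energy: torch.Tensor) -> float:
        optimizer.zero_grad()
        loss = loss_fn(ge_model(descriptors), ref_energy)
        loss.backward()
        optimizer.step()
        return loss.item()

    # Random stand-in data; replace with germanium reference data.
    x = torch.randn(32, 64)
    y = torch.randn(32)
    print(f"loss after one step: {fine_tune_step(x, y):.4f}")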

Authors

  • Sebastien Röcken
    Professorship of Multiscale Modeling of Fluid Materials, Department of Engineering Physics and Computation, TUM School of Engineering and Design, Technical University of Munich, Garching 85748, Germany.
  • Julija Zavadlav
    Professorship of Multiscale Modeling of Fluid Materials, Department of Engineering Physics and Computation, TUM School of Engineering and Design, Technical University of Munich, Garching 85748, Germany.