
Machine-learned interatomic potential

From Wikipedia, the free encyclopedia

Machine-learned interatomic potentials (MLIPs), or simply machine learning potentials (MLPs), are interatomic potentials constructed by machine learning programs. Since the 1990s, researchers have used such programs to construct interatomic potentials by mapping atomic structures to their potential energies.

Such machine learning potentials promised to fill the gap between density functional theory, a highly accurate but computationally intensive modelling method, and empirically derived or intuitively approximated potentials, which were far lighter computationally but substantially less accurate. Improvements in artificial intelligence technology heightened the accuracy of MLPs while lowering their computational cost, increasing the role of machine learning in fitting potentials.[1][2]

Machine learning potentials began by using neural networks to tackle low-dimensional systems. While promising, these models could not systematically account for interatomic energy interactions; they could be applied to small molecules in a vacuum, or molecules interacting with frozen surfaces, but not much else – and even in these applications, the models often relied on force fields or potentials derived empirically or with simulations.[1] These models thus remained confined to academia.

Modern neural networks construct highly accurate and computationally light potentials, as theoretical understanding of materials science has increasingly been built into their architectures and preprocessing. Almost all are local, accounting for all interactions between an atom and its neighbors up to some cutoff radius. Some nonlocal models exist, but these have remained largely experimental for almost a decade. For most systems, reasonable cutoff radii enable highly accurate results.[1][3]
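The locality assumption described above can be illustrated with a short sketch (illustrative only, not drawn from the cited sources): each atom interacts only with the atoms inside a hard distance cutoff. Periodic boundary conditions, which real simulation codes handle, are omitted for simplicity.

```python
import numpy as np

def neighbors_within_cutoff(positions, cutoff):
    """Return, for each atom, the indices of atoms within the cutoff radius.

    `positions` is an (N, 3) array of Cartesian coordinates; periodic
    boundary conditions are omitted for simplicity.
    """
    positions = np.asarray(positions, dtype=float)
    # Pairwise distance matrix via broadcasting
    diff = positions[:, None, :] - positions[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    n = len(positions)
    return [
        [j for j in range(n) if j != i and dist[i, j] <= cutoff]
        for i in range(n)
    ]

# Three collinear atoms 1 unit apart: with a 1.5 cutoff, the middle atom
# sees both ends, while each end atom sees only the middle one.
pos = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]]
print(neighbors_within_cutoff(pos, 1.5))  # [[1], [0, 2], [1]]
```

A local potential then writes the total energy as a sum of per-atom contributions, each depending only on the atom's neighbor list.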

Almost all neural network potentials take atomic coordinates as input and output potential energies. For some, these atomic coordinates are first converted into atom-centered symmetry functions. A separate atomic neural network is then trained for each element; each atomic network is evaluated whenever that element occurs in the given structure, and the results are pooled together at the end. This process – in particular the atom-centered symmetry functions, which convey translational, rotational, and permutational invariances – has greatly improved machine learning potentials by significantly constraining the neural network search space. Other models use a similar process but emphasize bonds over atoms, using pair symmetry functions and training one network per atom pair.[1][4]
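A minimal sketch of one such descriptor, a radial atom-centered symmetry function of the Behler–Parrinello type from reference [4]; the parameter values below are illustrative, not taken from any fitted potential. Because the function depends only on the set of neighbor distances, it is unchanged by translating or rotating the structure, or by permuting the neighbors.

```python
import numpy as np

def cutoff_fn(r, r_c):
    """Smooth cutoff f_c(r) = 0.5*(cos(pi*r/r_c) + 1) for r < r_c, else 0."""
    return np.where(r < r_c, 0.5 * (np.cos(np.pi * r / r_c) + 1.0), 0.0)

def radial_symmetry_function(r_ij, eta, r_s, r_c):
    """Radial symmetry function G_i = sum_j exp(-eta*(r_ij - r_s)**2) * f_c(r_ij).

    `r_ij` holds the distances from atom i to its neighbors j. The sum over
    neighbors makes the descriptor permutationally invariant.
    """
    r_ij = np.asarray(r_ij, dtype=float)
    return float(np.sum(np.exp(-eta * (r_ij - r_s) ** 2) * cutoff_fn(r_ij, r_c)))

# Permutation invariance: reordering the neighbors leaves G unchanged.
g1 = radial_symmetry_function([1.0, 1.5, 2.0], eta=4.0, r_s=1.0, r_c=3.0)
g2 = radial_symmetry_function([2.0, 1.0, 1.5], eta=4.0, r_s=1.0, r_c=3.0)
print(g1 == g2)  # True
```

In a full model, a vector of such functions (with different `eta`, `r_s` values, plus angular terms) feeds the per-element atomic network, and the per-atom energies are summed to give the total.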

Other models learn their own descriptors rather than using predetermined symmetry-dictating functions. These models, called message-passing neural networks (MPNNs), are graph neural networks. Treating molecules as three-dimensional graphs (where atoms are nodes and bonds are edges), the model takes feature vectors describing the atoms as input, and iteratively updates these vectors as information about neighboring atoms is processed through message functions and convolutions. These feature vectors are then used to predict the final potentials. The flexibility of this method often results in stronger, more generalizable models. In 2017, the first-ever MPNN model (a deep tensor neural network) was used to calculate the properties of small organic molecules. Such technology was commercialized, leading to the development of Matlantis in 2022, which extracts properties through both the forward and backward passes.[citation needed]
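The message-passing loop can be sketched in a few lines. This is a toy illustration only: the weight matrices are random placeholders rather than trained values, and real MPNNs use learned message and update functions with edge features such as interatomic distances.

```python
import numpy as np

rng = np.random.default_rng(0)

def message_passing_energy(features, adjacency, w_msg, w_out, steps=2):
    """Toy message-passing readout: atoms exchange messages with bonded
    neighbors for a few steps, then per-atom contributions are summed.

    `features` is (N, d); `adjacency` is an (N, N) 0/1 bond matrix.
    """
    h = np.asarray(features, dtype=float)
    for _ in range(steps):
        messages = adjacency @ h @ w_msg  # aggregate neighbor features
        h = np.tanh(h + messages)         # update node states
    return float(np.sum(h @ w_out))       # pool per-atom terms into a scalar

# A three-atom chain (e.g. a bent triatomic): bonds 0-1 and 1-2.
adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]])
feats = rng.normal(size=(3, 4))
w_msg = rng.normal(size=(4, 4))
w_out = rng.normal(size=(4, 1))
energy = message_passing_energy(feats, adj, w_msg, w_out)
print(energy)
```

Because aggregation over neighbors and the final sum are both order-independent, relabeling the atoms (and permuting the adjacency matrix consistently) leaves the predicted energy unchanged, the same invariance that symmetry functions impose by hand.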

Gaussian Approximation Potential (GAP)


One popular class of machine-learned interatomic potential is the Gaussian Approximation Potential (GAP),[5][6][7] which combines compact descriptors of local atomic environments[8] with Gaussian process regression[9] to learn the potential energy surface of a given system. To date, the GAP framework has been used to successfully develop a number of MLIPs for various systems, including for elemental systems such as carbon,[10] silicon,[11] phosphorus,[12] and tungsten,[13] as well as for multicomponent systems such as Ge2Sb2Te5[14] and austenitic stainless steel, Fe7Cr2Ni.[15]
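A minimal sketch of the regression step underlying GAP-style potentials: Gaussian process regression with a squared-exponential kernel over descriptor vectors, fitted to reference energies. The one-dimensional toy data and hyperparameters below are illustrative, not from the GAP papers, and real GAP models use the SOAP descriptors of reference [8] with energies and forces fitted jointly.

```python
import numpy as np

def rbf_kernel(x, y, length_scale):
    """Squared-exponential kernel matrix between rows of x and rows of y."""
    d2 = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_fit_predict(X_train, y_train, X_test, noise=1e-6, length_scale=0.3):
    """Gaussian process regression mean prediction.

    X_train: (n, d) descriptors of training environments; y_train: (n,)
    reference energies (e.g. from DFT); X_test: (m, d) query descriptors.
    """
    K = rbf_kernel(X_train, X_train, length_scale) + noise * np.eye(len(X_train))
    alpha = np.linalg.solve(K, y_train)   # kernel regression weights
    return rbf_kernel(X_test, X_train, length_scale) @ alpha

# Fit a toy one-dimensional "potential energy surface" and interpolate.
X = np.linspace(0.8, 2.0, 20)[:, None]
y = (X[:, 0] - 1.2) ** 2                  # stand-in for DFT energies
X_q = np.array([[1.2], [1.6]])
print(gp_fit_predict(X, y, X_q))          # close to the true values [0.0, 0.16]
```

A practical appeal of the Gaussian process framework, noted in the GAP literature, is that it interpolates the training data smoothly and also provides predictive uncertainties (omitted in this sketch).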

References

  1. ^ a b c d Kocer, Emir; Ko, Tsz Wai; Behler, Jorg (2022). "Neural Network Potentials: A Concise Overview of Methods". Annual Review of Physical Chemistry. 73: 163–86. arXiv:2107.03727. Bibcode:2022ARPC...73..163K. doi:10.1146/annurev-physchem-082720-034254. PMID 34982580.
  2. ^ Blank, TB; Brown, SD; Calhoun, AW; Doren, DJ (1995). "Neural network models of potential energy surfaces". Journal of Chemical Physics. 103 (10): 4129–37. Bibcode:1995JChPh.103.4129B. doi:10.1063/1.469597.
  3. ^ Ghasemi, SA; Hofstetter, A; Saha, S; Goedecker, S (2015). "Interatomic potentials for ionic systems with density functional accuracy based on charge densities obtained by a neural network". Physical Review B. 92 (4): 045131. arXiv:1501.07344. Bibcode:2015PhRvB..92d5131G. doi:10.1103/PhysRevB.92.045131.
  4. ^ Behler, J; Parrinello, M (2007). "Generalized neural-network representation of high-dimensional potential-energy surfaces". Physical Review Letters. 98 (14): 146401. Bibcode:2007PhRvL..98n6401B. doi:10.1103/PhysRevLett.98.146401. PMID 17501293.
  5. ^ Bartók, Albert P.; Payne, Mike C.; Kondor, Risi; Csányi, Gábor (2010-04-01). "Gaussian Approximation Potentials: The Accuracy of Quantum Mechanics, without the Electrons". Physical Review Letters. 104 (13): 136403. arXiv:0910.1019. Bibcode:2010PhRvL.104m6403B. doi:10.1103/PhysRevLett.104.136403. PMID 20481899.
  6. ^ Bartók, Albert P.; De, Sandip; Poelking, Carl; Bernstein, Noam; Kermode, James R.; Csányi, Gábor; Ceriotti, Michele (December 2017). "Machine learning unifies the modeling of materials and molecules". Science Advances. 3 (12): e1701816. arXiv:1706.00179. Bibcode:2017SciA....3E1816B. doi:10.1126/sciadv.1701816. ISSN 2375-2548. PMC 5729016. PMID 29242828.
  7. ^ "Gaussian approximation potential – Machine learning atomistic simulation of materials and molecules". Retrieved 2024-04-04.
  8. ^ Bartók, Albert P.; Kondor, Risi; Csányi, Gábor (2013-05-28). "On representing chemical environments". Physical Review B. 87 (18): 184115. arXiv:1209.3140. Bibcode:2013PhRvB..87r4115B. doi:10.1103/PhysRevB.87.184115.
  9. ^ Rasmussen, Carl Edward; Williams, Christopher K. I. (2008). Gaussian processes for machine learning. Adaptive computation and machine learning (3. print ed.). Cambridge, Mass.: MIT Press. ISBN 978-0-262-18253-9.
  10. ^ Deringer, Volker L.; Csányi, Gábor (2017-03-03). "Machine learning based interatomic potential for amorphous carbon". Physical Review B. 95 (9): 094203. arXiv:1611.03277. Bibcode:2017PhRvB..95i4203D. doi:10.1103/PhysRevB.95.094203.
  11. ^ Bartók, Albert P.; Kermode, James; Bernstein, Noam; Csányi, Gábor (2018-12-14). "Machine Learning a General-Purpose Interatomic Potential for Silicon". Physical Review X. 8 (4): 041048. arXiv:1805.01568. Bibcode:2018PhRvX...8d1048B. doi:10.1103/PhysRevX.8.041048.
  12. ^ Deringer, Volker L.; Caro, Miguel A.; Csányi, Gábor (2020-10-29). "A general-purpose machine-learning force field for bulk and nanostructured phosphorus". Nature Communications. 11 (1): 5461. Bibcode:2020NatCo..11.5461D. doi:10.1038/s41467-020-19168-z. ISSN 2041-1723. PMC 7596484. PMID 33122630.
  13. ^ Szlachta, Wojciech J.; Bartók, Albert P.; Csányi, Gábor (2014-09-24). "Accuracy and transferability of Gaussian approximation potential models for tungsten". Physical Review B. 90 (10): 104108. Bibcode:2014PhRvB..90j4108S. doi:10.1103/PhysRevB.90.104108.
  14. ^ Mocanu, Felix C.; Konstantinou, Konstantinos; Lee, Tae Hoon; Bernstein, Noam; Deringer, Volker L.; Csányi, Gábor; Elliott, Stephen R. (2018-09-27). "Modeling the Phase-Change Memory Material, Ge2Sb2Te5, with a Machine-Learned Interatomic Potential". The Journal of Physical Chemistry B. 122 (38): 8998–9006. doi:10.1021/acs.jpcb.8b06476. ISSN 1520-6106. PMID 30173522.
  15. ^ Shenoy, Lakshmi; Woodgate, Christopher D.; Staunton, Julie B.; Bartók, Albert P.; Becquart, Charlotte S.; Domain, Christophe; Kermode, James R. (2024-03-22). "Collinear-spin machine learned interatomic potential for Fe7Cr2Ni alloy". Physical Review Materials. 8 (3): 033804. arXiv:2309.08689. doi:10.1103/PhysRevMaterials.8.033804.