The u:cris detail view:

Optimizing the architecture of Behler-Parrinello neural network potentials

Author(s)
Lukáš Kývala, Christoph Dellago
Abstract

The architecture of neural network potentials is typically optimized at the beginning of the training process and remains unchanged throughout. Here, we investigate the accuracy of Behler-Parrinello neural network potentials for varying training set sizes. Using the QM9 and 3BPA datasets, we show that adjusting the network architecture according to the training set size improves the accuracy significantly. We demonstrate that both an insufficient and an excessive number of fitting parameters can have a detrimental impact on the accuracy of the neural network potential. Furthermore, we investigate the influences of descriptor complexity, neural network depth, and activation function on the model’s performance. We find that for the neural network potentials studied here, two hidden layers yield the best accuracy and that unbounded activation functions outperform bounded ones.
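The Behler-Parrinello ansatz described in the abstract writes the total energy as a sum of per-atom contributions, each predicted by a small feed-forward network applied to that atom's descriptor vector. The following is a minimal sketch of that idea, not the authors' implementation: the descriptor size, hidden-layer width, and the use of softplus are illustrative assumptions, though the two hidden layers and the unbounded activation reflect the abstract's findings.

```python
import numpy as np

rng = np.random.default_rng(0)

def softplus(x):
    # Unbounded activation; the abstract reports that unbounded
    # activations outperform bounded ones (e.g. tanh).
    return np.log1p(np.exp(x))

def init_weights(n_in, n_hidden=20):
    # Two hidden layers, the depth the abstract found to give the
    # best accuracy; the width of 20 is an illustrative choice.
    sizes = [n_in, n_hidden, n_hidden, 1]
    return [(0.1 * rng.standard_normal((m, n)), np.zeros(m))
            for n, m in zip(sizes[:-1], sizes[1:])]

def atomic_energy(descriptor, weights):
    # Feed one atom's descriptor through the network; the last
    # layer is linear and returns a scalar energy contribution.
    h = descriptor
    for W, b in weights[:-1]:
        h = softplus(W @ h + b)
    W, b = weights[-1]
    return float(W @ h + b)

def total_energy(descriptors, weights):
    # Behler-Parrinello ansatz: total energy is the sum of atomic
    # contributions, which makes it invariant to atom ordering.
    return sum(atomic_energy(d, weights) for d in descriptors)

# Hypothetical usage: 5 atoms, each with an 8-component
# symmetry-function descriptor (stand-ins for real descriptors).
descriptors = rng.standard_normal((5, 8))
weights = init_weights(8)
E = total_energy(descriptors, weights)
```

Because the same atomic network is applied to every atom and the results are summed, permuting the atoms leaves the predicted energy unchanged, which is one of the built-in symmetries of this architecture.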

Organization(s)
Computational Physics and Soft Matter Physics
Journal
Journal of Chemical Physics
Volume
159
Number of pages
8
ISSN
0021-9606
DOI
https://doi.org/10.1063/5.0167260
Publication date
09-2023
Peer-reviewed
Yes
ÖFOS 2012
102019 Machine Learning, 103043 Computational Physics, 103029 Statistical Physics
ASJC Scopus subject areas
General Physics and Astronomy, Physical and Theoretical Chemistry
Link to portal
https://ucrisportal.univie.ac.at/de/publications/a14cd157-f5fc-48bb-9314-543b34703f41