id: 4
Oct. 22, 2021
Source code

MatDeepLearn

ORNL.gov | 2021

Fung, V.

Description:

Graph neural networks (GNNs) have received intense interest as a rapidly expanding class of machine learning models remarkably well-suited for materials applications. To date, a number of successful GNNs have been proposed and demonstrated for systems ranging from crystal stability to electronic property prediction and to surface chemistry and heterogeneous catalysis. However, a consistent benchmark of these models remains lacking, hindering the development and consistent evaluation of new models in the materials field. Here, we present a workflow and testing platform, MatDeepLearn, for quickly and reproducibly assessing and comparing GNNs and other machine learning models. We use this platform to optimize and evaluate a selection of top-performing GNNs on several representative datasets in computational materials chemistry. From our investigations, we note the importance of hyperparameter selection and find roughly similar performances for the top models once optimized. We identify several strengths of GNNs over conventional models for compositionally diverse datasets, as well as their overall flexibility with respect to inputs, owing to learned rather than predefined representations. Meanwhile, several weaknesses of GNNs are also observed, including high data requirements, and suggestions for further improvement for applications in materials chemistry are discussed.
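The sketch below illustrates, in generic terms, the kind of benchmarking loop the description outlines: each candidate model is hyperparameter-optimized before comparison on a held-out test set. It is a minimal sketch only, using scikit-learn on synthetic data; the model choices, parameter grids, and dataset are placeholders and do not reflect MatDeepLearn's actual API, GNN implementations, or datasets.

    # Minimal sketch of a hyperparameter-optimized model comparison,
    # in the spirit of the benchmarking workflow described above.
    # All names and data here are placeholders, not MatDeepLearn's API.
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.kernel_ridge import KernelRidge
    from sklearn.model_selection import GridSearchCV, train_test_split
    from sklearn.metrics import mean_absolute_error

    # Synthetic stand-in for a featurized materials dataset
    # (e.g., structures mapped to a target property such as band gap).
    X, y = make_regression(n_samples=500, n_features=50, noise=0.1, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0
    )

    # Candidate models with hyperparameter grids; a GNN wrapped in a
    # scikit-learn-compatible interface could be added to this dictionary.
    candidates = {
        "random_forest": (
            RandomForestRegressor(random_state=0),
            {"n_estimators": [100, 300], "max_depth": [None, 20]},
        ),
        "kernel_ridge": (
            KernelRidge(kernel="rbf"),
            {"alpha": [1e-2, 1e-1], "gamma": [1e-3, 1e-2]},
        ),
    }

    for name, (model, grid) in candidates.items():
        # Tune each model via cross-validated grid search before comparing,
        # mirroring the paper's emphasis on hyperparameter selection.
        search = GridSearchCV(model, grid, cv=5, scoring="neg_mean_absolute_error")
        search.fit(X_train, y_train)
        mae = mean_absolute_error(y_test, search.predict(X_test))
        print(f"{name}: best params={search.best_params_}, test MAE={mae:.3f}")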

Citation:

Fung, V., Zhang, J., Juarez, E., & Sumpter, B. G. (2021). Benchmarking graph neural networks for materials chemistry. npj Computational Materials, 7(1), 1-8.

Keywords:

Graph neural networks, band gap, materials property prediction

Curator: hujianju@gmail.com