Abstract
Despite much interest, physics knowledge discovery from experimental data remains largely a manual trial-and-error process. This paper proposes neural differential equation embedding (NeuraDiff), an end-to-end approach that learns a physics model, characterized by a set of partial differential equations, directly from experimental data. The key idea is the integration of two neural networks: a recognition net that extracts the values of the physics model variables from experimental data, and a neural differential equation net that simulates the temporal evolution of the physics model. Learning is driven by matching the outputs of the two networks. We apply NeuraDiff to the real-world task of tracking and learning the physics model of nano-scale defects in crystalline materials under irradiation and high temperature. Experimental results demonstrate that NeuraDiff produces highly accurate tracking results while capturing the correct dynamics of nano-scale defects.
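The two-network matching idea described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the recognition net and the differential-equation net are replaced by hypothetical stand-ins (a normalization map and one explicit-Euler diffusion step), and the training signal is the mean-squared mismatch between the simulated and the recognized next frame.

```python
import numpy as np

def recognition_net(frame):
    """Stand-in for the recognition net: map a raw frame to a
    physics-variable field scaled into [0, 1]."""
    lo, rng = frame.min(), np.ptp(frame)
    return (frame - lo) / (rng + 1e-8)

def pde_step(field, dt=0.1):
    """Stand-in for the neural differential-equation net: one explicit
    Euler step of 2-D diffusion with periodic boundaries, playing the
    role of the learned PDE right-hand side."""
    lap = (np.roll(field, 1, 0) + np.roll(field, -1, 0)
           + np.roll(field, 1, 1) + np.roll(field, -1, 1) - 4.0 * field)
    return field + dt * lap

def matching_loss(frame_t, frame_t1):
    """Match the simulated evolution of the recognized field at time t
    against the field recognized directly from the frame at time t+1."""
    u_sim = pde_step(recognition_net(frame_t))
    u_obs = recognition_net(frame_t1)
    return float(np.mean((u_sim - u_obs) ** 2))

# Synthetic two-frame "experiment": the second frame is one PDE step ahead.
rng = np.random.default_rng(0)
f0 = rng.random((16, 16))
f1 = pde_step(f0)
loss = matching_loss(f0, f1)
```

In the paper's setting both stand-ins would be trainable networks and the loss would be minimized end-to-end over observed frame sequences; this sketch only shows how the two outputs are compared.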
Acknowledgements
This research was supported by NSF grants IIS-1850243 and CCF-1918327. We thank the anonymous reviewers for their comments and suggestions.
© 2021 Springer Nature Switzerland AG
Cite this paper
Xue, Y., Nasim, M., Zhang, M., Fan, C., Zhang, X., El-Azab, A. (2021). Physics Knowledge Discovery via Neural Differential Equation Embedding. In: Dong, Y., Kourtellis, N., Hammer, B., Lozano, J.A. (eds.) Machine Learning and Knowledge Discovery in Databases. Applied Data Science Track. ECML PKDD 2021. Lecture Notes in Computer Science, vol. 12979. Springer, Cham. https://doi.org/10.1007/978-3-030-86517-7_8
Print ISBN: 978-3-030-86516-0
Online ISBN: 978-3-030-86517-7