
Constructing Neural Network Based Models for Simulating Dynamical Systems

Published: 09 February 2023

Abstract

Dynamical systems see widespread use in natural sciences such as physics, biology, and chemistry, as well as in engineering disciplines such as circuit analysis, computational fluid dynamics, and control. For simple systems, the differential equations governing the dynamics can be derived by applying fundamental physical laws. For more complex systems, however, this approach becomes exceedingly difficult. Data-driven modeling is an alternative paradigm that seeks to learn an approximation of a system's dynamics from observations of the true system. In recent years, there has been increased interest in applying data-driven modeling techniques to a wide range of problems in physics and engineering. This article surveys the different ways to construct models of dynamical systems using neural networks. In addition to the basic overview, we review the related literature and outline the most significant challenges, drawn from numerical simulation, that this modeling paradigm must overcome. Based on the reviewed literature and identified challenges, we discuss promising research areas.
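As a minimal illustration of the data-driven paradigm the abstract describes, the sketch below fits a small neural network to finite-difference derivative estimates of a simulated trajectory and then uses the learned model as a simulator. The choice of system (a damped harmonic oscillator), the network size, and all hyperparameters are assumptions made for this example and do not come from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# "True" system (an assumption for this sketch): a damped harmonic oscillator,
# x' = v, v' = -x - 0.1 v.
def f_true(state):
    x, v = state
    return np.array([v, -x - 0.1 * v])

# Generate a trajectory with forward Euler to stand in for observations of the true system.
dt, steps = 0.01, 2000
traj = np.empty((steps, 2))
traj[0] = [1.0, 0.0]
for k in range(steps - 1):
    traj[k + 1] = traj[k] + dt * f_true(traj[k])

# Training data: observed states and finite-difference estimates of their derivatives.
X = traj[:-1]
Y = (traj[1:] - traj[:-1]) / dt

# One-hidden-layer network approximating the vector field, trained by
# full-batch gradient descent on the mean-squared derivative error.
W1 = rng.normal(0, 0.5, (2, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.5, (32, 2)); b2 = np.zeros(2)
lr = 0.05
for _ in range(2000):
    H = np.tanh(X @ W1 + b1)            # hidden activations
    pred = H @ W2 + b2                  # predicted dx/dt
    err = pred - Y
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    dH = (err @ W2.T) * (1 - H ** 2)    # backprop through tanh
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Simulate the learned dynamics from the same initial state and compare.
sim = np.empty_like(traj)
sim[0] = traj[0]
for k in range(steps - 1):
    h = np.tanh(sim[k] @ W1 + b1)
    sim[k + 1] = sim[k] + dt * (h @ W2 + b2)

print(np.max(np.abs(sim - traj)))  # small when the learned dynamics fit well
```

Here the model is trained on one-step derivative targets but evaluated by rolling it out over the full trajectory, which is where the simulation-related challenges the article identifies (error accumulation, stability, stiffness) become visible.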






Published In

ACM Computing Surveys, Volume 55, Issue 11
November 2023
849 pages
ISSN: 0360-0300
EISSN: 1557-7341
DOI: 10.1145/3572825

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 09 February 2023
Online AM: 16 November 2022
Accepted: 04 October 2022
Revised: 23 September 2022
Received: 02 November 2021
Published in CSUR Volume 55, Issue 11


Author Tags

  1. Neural ODEs
  2. physics-informed neural networks
  3. physics-based regularization

Qualifiers

  • Survey
  • Refereed

Funding Sources

  • Poul Due Jensen Foundation
  • MADE Digital project
  • Data Model Convergence (DMC) initiative via the Laboratory Directed Research and Development (LDRD) investments at Pacific Northwest National Laboratory (PNNL)
  • Battelle Memorial Institute


Cited By

  • Artificial Intelligence in Modeling and Simulation. Algorithms 17, 6 (2024), 265. DOI: 10.3390/a17060265. Online publication date: 15-Jun-2024.
  • The rise of scientific machine learning: A perspective on combining mechanistic modelling with machine learning for systems biology. Frontiers in Systems Biology 4 (2024). DOI: 10.3389/fsysb.2024.1407994. Online publication date: 2-Aug-2024.
  • Comparing a linear transfer function-noise model and a neural network to model boiler bank fouling in a kraft recovery boiler. TAPPI Journal 23, 7 (2024), 374-384. DOI: 10.32964/TJ23.7.374. Online publication date: 29-Jul-2024.
  • AI Simulation by Digital Twins: Systematic Survey of the State of the Art and a Reference Framework. In Proceedings of the ACM/IEEE 27th International Conference on Model Driven Engineering Languages and Systems (2024), 401-412. DOI: 10.1145/3652620.3688253. Online publication date: 22-Sep-2024.
  • Trajectory Inference of Unknown Linear Systems Based on Partial States Measurements. IEEE Transactions on Systems, Man, and Cybernetics: Systems 54, 4 (2024), 2276-2286. DOI: 10.1109/TSMC.2023.3344017. Online publication date: Apr-2024.
  • Reservoir Computing for Drone Trajectory Intent Prediction: A Physics Informed Approach. IEEE Transactions on Cybernetics 54, 9 (2024), 4939-4948. DOI: 10.1109/TCYB.2024.3379381. Online publication date: Sep-2024.
  • Comparison of Neural Network Training Approaches That Preserve Physical Properties of Cyber-Physical System. In 2024 XIV Brazilian Symposium on Computing Systems Engineering (SBESC), 1-6. DOI: 10.1109/SBESC65055.2024.10771819. Online publication date: 26-Nov-2024.
  • A Novel Physics-Informed Recurrent Neural Network Approach for State Estimation of Autonomous Platforms. In 2024 International Joint Conference on Neural Networks (IJCNN), 1-7. DOI: 10.1109/IJCNN60899.2024.10650856. Online publication date: 30-Jun-2024.
  • DeformNet: Latent Space Modeling and Dynamics Prediction for Deformable Object Manipulation. In 2024 IEEE International Conference on Robotics and Automation (ICRA), 14770-14776. DOI: 10.1109/ICRA57147.2024.10611243. Online publication date: 13-May-2024.
  • Physics-Informed Neural Networks for Continuum Robots: Towards Fast Approximation of Static Cosserat Rod Theory. In 2024 IEEE International Conference on Robotics and Automation (ICRA), 17293-17299. DOI: 10.1109/ICRA57147.2024.10610742. Online publication date: 13-May-2024.
