Understanding Physics-Informed Neural Networks: Techniques, Applications, Trends, and Challenges
<p>The general workflow for integrating prior knowledge into a neural network architecture to create a Physics-Informed Neural Network (PINN).</p>
Figure 2
<p>The process depicted involves integrating the residuals of the differential equations (DEs) with dynamics <math display="inline"><semantics> <mrow> <mi>N</mi> <mo>(</mo> <mo>)</mo> </mrow> </semantics></math> into the loss function to improve the accuracy of the neural network (NN) model. Here, <math display="inline"><semantics> <mi>ϵ</mi> </semantics></math> represents the acceptable margin of loss. In this context, the primary role of the neural network is to identify the optimal parameters that minimize the specified loss function while adhering to the initial and boundary conditions. Meanwhile, the supplementary cost function ensures that the constraints of the ODEs/PDEs are met, representing the physical information aspect of the neural network.</p>
Abstract
1. Introduction
2. Background
Related Work and Contribution
3. Methodologies
3.1. Feature Engineering
3.2. Model Construction
3.3. Incorporating Physical Laws as an Additional Cost Function
4. Utility of PINNs
- Unknown Parameters: For example, in wind engineering [121], the equations of fluid dynamics are established, but coefficients related to turbulent flow are uncertain.
4.1. Solving PDEs and ODEs with PINNs
- Finite Difference Methods: These discretize the spatial and temporal domains of PDEs and ODEs into a grid, approximating derivatives using finite differences. PINNs are trained to learn the solution directly from data at these grid points, leveraging neural networks’ flexibility and scalability [126].
- Collocation Method: This method enforces differential equations at discrete collocation points throughout the domain. PINNs minimize the residual of the PDEs or ODEs at these points, approximating the solution while satisfying the equations [127].
- Boundary Integral Methods: These represent the solution to PDEs as an integral over the domain’s boundary, reducing dimensionality and simplifying the numerical solution. PINNs trained to learn the boundary integral can solve PDEs with reduced computational cost and memory requirements [128].
- Deep Galerkin Methods: These seek to minimize the residual of PDEs or ODEs over the entire domain. PINNs are trained to solve the equations in a strong sense by incorporating physical principles directly into the loss function [129].
- Time-Stepping Methods: These discretize the temporal domain of time-dependent PDEs and ODEs into time steps, evolving the solution forward using iterative updates. PINNs learn the time evolution of the solution directly from data, utilizing neural networks’ parallelism and scalability [130].
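As a dependency-free illustration of the collocation idea listed above, the sketch below minimizes a PINN-style composite loss (squared ODE residual at collocation points plus a weighted initial-condition penalty) for the test problem u′ + u = 0, u(0) = 1, whose exact solution is exp(−t). A polynomial ansatz stands in for the neural network so that derivatives are analytic and the quadratic loss can be minimized in closed form by least squares; an actual PINN would differentiate the network output with automatic differentiation and train by gradient descent. All variable names are illustrative.

```python
import numpy as np

# Collocation sketch for u'(t) + u(t) = 0, u(0) = 1 on [0, 1].
# The trial solution is u(t) = sum_k c_k t^k (a stand-in for a neural net).
degree = 6
t = np.linspace(0.0, 1.0, 50)          # collocation points

# Residual of u' + u for each basis function t^k: k*t^(k-1) + t^k.
cols = []
for k in range(degree + 1):
    deriv = k * t ** (k - 1) if k > 0 else np.zeros_like(t)
    cols.append(deriv + t ** k)
A = np.stack(cols, axis=1)             # (n_points, n_coeffs) residual matrix

# Soft initial-condition penalty: w * (c_0 - 1) should vanish.
w = 10.0
bc_row = np.zeros((1, degree + 1))
bc_row[0, 0] = w
A_full = np.vstack([A, bc_row])
b = np.append(np.zeros(t.size), w)

# Least squares minimizes sum(residual^2) + w^2 * (c_0 - 1)^2,
# i.e., the physics loss plus the boundary loss in one shot.
coef, *_ = np.linalg.lstsq(A_full, b, rcond=None)

u1 = np.polyval(coef[::-1], 1.0)       # fitted u(1), close to exp(-1)
print(u1)
```

The same two-term loss structure (equation residual plus initial/boundary penalty) is what a PINN optimizer descends; only the ansatz and the differentiation machinery change.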
4.2. Inverse Problems
- Forward Model:
- Parameter Estimation:
- Uncertainty Quantification:
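To make the parameter-estimation step concrete, the hypothetical sketch below recovers an unknown decay rate lam in u′ = −lam·u from noisy samples of u by driving the physics residual u′ + lam·u to zero in a least-squares sense. Finite differences replace the automatic differentiation a PINN would apply to its network surrogate, and the names (lam_true, lam_hat) are invented for this example.

```python
import numpy as np

# Inverse-problem sketch: infer the decay rate from observed data
# using the governing equation u' + lam * u = 0 as the constraint.
rng = np.random.default_rng(0)
lam_true = 2.0
t = np.linspace(0.0, 2.0, 201)
u = np.exp(-lam_true * t) + 1e-3 * rng.standard_normal(t.size)  # noisy obs.

du = np.gradient(u, t)                 # finite-difference derivative of data

# Least-squares solution of du + lam * u = 0 over all samples:
#   lam = -<du, u> / <u, u>
lam_hat = -np.dot(du, u) / np.dot(u, u)
print(lam_hat)                         # close to the true value 2.0
```

In a full PINN formulation, lam would be a trainable scalar optimized jointly with the network weights against the same residual, with the data misfit term supplying the forward-model constraint.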
4.3. Domain Decomposition
5. Applications
5.1. Fluid Dynamics
5.2. Material Science
5.3. Structural Systems
5.4. Quantum Mechanics
5.5. Geophysics
5.6. Energy Systems
5.7. Oncology
6. Challenges and Limitations
6.1. Data-Related Issues
6.2. Computational Challenges
6.3. Integration of Complex Physics
6.4. Generalization and Robustness
7. Future Directions
7.1. Algorithmic Advancements
7.2. Interdisciplinary Collaborations
- Physics and Machine Learning: Physicists provide insights into physical principles, while machine learning specialists develop and optimize algorithms.
- Engineering and Data Science: Engineers contribute domain-specific knowledge in fields like fluid dynamics and structural mechanics, while data scientists handle data preprocessing, feature engineering, and model refinement.
- Medicine and Healthcare: Medical researchers and healthcare professionals work with machine learning experts to improve personalized medicine, disease diagnosis, and healthcare optimization using PINNs.
- Climate Science and Environmental Engineering: Climate scientists and environmental engineers collaborate with machine learning researchers to address climate change, environmental sustainability, and natural disaster mitigation.
- Materials Science, Nanotechnology, Robotics, and Autonomous Systems: In materials science and nanotechnology, PINNs aid in discovering and optimizing new materials. In robotics, they enhance autonomous systems and control strategies. These interdisciplinary efforts leverage synergies, advancing innovation and knowledge across diverse fields.
7.3. Enhancements in Interpretability
- Explainable Model Architectures: Use of sparse NNs and decision trees to clarify relationships between inputs and predictions.
- Feature Importance Analysis: Techniques like feature attribution and sensitivity analysis identify influential features and how input changes affect predictions.
- Visualization Techniques: Saliency maps and activation maximization help users understand model behavior and reasoning.
- Rule Extraction and Symbolic Reasoning: Provide concise representations of learned relationships for better comprehension.
- Domain-Specific Interpretability: Tailored to specific scientific and engineering contexts for relevant insights.
- Model-Agnostic Techniques: Surrogate models and global explanation methods facilitate understanding across different model types.
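A minimal, model-agnostic illustration of the sensitivity analysis mentioned above: perturb each input of a surrogate model and compare central-difference responses to rank feature influence. The function f and its coefficients are invented purely for demonstration; gradient-based attribution on a trained PINN would follow the same pattern with autodiff in place of finite differences.

```python
import numpy as np

# Toy surrogate whose output depends strongly on x[0] and weakly on x[2].
def f(x):
    return 3.0 * x[0] ** 2 + 0.5 * x[1] - 0.01 * x[2]

x0 = np.array([1.0, 1.0, 1.0])         # point at which to probe the model
eps = 1e-6

# Central-difference sensitivity of f with respect to each input.
sens = np.array([(f(x0 + eps * e) - f(x0 - eps * e)) / (2 * eps)
                 for e in np.eye(3)])
print(np.abs(sens))                    # largest entry flags x[0] as dominant
```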
7.4. Digital Twins
7.5. Large-Scale and Real-Time Applications
8. Conclusions
Author Contributions
Funding
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Gurney, K. An Introduction to Neural Networks; CRC Press: Boca Raton, FL, USA, 2018. [Google Scholar]
- Crick, F. The recent excitement about neural networks. Nature 1989, 337, 129–132. [Google Scholar] [CrossRef]
- Emmert-Streib, F. A heterosynaptic learning rule for neural networks. Int. J. Mod. Phys. C 2006, 17, 1501–1520. [Google Scholar] [CrossRef]
- Bishop, C.M. Neural networks and their applications. Rev. Sci. Instruments 1994, 65, 1803–1832. [Google Scholar] [CrossRef]
- Abiodun, O.I.; Jantan, A.; Omolara, A.E.; Dada, K.V.; Mohamed, N.A.; Arshad, H. State-of-the-art in artificial neural network applications: A survey. Heliyon 2018, 4, e00938. [Google Scholar] [CrossRef]
- Cuomo, S.; Di Cola, V.S.; Giampaolo, F.; Rozza, G.; Raissi, M.; Piccialli, F. Scientific machine learning through physics–informed neural networks: Where we are and what’s next. J. Sci. Comput. 2022, 92, 88. [Google Scholar] [CrossRef]
- Raissi, M.; Perdikaris, P.; Karniadakis, G.E. Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations. J. Comput. Phys. 2019, 378, 686–707. [Google Scholar] [CrossRef]
- Karniadakis, G.E.; Kevrekidis, I.G.; Lu, L.; Perdikaris, P.; Wang, S.; Yang, L. Physics-informed machine learning. Nat. Rev. Phys. 2021, 3, 422–440. [Google Scholar] [CrossRef]
- Mowlavi, S.; Nabi, S. Optimal control of PDEs using physics-informed neural networks. J. Comput. Phys. 2023, 473, 111731. [Google Scholar] [CrossRef]
- Shin, Y.; Darbon, J.; Karniadakis, G.E. On the Convergence of Physics Informed Neural Networks for Linear Second-Order Elliptic and Parabolic Type PDEs. Commun. Comput. Phys. 2020, 28. [Google Scholar] [CrossRef]
- Meng, X.; Li, Z.; Zhang, D.; Karniadakis, G.E. PPINN: Parareal physics-informed neural network for time-dependent PDEs. Comput. Methods Appl. Mech. Eng. 2020, 370, 113250. [Google Scholar] [CrossRef]
- Cai, S.; Mao, Z.; Wang, Z.; Yin, M.; Karniadakis, G.E. Physics-informed neural networks (PINNs) for fluid mechanics: A review. Acta Mech. Sin. 2021, 37, 1727–1738. [Google Scholar] [CrossRef]
- Chen, Y.; Lu, L.; Karniadakis, G.E.; Dal Negro, L. Physics-informed neural networks for inverse problems in nano-optics and metamaterials. Opt. Express 2020, 28, 11618–11633. [Google Scholar] [CrossRef] [PubMed]
- Jordan, M.I.; Mitchell, T.M. Machine learning: Trends, perspectives, and prospects. Science 2015, 349, 255–260. [Google Scholar] [CrossRef]
- Mohri, M.; Rostamizadeh, A.; Talwalkar, A. Foundations of Machine Learning; MIT Press: Cambridge, MA, USA, 2018. [Google Scholar]
- Yazdani, S.; Tahani, M. Data-driven discovery of turbulent flow equations using physics-informed neural networks. Phys. Fluids 2024, 36, 035107. [Google Scholar] [CrossRef]
- Camporeale, E.; Wilkie, G.J.; Drozdov, A.Y.; Bortnik, J. Data-driven discovery of Fokker-Planck equation for the Earth’s radiation belts electrons using Physics-Informed neural networks. J. Geophys. Res. Space Phys. 2022, 127, e2022JA030377. [Google Scholar] [CrossRef]
- Chen, Z.; Liu, Y.; Sun, H. Physics-informed learning of governing equations from scarce data. Nat. Commun. 2021, 12, 6136. [Google Scholar] [CrossRef]
- Emmert-Streib, F.; Moutari, S.; Dehmer, M. Elements of Data Science, Machine Learning, and Artificial Intelligence Using R; Springer: Berlin/Heidelberg, Germany.
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. Imagenet classification with deep convolutional neural networks. Adv. Neural Inf. Process. Syst. 2012, 25. [Google Scholar] [CrossRef]
- Sallam, A.A.; Amery, H.A.; Saeed, A.Y. Iris recognition system using deep learning techniques. Int. J. Biom. 2023, 15, 705–725. [Google Scholar] [CrossRef]
- Sallam, A.A.; Mohammed, B.A.; Abdulbari, M. A Dorsal Hand Vein Recognition System based on Various Machine and Deep Learning Classification Techniques. In Proceedings of the IEEE 2023 3rd International Conference on Computing and Information Technology (ICCIT), Tabuk, Saudi Arabia, 13–14 September 2023; pp. 493–501. [Google Scholar]
- Hinton, G.; Deng, L.; Yu, D.; Dahl, G.E.; Mohamed, A.r.; Jaitly, N.; Senior, A.; Vanhoucke, V.; Nguyen, P.; Sainath, T.N.; et al. Deep neural networks for acoustic modeling in speech recognition: The shared views of four research groups. IEEE Signal Process. Mag. 2012, 29, 82–97. [Google Scholar] [CrossRef]
- Farea, A.; Tripathi, S.; Glazko, G.; Emmert-Streib, F. Investigating the optimal number of topics by advanced text-mining techniques: Sustainable energy research. Eng. Appl. Artif. Intell. 2024, 136, 108877. [Google Scholar] [CrossRef]
- Farea, A.; Emmert-Streib, F. Experimental Design of Extractive Question-Answering Systems: Influence of Error Scores and Answer Length. J. Artif. Intell. Res. 2024, 80, 87–125. [Google Scholar] [CrossRef]
- Sharma, T.; Farea, A.; Perera, N.; Emmert-Streib, F. Exploring COVID-related relationship extraction: Contrasting data sources and analyzing misinformation. Heliyon 2024, 10, e26973. [Google Scholar] [CrossRef] [PubMed]
- Wu, Y.; Schuster, M.; Chen, Z.; Le, Q.V.; Norouzi, M.; Macherey, W.; Krikun, M.; Cao, Y.; Gao, Q.; Macherey, K.; et al. Google’s neural machine translation system: Bridging the gap between human and machine translation. arXiv 2016, arXiv:1609.08144. [Google Scholar]
- Brunton, S.L.; Noack, B.R.; Koumoutsakos, P. Machine learning for fluid mechanics. Annu. Rev. Fluid Mech. 2020, 52, 477–508. [Google Scholar] [CrossRef]
- Libbrecht, M.W.; Noble, W.S. Machine learning applications in genetics and genomics. Nat. Rev. Genet. 2015, 16, 321–332. [Google Scholar] [CrossRef]
- Lake, B.M.; Salakhutdinov, R.; Tenenbaum, J.B. Human-level concept learning through probabilistic program induction. Science 2015, 350, 1332–1338. [Google Scholar] [CrossRef]
- Alipanahi, B.; Delong, A.; Weirauch, M.T.; Frey, B.J. Predicting the sequence specificities of DNA-and RNA-binding proteins by deep learning. Nat. Biotechnol. 2015, 33, 831–838. [Google Scholar] [CrossRef] [PubMed]
- Rafiei, M.H.; Adeli, H. A novel machine learning-based algorithm to detect damage in high-rise building structures. Struct. Des. Tall Spec. Build. 2017, 26, e1400. [Google Scholar] [CrossRef]
- Kim, S.W.; Kim, I.; Lee, J.; Lee, S. Knowledge Integration into deep learning in dynamical systems: An overview and taxonomy. J. Mech. Sci. Technol. 2021, 35, 1331–1342. [Google Scholar] [CrossRef]
- Lawal, Z.K.; Yassin, H.; Lai, D.T.C.; Che Idris, A. Physics-informed neural network (PINN) evolution and beyond: A systematic literature review and bibliometric analysis. Big Data Cogn. Comput. 2022, 6, 140. [Google Scholar] [CrossRef]
- Sharma, P.; Chung, W.T.; Akoush, B.; Ihme, M. A review of physics-informed machine learning in fluid mechanics. Energies 2023, 16, 2343. [Google Scholar] [CrossRef]
- Kashinath, K.; Mustafa, M.; Albert, A.; Wu, J.; Jiang, C.; Esmaeilzadeh, S.; Azizzadenesheli, K.; Wang, R.; Chattopadhyay, A.; Singh, A.; et al. Physics-informed machine learning: Case studies for weather and climate modelling. Philos. Trans. R. Soc. A 2021, 379, 20200093. [Google Scholar] [CrossRef] [PubMed]
- Latrach, A.; Malki, M.L.; Morales, M.; Mehana, M.; Rabiei, M. A critical review of physics-informed machine learning applications in subsurface energy systems. Geoenergy Sci. Eng. 2024, 239, 212938. [Google Scholar] [CrossRef]
- Huang, B.; Wang, J. Applications of physics-informed neural networks in power systems-a review. IEEE Trans. Power Syst. 2022, 38, 572–588. [Google Scholar] [CrossRef]
- Pateras, J.; Rana, P.; Ghosh, P. A taxonomic survey of physics-informed machine learning. Appl. Sci. 2023, 13, 6892. [Google Scholar] [CrossRef]
- Yang, Y.; Perdikaris, P. Adversarial uncertainty quantification in physics-informed neural networks. J. Comput. Phys. 2019, 394, 136–152. [Google Scholar] [CrossRef]
- Zhu, Y.; Zabaras, N.; Koutsourelakis, P.S.; Perdikaris, P. Physics-constrained deep learning for high-dimensional surrogate modeling and uncertainty quantification without labeled data. J. Comput. Phys. 2019, 394, 56–81. [Google Scholar] [CrossRef]
- Sun, L.; Gao, H.; Pan, S.; Wang, J.X. Surrogate modeling for fluid flows based on physics-constrained deep learning without simulation data. Comput. Methods Appl. Mech. Eng. 2020, 361, 112732. [Google Scholar] [CrossRef]
- Jagtap, A.D.; Kharazmi, E.; Karniadakis, G.E. Conservative physics-informed neural networks on discrete domains for conservation laws: Applications to forward and inverse problems. Comput. Methods Appl. Mech. Eng. 2020, 365, 113028. [Google Scholar] [CrossRef]
- Raissi, M.; Perdikaris, P.; Karniadakis, G.E. Inferring solutions of differential equations using noisy multi-fidelity data. J. Comput. Phys. 2017, 335, 736–746. [Google Scholar] [CrossRef]
- Raissi, M.; Perdikaris, P.; Karniadakis, G.E. Machine learning of linear differential equations using Gaussian processes. J. Comput. Phys. 2017, 348, 683–693. [Google Scholar] [CrossRef]
- Raissi, M.; Karniadakis, G.E. Hidden physics models: Machine learning of nonlinear partial differential equations. J. Comput. Phys. 2018, 357, 125–141. [Google Scholar] [CrossRef]
- Di Lorenzo, D.; Champaney, V.; Marzin, J.; Farhat, C.; Chinesta, F. Physics informed and data-based augmented learning in structural health diagnosis. Comput. Methods Appl. Mech. Eng. 2023, 414, 116186. [Google Scholar] [CrossRef]
- Emmert-Streib, F.; Yang, Z.; Feng, H.; Tripathi, S.; Dehmer, M. An introductory review of deep learning for prediction models with big data. Front. Artif. Intell. 2020, 3, 4. [Google Scholar] [CrossRef]
- Agatonovic-Kustrin, S.; Beresford, R. Basic concepts of artificial neural network (ANN) modeling and its application in pharmaceutical research. J. Pharm. Biomed. Anal. 2000, 22, 717–727. [Google Scholar] [CrossRef] [PubMed]
- Dongare, A.; Kharde, R.; Kachare, A.D. Introduction to artificial neural network. Int. J. Eng. Innov. Technol. (IJEIT) 2012, 2, 189–194. [Google Scholar]
- Zou, J.; Han, Y.; So, S.S. Overview of artificial neural networks. In Artificial Neural Networks: Methods and Applications; Springer: Berlin/Heidelberg, Germany, 2009; pp. 14–22. [Google Scholar] [CrossRef]
- Mehrotra, K.; Mohan, C.K.; Ranka, S. Elements of Artificial Neural Networks; MIT Press: Cambridge, MA, USA, 1997. [Google Scholar]
- Daw, A.; Karpatne, A.; Watkins, W.D.; Read, J.S.; Kumar, V. Physics-guided neural networks (pgnn): An application in lake temperature modeling. In Knowledge Guided Machine Learning; Chapman and Hall/CRC: Boca Raton, FL, USA, 2022; pp. 353–372. [Google Scholar]
- Zhang, R.; Tao, H.; Wu, L.; Guan, Y. Transfer learning with neural networks for bearing fault diagnosis in changing working conditions. IEEE Access 2017, 5, 14347–14357. [Google Scholar] [CrossRef]
- Pfrommer, J.; Zimmerling, C.; Liu, J.; Kärger, L.; Henning, F.; Beyerer, J. Optimisation of manufacturing process parameters using deep neural networks as surrogate models. Procedia CiRP 2018, 72, 426–431. [Google Scholar] [CrossRef]
- Chao, M.A.; Kulkarni, C.; Goebel, K.; Fink, O. Fusing physics-based and deep learning models for prognostics. Reliab. Eng. Syst. Saf. 2022, 217, 107961. [Google Scholar] [CrossRef]
- Wang, F.; Zhang, Q.J. Knowledge-based neural models for microwave design. IEEE Trans. Microw. Theory Tech. 1997, 45, 2333–2343. [Google Scholar] [CrossRef]
- Yuan, F.G.; Zargar, S.A.; Chen, Q.; Wang, S. Machine learning for structural health monitoring: Challenges and opportunities. Sens. Smart Struct. Technol. Civ. Mech. Aerosp. Syst. 2020 2020, 11379, 1137903. [Google Scholar]
- Li, Z.; Liu, F.; Yang, W.; Peng, S.; Zhou, J. A survey of convolutional neural networks: Analysis, applications, and prospects. IEEE Trans. Neural Netw. Learn. Syst. 2021, 33, 6999–7019. [Google Scholar] [CrossRef] [PubMed]
- Gu, J.; Wang, Z.; Kuen, J.; Ma, L.; Shahroudy, A.; Shuai, B.; Liu, T.; Wang, X.; Wang, G.; Cai, J.; et al. Recent advances in convolutional neural networks. Pattern Recognit. 2018, 77, 354–377. [Google Scholar] [CrossRef]
- Sallam, A.; Gaid, A.S.; Saif, W.Q.; Hana’a, A.; Abdulkareem, R.A.; Ahmed, K.J.; Saeed, A.Y.; Radman, A. Early detection of glaucoma using transfer learning from pre-trained cnn models. In Proceedings of the IEEE 2021 International Conference of Technology, Science and Administration (ICTSA), Taiz, Yemen, 22–24 March 2021; pp. 1–5. [Google Scholar]
- Kattenborn, T.; Leitloff, J.; Schiefer, F.; Hinz, S. Review on Convolutional Neural Networks (CNN) in vegetation remote sensing. ISPRS J. Photogramm. Remote Sens. 2021, 173, 24–49. [Google Scholar] [CrossRef]
- Xiang, Q.; Wang, X.; Lai, J.; Lei, L.; Song, Y.; He, J.; Li, R. Quadruplet depth-wise separable fusion convolution neural network for ballistic target recognition with limited samples. Expert Syst. Appl. 2024, 235, 121182. [Google Scholar] [CrossRef]
- Sadoughi, M.; Hu, C. Physics-based convolutional neural network for fault diagnosis of rolling element bearings. IEEE Sens. J. 2019, 19, 4181–4192. [Google Scholar] [CrossRef]
- De Bézenac, E.; Pajot, A.; Gallinari, P. Deep learning for physical processes: Incorporating prior scientific knowledge. J. Stat. Mech. Theory Exp. 2019, 2019, 124009. [Google Scholar] [CrossRef]
- Zhang, R.; Liu, Y.; Sun, H. Physics-guided convolutional neural network (PhyCNN) for data-driven seismic response modeling. Eng. Struct. 2020, 215, 110704. [Google Scholar] [CrossRef]
- Cohen, T.; Welling, M. Group equivariant convolutional networks. In Proceedings of the International Conference on Machine Learning, PMLR, New York, NY, USA, 19–24 June 2016; pp. 2990–2999. [Google Scholar]
- Dieleman, S.; De Fauw, J.; Kavukcuoglu, K. Exploiting cyclic symmetry in convolutional neural networks. In Proceedings of the International Conference on Machine Learning, PMLR, New York, NY, USA, 19–24 June 2016; pp. 1889–1898. [Google Scholar]
- Jia, X.; Willard, J.; Karpatne, A.; Read, J.; Zwart, J.; Steinbach, M.; Kumar, V. Physics guided RNNs for modeling dynamical systems: A case study in simulating lake temperature profiles. In Proceedings of the 2019 SIAM International Conference on Data Mining, SIAM, Calgary, AB, Canada, 2–4 May 2019; pp. 558–566. [Google Scholar]
- Worrall, D.E.; Garbin, S.J.; Turmukhambetov, D.; Brostow, G.J. Harmonic networks: Deep translation and rotation equivariance. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 5028–5037. [Google Scholar]
- Sherstinsky, A. Fundamentals of recurrent neural network (RNN) and long short-term memory (LSTM) network. Phys. D Nonlinear Phenom. 2020, 404, 132306. [Google Scholar] [CrossRef]
- Sutskever, I.; Martens, J.; Hinton, G.E. Generating text with recurrent neural networks. In Proceedings of the 28th International Conference on Machine Learning (ICML-11), Bellevue, WA, USA, 28 June–2 July 2011; pp. 1017–1024. [Google Scholar]
- Medsker, L.R.; Jain, L.C. Recurrent Neural Networks: Design and Applications; CRC Press: Boca Raton, FL, USA, 2001. [Google Scholar]
- Pearlmutter, B.A. Learning state space trajectories in recurrent neural networks. In Proceedings of the IEEE International 1989 Joint Conference on Neural Networks, Washington, DC, USA, 18–22 June 1989; pp. 365–372. [Google Scholar]
- Mandic, D.P.; Chambers, J. Recurrent Neural Networks for Prediction: Learning Algorithms, Architectures and Stability; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2001. [Google Scholar]
- Das, S.; Tariq, A.; Santos, T.; Kantareddy, S.S.; Banerjee, I. Recurrent neural networks (RNNs): Architectures, training tricks, and introduction to influential research. In Machine Learning for Brain Disorders; Springer: Berlin/Heidelberg, Germany, 2023; pp. 117–138. [Google Scholar]
- Yu, Y.; Si, X.; Hu, C.; Zhang, J. A review of recurrent neural networks: LSTM cells and network architectures. Neural Comput. 2019, 31, 1235–1270. [Google Scholar] [CrossRef] [PubMed]
- Nascimento, R.G.; Viana, F.A. Fleet prognosis with physics-informed recurrent neural networks. arXiv 2019, arXiv:1901.05512. [Google Scholar]
- Dourado, A.; Viana, F.A. Physics-informed neural networks for corrosion-fatigue prognosis. In Proceedings of the Annual Conference of the PHM Society, Paris, France, 2–5 May 2019; Volume 11. [Google Scholar]
- Dourado, A.D.; Viana, F. Physics-informed neural networks for bias compensation in corrosion-fatigue. In Proceedings of the Aiaa Scitech 2020 Forum, Orlando, FL, USA, 6–10 January 2020; p. 1149. [Google Scholar]
- Long, Y.; She, X.; Mukhopadhyay, S. Hybridnet: Integrating model-based and data-driven learning to predict evolution of dynamical systems. In Proceedings of the Conference on Robot Learning, PMLR, Zürich, Switzerland, 29–31 October 2018; pp. 551–560. [Google Scholar]
- Yu, Y.; Yao, H.; Liu, Y. Structural dynamics simulation using a novel physics-guided machine learning method. Eng. Appl. Artif. Intell. 2020, 96, 103947. [Google Scholar] [CrossRef]
- Lutter, M.; Ritter, C.; Peters, J. Deep lagrangian networks: Using physics as model prior for deep learning. arXiv 2019, arXiv:1907.04490. [Google Scholar]
- Wu, Z.; Pan, S.; Chen, F.; Long, G.; Zhang, C.; Philip, S.Y. A comprehensive survey on graph neural networks. IEEE Trans. Neural Netw. Learn. Syst. 2020, 32, 4–24. [Google Scholar] [CrossRef] [PubMed]
- Zhou, J.; Cui, G.; Hu, S.; Zhang, Z.; Yang, C.; Liu, Z.; Wang, L.; Li, C.; Sun, M. Graph neural networks: A review of methods and applications. AI Open 2020, 1, 57–81. [Google Scholar] [CrossRef]
- Zhou, Y.; Zheng, H.; Huang, X.; Hao, S.; Li, D.; Zhao, J. Graph neural networks: Taxonomy, advances, and trends. ACM Trans. Intell. Syst. Technol. (TIST) 2022, 13, 1–54. [Google Scholar] [CrossRef]
- Veličković, P. Everything is connected: Graph neural networks. Curr. Opin. Struct. Biol. 2023, 79, 102538. [Google Scholar] [CrossRef]
- Ortega, A.; Frossard, P.; Kovačević, J.; Moura, J.M.; Vandergheynst, P. Graph signal processing: Overview, challenges, and applications. Proc. IEEE 2018, 106, 808–828. [Google Scholar] [CrossRef]
- Seo, S.; Meng, C.; Liu, Y. Physics-aware difference graph networks for sparsely-observed dynamics. In Proceedings of the International Conference on Learning Representations, New Orleans, LA, USA, 6–9 May 2019. [Google Scholar]
- Seo, S.; Liu, Y. Differentiable physics-informed graph networks. arXiv 2019, arXiv:1902.02950. [Google Scholar]
- Zhang, G.; He, H.; Katabi, D. Circuit-GNN: Graph neural networks for distributed circuit design. In Proceedings of the International Conference on Machine Learning, PMLR, Long Beach, CA, USA, 9–15 June 2019; pp. 7364–7373. [Google Scholar]
- Mojallal, A.; Lotfifard, S. Multi-physics graphical model-based fault detection and isolation in wind turbines. IEEE Trans. Smart Grid 2017, 9, 5599–5612. [Google Scholar] [CrossRef]
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention is all you need. Adv. Neural Inf. Process. Syst. 2017, 30. Available online: https://api.semanticscholar.org/CorpusID:13756489 (accessed on 18 August 2024).
- Knyazev, B.; Taylor, G.W.; Amer, M. Understanding attention and generalization in graph neural networks. Adv. Neural Inf. Process. Syst. 2019, 32. [Google Scholar]
- McClenny, L.; Braga-Neto, U. Self-adaptive physics-informed neural networks using a soft attention mechanism. arXiv 2020, arXiv:2009.04544. [Google Scholar]
- Rodriguez-Torrado, R.; Ruiz, P.; Cueto-Felgueroso, L.; Green, M.C.; Friesen, T.; Matringe, S.; Togelius, J. Physics-informed attention-based neural network for solving non-linear partial differential equations. arXiv 2021, arXiv:2105.07898. [Google Scholar]
- Rodriguez-Torrado, R.; Ruiz, P.; Cueto-Felgueroso, L.; Green, M.C.; Friesen, T.; Matringe, S.; Togelius, J. Physics-informed attention-based neural network for hyperbolic partial differential equations: Application to the Buckley–Leverett problem. Sci. Rep. 2022, 12, 7557. [Google Scholar] [CrossRef]
- Jeddi, A.B.; Shafieezadeh, A. A physics-informed graph attention-based approach for power flow analysis. In Proceedings of the 2021 20th IEEE International Conference on Machine Learning and Applications (ICMLA), Pasadena, CA, USA, 13–16 December 2021; pp. 1634–1640. [Google Scholar]
- Altaheri, H.; Muhammad, G.; Alsulaiman, M. Physics-informed attention temporal convolutional network for EEG-based motor imagery classification. IEEE Trans. Ind. Inform. 2022, 19, 2249–2258. [Google Scholar] [CrossRef]
- Che, T.; Liu, X.; Li, S.; Ge, Y.; Zhang, R.; Xiong, C.; Bengio, Y. Deep verifier networks: Verification of deep discriminative models with deep generative models. In Proceedings of the AAAI Conference on Artificial Intelligence, Virtually, 2–9 February 2021; Volume 35, pp. 7002–7010. [Google Scholar]
- Salakhutdinov, R. Learning deep generative models. Annu. Rev. Stat. Its Appl. 2015, 2, 361–385. [Google Scholar] [CrossRef]
- Kingma, D.P.; Welling, M. Auto-encoding variational bayes. arXiv 2013, arXiv:1312.6114. [Google Scholar]
- Goodfellow, I.; Pouget-Abadie, J.; Mirza, M.; Xu, B.; Warde-Farley, D.; Ozair, S.; Courville, A.; Bengio, Y. Generative adversarial nets. Adv. Neural Inf. Process. Syst. 2014, 27. [Google Scholar]
- Warner, J.E.; Cuevas, J.; Bomarito, G.F.; Leser, P.E.; Leser, W.P. Inverse estimation of elastic modulus using physics-informed generative adversarial networks. arXiv 2020, arXiv:2006.05791. [Google Scholar]
- Yang, L.; Zhang, D.; Karniadakis, G.E. Physics-informed generative adversarial networks for stochastic differential equations. SIAM J. Sci. Comput. 2020, 42, A292–A317. [Google Scholar] [CrossRef]
- Gulrajani, I.; Ahmed, F.; Arjovsky, M.; Dumoulin, V.; Courville, A.C. Improved training of wasserstein gans. Adv. Neural Inf. Process. Syst. 2017, 30. [Google Scholar]
- Yang, Y.; Perdikaris, P. Physics-informed deep generative models. arXiv 2018, arXiv:1812.03511. [Google Scholar]
- Hammoud, M.A.E.R.; Titi, E.S.; Hoteit, I.; Knio, O. CDAnet: A Physics-Informed Deep Neural Network for Downscaling Fluid Flows. J. Adv. Model. Earth Syst. 2022, 14, e2022MS003051. [Google Scholar] [CrossRef]
- Psaros, A.F.; Kawaguchi, K.; Karniadakis, G.E. Meta-learning PINN loss functions. J. Comput. Phys. 2022, 458, 111121. [Google Scholar] [CrossRef]
- Soto, Á.M.; Cervantes, A.; Soler, M. Physics-informed neural networks for high-resolution weather reconstruction from sparse weather stations. Open Res. Eur. 2024, 4, 99. [Google Scholar] [CrossRef]
- Davini, D.; Samineni, B.; Thomas, B.; Tran, A.H.; Zhu, C.; Ha, K.; Dasika, G.; White, L. Using physics-informed regularization to improve extrapolation capabilities of neural networks. In Proceedings of the Fourth Workshop on Machine Learning and the Physical Sciences (NeurIPS 2021), Vancouver, BC, Canada, 13 December 2021. [Google Scholar]
- He, Q.; Barajas-Solano, D.; Tartakovsky, G.; Tartakovsky, A.M. Physics-informed neural networks for multiphysics data assimilation with application to subsurface transport. Adv. Water Resour. 2020, 141, 103610. [Google Scholar] [CrossRef]
- Owhadi, H. Bayesian numerical homogenization. Multiscale Model. Simul. 2015, 13, 812–828. [Google Scholar] [CrossRef]
- Raissi, M.; Yazdani, A.; Karniadakis, G.E. Hidden fluid mechanics: Learning velocity and pressure fields from flow visualizations. Science 2020, 367, 1026–1030. [Google Scholar] [CrossRef]
- Meng, X.; Karniadakis, G.E. A composite neural network that learns from multi-fidelity data: Application to function approximation and inverse PDE problems. J. Comput. Phys. 2020, 401, 109020. [Google Scholar] [CrossRef]
- Yang, L.; Meng, X.; Karniadakis, G.E. B-PINNs: Bayesian physics-informed neural networks for forward and inverse PDE problems with noisy data. J. Comput. Phys. 2021, 425, 109913. [Google Scholar] [CrossRef]
- Neal, R.M. Bayesian Learning for Neural Networks; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2012; Volume 118. [Google Scholar]
- Neal, R.M. MCMC using Hamiltonian dynamics. Handb. Markov Chain Monte Carlo 2011, 2, 2. [Google Scholar]
- Graves, A. Practical variational inference for neural networks. Adv. Neural Inf. Process. Syst. 2011, 24. [Google Scholar]
- Rezende, D.; Mohamed, S. Variational inference with normalizing flows. In Proceedings of the International Conference on Machine Learning, PMLR, Lille, France, 7–9 July 2015; pp. 1530–1538. [Google Scholar]
- Yucesan, Y.A.; Viana, F.A. A physics-informed neural network for wind turbine main bearing fatigue. Int. J. Progn. Health Manag. 2020, 11. [Google Scholar] [CrossRef]
- Fedorov, A.; Perechodjuk, A.; Linke, D. Kinetics-constrained neural ordinary differential equations: Artificial neural network models tailored for small data to boost kinetic model development. Chem. Eng. J. 2023, 477, 146869.
- Pang, H.; Wu, L.; Liu, J.; Liu, X.; Liu, K. Physics-informed neural network approach for heat generation rate estimation of lithium-ion battery under various driving conditions. J. Energy Chem. 2023, 78, 1–12.
- Wang, Y.; Xiong, C.; Wang, Y.; Xu, P.; Ju, C.; Shi, J.; Yang, G.; Chu, J. Temperature state prediction for lithium-ion batteries based on improved physics informed neural networks. J. Energy Storage 2023, 73, 108863.
- Zhang, Z.; Zou, Z.; Kuhl, E.; Karniadakis, G.E. Discovering a reaction–diffusion model for Alzheimer’s disease by combining PINNs with symbolic regression. Comput. Methods Appl. Mech. Eng. 2024, 419, 116647.
- Xiang, Z.; Peng, W.; Zhou, W.; Yao, W. Hybrid finite difference with the physics-informed neural network for solving PDE in complex geometries. arXiv 2022, arXiv:2202.07926.
- Hou, J.; Li, Y.; Ying, S. Enhancing PINNs for solving PDEs via adaptive collocation point movement and adaptive loss weighting. Nonlinear Dyn. 2023, 111, 15233–15261.
- Sun, J.; Liu, Y.; Wang, Y.; Yao, Z.; Zheng, X. BINN: A deep learning approach for computational mechanics problems based on boundary integral equations. Comput. Methods Appl. Mech. Eng. 2023, 410, 116012.
- Gao, H.; Zahr, M.J.; Wang, J.X. Physics-informed graph neural Galerkin networks: A unified framework for solving PDE-governed forward and inverse problems. Comput. Methods Appl. Mech. Eng. 2022, 390, 114502.
- Chen, H.; Wu, R.; Grinspun, E.; Zheng, C.; Chen, P.Y. Implicit neural spatial representations for time-dependent PDEs. In Proceedings of the International Conference on Machine Learning, PMLR, Honolulu, HI, USA, 23–29 July 2023; pp. 5162–5177.
- Lu, L.; Jin, P.; Karniadakis, G.E. DeepONet: Learning nonlinear operators for identifying differential equations based on the universal approximation theorem of operators. arXiv 2019, arXiv:1910.03193.
- Pun, G.P.; Batra, R.; Ramprasad, R.; Mishin, Y. Physically informed artificial neural networks for atomistic modeling of materials. Nat. Commun. 2019, 10, 2339.
- Mishin, Y. Machine-learning interatomic potentials for materials science. Acta Mater. 2021, 214, 116980.
- Sagingalieva, A.; Kordzanganeh, M.; Kenbayev, N.; Kosichkina, D.; Tomashuk, T.; Melnikov, A. Hybrid quantum neural network for drug response prediction. Cancers 2023, 15, 2705.
- Shin, J.; Piao, Y.; Bang, D.; Kim, S.; Jo, K. DRPreter: Interpretable anticancer drug response prediction using knowledge-guided graph neural networks and transformer. Int. J. Mol. Sci. 2022, 23, 13919.
- Zhang, E.; Dao, M.; Karniadakis, G.E.; Suresh, S. Analyses of internal structures and defects in materials using physics-informed neural networks. Sci. Adv. 2022, 8, eabk0644.
- Bolandi, H.; Sreekumar, G.; Li, X.; Lajnef, N.; Boddeti, V.N. Physics informed neural network for dynamic stress prediction. Appl. Intell. 2023, 53, 26313–26328.
- Zhu, Q.; Liu, Z.; Yan, J. Machine learning for metal additive manufacturing: Predicting temperature and melt pool fluid dynamics using physics-informed neural networks. Comput. Mech. 2021, 67, 619–635.
- Chaffart, D.; Yuan, Y.; Ricardez-Sandoval, L.A. Multiscale Physics-Informed Neural Network Framework to Capture Stochastic Thin-Film Deposition. J. Phys. Chem. C 2024, 128, 3733–3750.
- Desai, S.; Mattheakis, M.; Joy, H.; Protopapas, P.; Roberts, S. One-shot transfer learning of physics-informed neural networks. arXiv 2021, arXiv:2110.11286.
- Jin, H.; Mattheakis, M.; Protopapas, P. Physics-informed neural networks for quantum eigenvalue problems. In Proceedings of the IEEE 2022 International Joint Conference on Neural Networks (IJCNN), Padua, Italy, 18–23 July 2022; pp. 1–8.
- Meray, A.; Wang, L.; Kurihana, T.; Mastilovic, I.; Praveen, S.; Xu, Z.; Memarzadeh, M.; Lavin, A.; Wainwright, H. Physics-informed surrogate modeling for supporting climate resilience at groundwater contamination sites. Comput. Geosci. 2024, 183, 105508.
- Waheed, U.B.; Alkhalifah, T.; Haghighat, E.; Song, C.; Virieux, J. PINNtomo: Seismic tomography using physics-informed neural networks. arXiv 2021, arXiv:2104.01588.
- Lakshminarayana, S.; Sthapit, S.; Maple, C. Application of physics-informed machine learning techniques for power grid parameter estimation. Sustainability 2022, 14, 2051.
- Rodrigues, J.A. Using Physics-Informed Neural Networks (PINNs) for Tumor Cell Growth Modeling. Mathematics 2024, 12, 1195.
- Raeisi, E.; Yavuz, M.; Khosravifarsani, M.; Fadaei, Y. Mathematical modeling of interactions between colon cancer and immune system with a deep learning algorithm. Eur. Phys. J. Plus 2024, 139, 1–16.
- Arzani, A.; Wang, J.X.; Sacks, M.S.; Shadden, S.C. Machine learning for cardiovascular biomechanics modeling: Challenges and beyond. Ann. Biomed. Eng. 2022, 50, 615–627.
- Perez-Raya, I.; Kandlikar, S.G. Thermal modeling of patient-specific breast cancer with physics-based artificial intelligence. ASME J. Heat Mass Transf. 2023, 145, 031201.
- Semeraro, C.; Lezoche, M.; Panetto, H.; Dassisti, M. Digital twin paradigm: A systematic literature review. Comput. Ind. 2021, 130, 103469.
- Emmert-Streib, F. Defining a digital twin: A data science-based unification. Mach. Learn. Knowl. Extr. 2023, 5, 1036–1054.
| Application | Reference | FS | PE | ODEs | PDEs | GEs |
|---|---|---|---|---|---|---|
| Fluid Dynamics | [7] | ✓ | ✓ | – | ✓ | – |
| | [43] | ✓ | ✓ | – | ✓ | – |
| | [42] | ✓ | – | – | ✓ | – |
| | [114] | – | ✓ | – | ✓ | – |
| Material Science | [132] | ✓ | – | – | – | ✓ |
| | [133] | ✓ | ✓ | – | – | ✓ |
| | [136] | – | ✓ | – | ✓ | – |
| Structural Systems | [137] | ✓ | ✓ | – | ✓ | – |
| | [43] | ✓ | ✓ | – | ✓ | – |
| | [138] | ✓ | – | – | ✓ | – |
| | [139] | – | ✓ | – | ✓ | – |
| Quantum Mechanics | [7] | ✓ | ✓ | – | ✓ | – |
| | [140] | ✓ | – | ✓ | ✓ | – |
| | [141] | ✓ | – | – | ✓ | – |
| Geophysics | [142] | ✓ | – | – | ✓ | – |
| | [143] | – | ✓ | – | ✓ | – |
| Energy Systems | [121] | ✓ | – | – | – | ✓ |
| | [123] | – | ✓ | – | ✓ | – |
| | [144] | – | ✓ | ✓ | – | – |
| Oncology | [145] | ✓ | – | ✓ | – | – |
| | [134] | ✓ | – | – | – | ✓ |
| | [135] | ✓ | – | – | – | ✓ |
| | [146] | – | ✓ | ✓ | – | – |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Farea, A.; Yli-Harja, O.; Emmert-Streib, F. Understanding Physics-Informed Neural Networks: Techniques, Applications, Trends, and Challenges. AI 2024, 5, 1534-1557. https://doi.org/10.3390/ai5030074