Curriculum learning for data-driven modeling of dynamical systems

  • Regular Article - Flowing Matter
  • Published in The European Physical Journal E (2023)

Abstract

The reliable prediction of the temporal behavior of complex systems is key in numerous scientific fields. This strong interest is, however, hindered by modeling issues: often, the governing equations describing the physics of the system under consideration are not accessible or, when known, their solution might require a computational time incompatible with the prediction time constraints. Not surprisingly, approximating complex systems with a generic functional form and informing it ex nihilo from available observations has become common practice in the age of machine learning, as illustrated by the numerous successful examples based on deep neural networks. However, the generalizability of the models, margins of guarantee and the impact of data are often overlooked or examined mainly by relying on prior knowledge of the physics. We tackle these issues from a different viewpoint, by adopting a curriculum learning strategy. In curriculum learning, the dataset is structured such that the training process starts from simple samples and moves toward more complex ones, in order to favor convergence and generalization. The concept has been developed and successfully applied in robotics and in the control of systems. Here, we apply this concept to the learning of complex dynamical systems in a systematic way. First, leveraging insights from ergodic theory, we assess the amount of data sufficient to guarantee a priori a faithful model of the physical system, and we thoroughly investigate the impact of the training set and its structure on the quality of long-term predictions. Building on this, we consider entropy as a metric of the complexity of the dataset; we show how an informed design of the training set based on an entropy analysis significantly improves the resulting models in terms of generalizability, and we provide insights into the amount and choice of data required for effective data-driven modeling.
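
To make the entropy-guided curriculum concrete, the sketch below orders candidate training segments by approximate entropy (Pincus, 1991) so that a model is exposed to simple, low-entropy samples before complex ones. This is a minimal illustration under stated assumptions, not the authors' implementation: the paper does not prescribe this exact metric or interface, the names segments and curriculum_schedule are hypothetical, and only NumPy is assumed.

    import numpy as np

    def approximate_entropy(x, m=2, r_factor=0.2):
        # Approximate entropy (ApEn) of a 1-D series, following Pincus (1991):
        # ApEn = phi(m) - phi(m + 1), with the tolerance r set relative to the
        # signal's standard deviation. O(N^2) memory -- fine for short segments.
        x = np.asarray(x, dtype=float)
        r = r_factor * np.std(x)

        def phi(mm):
            # All length-mm windows of the series.
            w = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
            # Chebyshev distance between every pair of windows.
            d = np.max(np.abs(w[:, None, :] - w[None, :, :]), axis=2)
            # Fraction of windows within tolerance of each window
            # (self-matches included, so the log argument is never zero).
            return np.mean(np.log(np.mean(d <= r, axis=1)))

        return phi(m) - phi(m + 1)

    def curriculum_schedule(segments):
        # Order training segments from low to high complexity, so that
        # training starts on the "easy" end of the curriculum.
        return sorted(segments, key=approximate_entropy)

    # Toy check: a clean sine should rank before its noisy counterpart.
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 8.0 * np.pi, 400)
    segments = [np.sin(t) + 0.5 * rng.standard_normal(t.size), np.sin(t)]
    ordered = curriculum_schedule(segments)
    print([round(approximate_entropy(s), 3) for s in ordered])

Any complexity proxy with the same ordering role (for instance, a different entropy estimator) could be substituted for approximate_entropy; the essential ingredient, as the abstract describes, is that the training schedule is driven by a measured complexity of the data rather than by the order in which the data were collected.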




Data availability statement

Data and trained models used for this article will be made available upon reasonable request.


Acknowledgements

This work was funded by the French Agence Nationale de la Recherche via the Flowcon project (ANR-17-ASTR-0022) and the Speed project (ANR-20-CE23-0025-01). L.M. gratefully acknowledges stimulating discussions with Alex Gorodetsky (University of Michigan, US). O.S. thanks Luca de Cicco (Politecnico di Bari, Italy) for exchanges on the role of entropy metrics in curriculum learning. S.C. gratefully acknowledges fruitful discussions with Angelo Vulpiani (University La Sapienza, Italy).

Author information

Authors and Affiliations

Authors

Contributions

MAB contributed to conceptualization, data curation, investigation, methodology, writing. OS contributed to conceptualization, investigation, methodology, writing. AA contributed to methodology, funding acquisition, writing (review and editing). SC contributed to methodology, funding acquisition, writing. LM contributed to investigation, methodology, funding acquisition, writing.

Corresponding author

Correspondence to Onofrio Semeraro.

Additional information

This article is part of the topical issue "Quantitative AI in Complex Fluids and Complex Flows: Challenges and Benchmarks". Guest editors: Luca Biferale, Michele Buzzicotti, Massimo Cencini.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Bucci, M.A., Semeraro, O., Allauzen, A. et al. Curriculum learning for data-driven modeling of dynamical systems. Eur. Phys. J. E 46, 12 (2023). https://doi.org/10.1140/epje/s10189-023-00269-8
