A Framework for Four-Dimensional Variational Data Assimilation Based on Machine Learning
Figure 1. 4D-Var assimilation in NWP.
Figure 2. The architecture of the BNN forecast model.
Figure 3. Schematic diagram of ML-4DVAR.
Figure 4. Comparison of the simulation performance of the BNN and the CNN.
Figure 5. The temporal and spatial distribution of the output of the BNN and of the Lorenz-96 model under the same initial conditions: (a) the output of the BNN, (b) the output of the Lorenz-96 model, and (c) the difference between the two.
Figure 6. RMSE under different values of α.
Figure 7. RMSE, R², and NSE of Joint-4DVAR and Original-4DVAR: (a) RMSE, (b) R², (c) NSE.
Figure 8. RMSE, R², and NSE of ML-4DVAR and Original-4DVAR at analysis time: (a) RMSE, (b) R², (c) NSE.
Figure 9. (a) RMSE, (b) R², and (c) NSE of ML_O-4DVAR and Original-4DVAR.
Abstract
1. Introduction
2. Methods
2.1. Related Knowledge
2.2. Problem Statement
2.3. The Architecture of ML-4DVAR
- ① At the start time i of the assimilation window, the previous forecast is taken as the initial field. From this initial field, the NN model forecasts forward to the end time i + 1 of the assimilation window. The forecast obtained in this step is called the background forecast.
- ② The cost function is computed. It is the sum, over the assimilation window, of the differences between the model observation equivalents and the observations, where the model observation equivalents are obtained by applying the observation operator to the background forecast.
- ③ The gradient of the cost function with respect to the control variable is calculated; this calculation requires the tangent linear and adjoint models of the NN.
- ④ An appropriate optimization algorithm is used to estimate the correction to the state variable.
- ⑤ Return to ① and start the next optimization iteration; the loop continues until the accuracy requirement is met, at which point the analysis is output.
- ⑥ The forecast field at time i + 1 is computed, taking the analysis obtained in ⑤ as the initial field and the NN as the forecast model; then the next analysis cycle begins. A minimal code sketch of this cycle follows the list.
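The cycle ①–⑥ can be condensed into a short sketch. The code below is a minimal illustration, not the authors' implementation: it assumes a differentiable PyTorch surrogate, and the names (`nn_model`, `obs_op`, `B_inv`, `R_inv`) are hypothetical placeholders. A standard background term is included for completeness even though step ② lists only the observation misfit, and L-BFGS stands in for whichever optimization algorithm is actually used in step ④. The adjoint model required in step ③ is supplied by automatic differentiation.

```python
import torch

def ml4dvar_cycle(x_b, obs, obs_op, nn_model, B_inv, R_inv, window=10):
    """One illustrative ML-4DVAR analysis cycle (steps ①-⑤).

    x_b          : background state at the window start (1-D tensor)
    obs          : dict mapping a time index within the window to an observation tensor
    obs_op       : observation operator H (callable)
    nn_model     : neural-network surrogate advancing the state by one step
    B_inv, R_inv : inverse background / observation error covariance matrices
    """
    x0 = x_b.clone().requires_grad_(True)             # control variable
    optimizer = torch.optim.LBFGS([x0], max_iter=50)  # steps ④-⑤: optimization loop

    def closure():
        optimizer.zero_grad()
        j = 0.5 * (x0 - x_b) @ B_inv @ (x0 - x_b)     # background term (assumption)
        x = x0
        for t in range(window + 1):                   # step ①: background forecast
            if t in obs:                              # step ②: misfit to observations
                d = obs_op(x) - obs[t]
                j = j + 0.5 * d @ R_inv @ d
            x = nn_model(x)                           # NN advances the state
        j.backward()                                  # step ③: gradient via the NN adjoint (autodiff)
        return j

    optimizer.step(closure)                           # iterate until convergence
    return x0.detach()                                # analysis at the window start
```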
3. Experiments and Results
3.1. Lorenz-96 Model
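Section 3.1 uses the Lorenz-96 system [36], whose standard form is dX_j/dt = (X_{j+1} − X_{j−2}) X_{j−1} − X_j + F with cyclic indices. The sketch below integrates this conventional formulation with a classical fourth-order Runge–Kutta scheme; the dimension N = 40, forcing F = 8, and step size dt = 0.05 are the customary choices and are not taken from the paper's specific configuration.

```python
import numpy as np

def lorenz96_tendency(x, forcing=8.0):
    """Standard Lorenz-96 tendency: dx_j/dt = (x_{j+1} - x_{j-2}) x_{j-1} - x_j + F."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def rk4_step(x, dt=0.05, forcing=8.0):
    """Advance the state one step with the classical fourth-order Runge-Kutta scheme."""
    k1 = lorenz96_tendency(x, forcing)
    k2 = lorenz96_tendency(x + 0.5 * dt * k1, forcing)
    k3 = lorenz96_tendency(x + 0.5 * dt * k2, forcing)
    k4 = lorenz96_tendency(x + dt * k3, forcing)
    return x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

# Example: spin up a 40-variable state from a small perturbation of the equilibrium.
state = 8.0 * np.ones(40)
state[0] += 0.01
for _ in range(1000):
    state = rk4_step(state)
```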
3.2. Performance of the Neural Network Forecast Model
3.3. The Cost Function Settings
3.4. Evaluation
- The root mean square error (RMSE) is the square root of the mean squared difference between the two datasets [42]. RMSE measures the total error between the two datasets, which consists of a systematic part and an unsystematic part. Its value range is [0, +∞); the closer the RMSE is to 0, the smaller the difference between the two datasets. The definition of RMSE is given in Equation (10).
- The coefficient of determination (R²) is a statistic that measures the goodness of fit [43]. It is computed from the ratio of the covariance of the two datasets to the product of their standard deviations. Its value range is [0, 1]; the closer R² is to 1, the stronger the correlation between the two datasets. The definition of R² is given in Equation (11).
- The Nash–Sutcliffe model efficiency (NSE) is often employed to quantify the prediction accuracy of simulation models (such as hydrological models) and expresses the accuracy of the model output [44]. NSE is one minus the ratio of the mean squared error between the target dataset and the standard dataset to the variance of the standard dataset. Its value range is (−∞, 1]; the closer the NSE is to 1, the better the predictive ability of the model and the higher the consistency between the target dataset and the standard dataset. The definition of NSE is given in Equation (12). The conventional forms of these three metrics are sketched after this list for reference.
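The expressions below are the conventional forms of these metrics; the notation is generic and the paper's Equations (10)–(12) may differ in detail. Here y_i denotes the target dataset, x_i the standard (reference) dataset, x̄ and ȳ their means, and N the number of samples.

```latex
\begin{align*}
\mathrm{RMSE} &= \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(y_i - x_i\right)^2},\\
\mathrm{R}^2  &= \left(\frac{\sum_{i=1}^{N}\left(x_i-\bar{x}\right)\left(y_i-\bar{y}\right)}
                        {\sqrt{\sum_{i=1}^{N}\left(x_i-\bar{x}\right)^2}\,
                         \sqrt{\sum_{i=1}^{N}\left(y_i-\bar{y}\right)^2}}\right)^{2},\\
\mathrm{NSE}  &= 1-\frac{\sum_{i=1}^{N}\left(y_i - x_i\right)^2}
                         {\sum_{i=1}^{N}\left(x_i-\bar{x}\right)^2}.
\end{align*}
```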
3.5. 4D-Var Experiments
3.5.1. The Joint-4DVAR
3.5.2. The ML-4DVAR
3.5.3. The ML_O-4DVAR
4. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Bauer, P.; Thorpe, A.; Brunet, G. The quiet revolution of numerical weather prediction. Nature 2015, 525, 47–55.
- Haltiner, G.J.; Williams, R.T. Numerical Prediction and Dynamic Meteorology; Technical Report; Wiley: Hoboken, NJ, USA, 1980.
- Bjerknes, V. Das Problem der Wettervorhersage, betrachtet vom Standpunkte der Mechanik und der Physik. Meteor. Z. 1904, 21, 1–7. (Translated by Y. Mintz: The Problem of Weather Forecasting as a Problem in Mechanics and Physics; Los Angeles, 1954.)
- Kalnay, E. Atmospheric Modeling, Data Assimilation and Predictability; Cambridge University Press: Cambridge, UK, 2003.
- Rabier, F.; Järvinen, H.; Klinker, E.; Mahfouf, J.F.; Simmons, A. The ECMWF operational implementation of four-dimensional variational assimilation. I: Experimental results with simplified physics. Q. J. R. Meteorol. Soc. 2000, 126, 1143–1170.
- Gauthier, P.; Thepaut, J.N. Impact of the digital filter as a weak constraint in the preoperational 4DVAR assimilation system of Météo-France. Mon. Weather Rev. 2001, 129, 2089–2102.
- Rawlins, F.; Ballard, S.; Bovis, K.; Clayton, A.; Li, D.; Inverarity, G.; Lorenc, A.; Payne, T. The Met Office global four-dimensional variational data assimilation scheme. Q. J. R. Meteorol. Soc. 2007, 133, 347–362.
- Gauthier, P.; Tanguay, M.; Laroche, S.; Pellerin, S.; Morneau, J. Extension of 3DVAR to 4DVAR: Implementation of 4DVAR at the Meteorological Service of Canada. Mon. Weather Rev. 2007, 135, 2339–2354.
- Honda, Y.; Nishijima, M.; Koizumi, K.; Ohta, Y.; Tamiya, K.; Kawabata, T.; Tsuyuki, T. A pre-operational variational data assimilation system for a non-hydrostatic model at the Japan Meteorological Agency: Formulation and preliminary results. Q. J. R. Meteorol. Soc. 2005, 131, 3465–3475.
- Shen, X.; Wang, J.; Li, Z.; Chen, D.; Gong, J. Research and Operational Development of Numerical Weather Prediction in China. J. Meteorol. Res. 2020, 34, 675–698.
- Courtier, P.; Thépaut, J.N.; Hollingsworth, A. A strategy for operational implementation of 4D-Var, using an incremental approach. Q. J. R. Meteorol. Soc. 1994, 120, 1367–1387.
- Jiandong, G. Data Assimilation: A Key Technology for NWP—Technical Review of Data Assimilation in ECMWF. Adv. Meteorol. Sci. Technol. 2013, 3, 6–13.
- Bannister, R. A review of operational methods of variational and ensemble-variational data assimilation. Q. J. R. Meteorol. Soc. 2017, 143, 607–633.
- Houtekamer, P.L.; Zhang, F. Review of the ensemble Kalman filter for atmospheric data assimilation. Mon. Weather Rev. 2016, 144, 4489–4532.
- Geer, A. Learning earth system models from observations: Machine learning or data assimilation? Philos. Trans. R. Soc. A 2021, 379, 20200089.
- Dueben, P.; Modigliani, U.; Geer, A.; Siemen, S.; Pappenberger, F.; Bauer, P.; Brown, A.; Palkovič, M.; Raoult, B.; Wedi, N.; et al. Technical Memo; European Centre for Medium-Range Weather Forecasts: Reading, UK, 2021.
- Schultz, M.; Betancourt, C.; Gong, B.; Kleinert, F.; Langguth, M.; Leufen, L.; Mozaffari, A.; Stadtler, S. Can deep learning beat numerical weather prediction? Philos. Trans. R. Soc. A 2021, 379, 20200097.
- Reichstein, M.; Camps-Valls, G.; Stevens, B.; Jung, M.; Denzler, J.; Carvalhais, N. Deep learning and process understanding for data-driven Earth system science. Nature 2019, 566, 195–204.
- Camps-Valls, G.; Reichstein, M.; Xiaoxiang, Z.; Tuia, D. Deep Learning for Earth Sciences; Wiley: Hoboken, NJ, USA, 2021.
- Weyn, J.A.; Durran, D.R.; Caruana, R. Improving data-driven global weather prediction using deep convolutional neural networks on a cubed sphere. J. Adv. Model. Earth Syst. 2020, 12, e2020MS002109.
- Brenowitz, N.D.; Bretherton, C.S. Prognostic validation of a neural network unified physics parameterization. Geophys. Res. Lett. 2018, 45, 6289–6298.
- Rasp, S.; Pritchard, M.S.; Gentine, P. Deep learning to represent subgrid processes in climate models. Proc. Natl. Acad. Sci. USA 2018, 115, 9684–9689.
- Yuval, J.; O’Gorman, P.A.; Hill, C.N. Use of neural networks for stable, accurate and physically consistent parameterization of subgrid atmospheric processes with good performance at reduced precision. Geophys. Res. Lett. 2021, 48, e2020GL091363.
- Song, H.J.; Roh, S. Improved weather forecasting using neural network emulation for radiation parameterization. J. Adv. Model. Earth Syst. 2021, 13, e2021MS002609.
- Krasnopolsky, V. Using machine learning for model physics: An overview. arXiv 2020, arXiv:2002.00416.
- Chantry, M.; Hatfield, S.; Dueben, P.; Polichtchouk, I.; Palmer, T. Machine learning emulation of gravity wave drag in numerical weather forecasting. J. Adv. Model. Earth Syst. 2021, 13, e2021MS002477.
- Bonavita, M.; Laloyaux, P. Machine learning for model error inference and correction. J. Adv. Model. Earth Syst. 2020, 12, e2020MS002232.
- Hatfield, S.; Chantry, M.; Dueben, P.; Lopez, P.; Geer, A.; Palmer, T. Building Tangent-Linear and Adjoint Models for Data Assimilation With Neural Networks. J. Adv. Model. Earth Syst. 2021, 13, e2021MS002521.
- Nonnenmacher, M.; Greenberg, D.S. Deep Emulators for Differentiation, Forecasting and Parametrization in Earth Science Simulators. J. Adv. Model. Earth Syst. 2021, 13, e2021MS002554.
- Vapnik, V.N. Complete statistical theory of learning. Autom. Remote Control 2019, 80, 1949–1975.
- Nielsen, M. A Visual Proof That Neural Nets Can Compute Any Function. 2016. Available online: http://neuralnetworksanddeeplearning.com/chap4.html (accessed on 17 January 2022).
- LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444.
- Fablet, R.; Ouala, S.; Herzet, C. Bilinear residual neural network for the identification and forecasting of geophysical dynamics. In Proceedings of the 2018 IEEE 26th European Signal Processing Conference (EUSIPCO), Rome, Italy, 3–7 September 2018; pp. 1477–1481.
- Gentine, P.; Pritchard, M.; Rasp, S.; Reinaudi, G.; Yacalis, G. Could machine learning break the convection parameterization deadlock? Geophys. Res. Lett. 2018, 45, 5742–5751.
- Lewis, J.M.; Lakshmivarahan, S.; Dhall, S. Dynamic Data Assimilation: A Least Squares Approach; Cambridge University Press: Cambridge, UK, 2006; Volume 13.
- Lorenz, E.N.; Emanuel, K.A. Optimal sites for supplementary weather observations: Simulation with a small model. J. Atmos. Sci. 1998, 55, 399–414.
- Lorenz, E.N. Predictability: A problem partly solved. In Proceedings of the Seminar on Predictability, Reading, UK, 4–8 September 1995; Volume 1.
- Anderson, J.L. An ensemble adjustment Kalman filter for data assimilation. Mon. Weather Rev. 2001, 129, 2884–2903.
- Kotsuki, S.; Greybush, S.J.; Miyoshi, T. Can we optimize the assimilation order in the serial ensemble Kalman filter? A study with the Lorenz-96 model. Mon. Weather Rev. 2017, 145, 4977–4995.
- Nishizawa, S. 4D-Var data assimilation using an adjoint model of a neural network surrogate model. Earth Space Sci. Open Arch. ESSOAr 2021.
- Ji, L.; Senay, G.B.; Verdin, J.P. Evaluation of the Global Land Data Assimilation System (GLDAS) air temperature data products. J. Hydrometeorol. 2015, 16, 2463–2480.
- Hyndman, R.J.; Koehler, A.B. Another look at measures of forecast accuracy. Int. J. Forecast. 2006, 22, 679–688.
- Peng, K.; Cao, X.; Liu, B.; Guo, Y.; Xiao, C.; Tian, W. Polar Vortex Multi-Day Intensity Prediction Relying on New Deep Learning Model: A Combined Convolution Neural Network with Long Short-Term Memory Based on Gaussian Smoothing Method. Entropy 2021, 23, 1314.
- Nash, J.E.; Sutcliffe, J.V. River flow forecasting through conceptual models part I—A discussion of principles. J. Hydrol. 1970, 10, 282–290.
| Method | RMSE | R² | NSE |
|---|---|---|---|
| Joint-4DVAR | 0.171965 | 0.997868 | 0.997706 |
| Original-4DVAR | 0.306383 | 0.993088 | 0.992716 |

| Method | RMSE | R² | NSE |
|---|---|---|---|
| Joint-4DVAR | 0.181307 | 0.997697 | 0.997452 |
| Original-4DVAR | 0.300698 | 0.993383 | 0.992985 |

| Method | Time (Unit: s) |
|---|---|
| Joint-4DVAR | 248.116567 |
| Original-4DVAR | 727.291506 |

| Method | RMSE | R² | NSE | Time (Unit: s) |
|---|---|---|---|---|
| ML-4DVAR | 0.169947 | 0.997871 | 0.997760 | 158.181050 |
| Original-4DVAR | 0.306383 | 0.993088 | 0.992716 | 727.291506 |
| ML-4DVAR | 0.175781 | 0.997716 | 0.997605 | 158.181050 |
| Original-4DVAR | 0.300698 | 0.993383 | 0.992985 | 727.291506 |

| Method | RMSE | R² | NSE | Time (Unit: s) |
|---|---|---|---|---|
| ML-4DVAR | 0.169947 | 0.997871 | 0.997760 | 158.181050 |
| Joint-4DVAR | 0.171965 | 0.997868 | 0.997706 | 248.116567 |
| ML-4DVAR | 0.175781 | 0.997716 | 0.997605 | 158.181050 |
| Joint-4DVAR | 0.181307 | 0.997697 | 0.997452 | 248.116567 |

| Method | RMSE | R² | NSE | Time (Unit: s) |
|---|---|---|---|---|
| ML_O-4DVAR | 0.213248 | 0.996640 | 0.996481 | 173.746833 |
| Original-4DVAR | 0.306383 | 0.993088 | 0.992716 | 727.291506 |
| ML_O-4DVAR | 0.285967 | 0.993881 | 0.993653 | 173.746833 |
| Original-4DVAR | 0.300698 | 0.993383 | 0.992985 | 727.291506 |

| Method | RMSE | R² | NSE | Time (Unit: s) |
|---|---|---|---|---|
| ML_O-4DVAR | 0.213248 | 0.996640 | 0.996481 | 173.746833 |
| ML-4DVAR | 0.169947 | 0.997871 | 0.997760 | 158.181050 |
| ML_O-4DVAR | 0.285967 | 0.993881 | 0.993653 | 173.746833 |
| ML-4DVAR | 0.175781 | 0.997716 | 0.997605 | 158.181050 |