This repository contains the training algorithm and the MOM6 ocean model with the implemented ANN parameterization. We also include figure-plotting notebooks for the paper "Generalizable neural-network parameterization of mesoscale eddies in idealized and global ocean models" by Pavel Perezhogin, Laure Zanna, and Alistair Adcroft, to be submitted soon.
Example of an online simulation with the proposed ANN parameterization in the idealized NeverWorld2 configuration of the MOM6 ocean model (full movie, code):
Online simulation in the global ocean model OM4 (full movie, code):
The folder notebooks contains Jupyter notebooks for each figure plotted in the paper.
Training data as well as offline and online skill metrics can be found on Zenodo.
git clone --recursive git@github.com:m2lines/ANN-momentum-mesoscale.git
- The submodule src/MOM6 contains the ocean model source code with the implemented ANN parameterization. No additional software is required. See src/README.md for a visualization of the modifications to the MOM6 source code.
- The weights of the trained ANN parameterization used for online simulations are in CM26_ML_models/ocean3d/subfilter/FGR3/hidden-layer-20/seed-default/model/Tall.nc. Additional seeds and ANNs with more neurons are provided in CM26_ML_models/ocean3d/subfilter/FGR3/.
- The files required to run online experiments in the idealized and global ocean configurations are provided in the folders configurations/Neverworld2 and configurations/OM4, respectively.
The ANN parameterization with local dimensional scaling is implemented in Python in training-on-CM2.6/helpers/state_functions.py.
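The idea of local dimensional scaling can be sketched as follows: the velocity-gradient inputs are nondimensionalized by their local norm before being passed to the ANN, and the nondimensional prediction is rescaled by a dimensional factor built from that norm and the local grid spacing. This is a minimal illustrative sketch, not the code from state_functions.py; the feature choice, network size, and form of the scaling factor are assumptions:

```python
import numpy as np

def ann_forward(x, W1, b1, W2, b2):
    """Two-layer perceptron with a ReLU hidden layer (illustrative sizes)."""
    h = np.maximum(0.0, x @ W1 + b1)
    return h @ W2 + b2

def parameterization(grad_u, dx, W1, b1, W2, b2, eps=1e-30):
    """Predict a dimensional subfilter flux from local velocity gradients.

    grad_u: (N, 4) array of local velocity-gradient features (assumed layout).
    dx:     local grid spacing.
    """
    # The local norm of the input features sets the dimensional scale
    norm = np.sqrt((grad_u ** 2).sum(axis=-1, keepdims=True)) + eps
    x = grad_u / norm                       # nondimensional ANN input
    s = ann_forward(x, W1, b1, W2, b2)      # nondimensional prediction
    return dx ** 2 * norm ** 2 * s          # restore dimensions locally
```

By construction, multiplying the velocity gradients by a factor a multiplies the prediction by a squared: the scaling is carried entirely by the dimensional prefactor rather than learned by the network.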
- Download the training dataset from Zenodo:
wget https://zenodo.org/records/15328410/files/factor-15.zip
wget https://zenodo.org/records/15328410/files/factor-12.zip
wget https://zenodo.org/records/15328410/files/factor-9.zip
wget https://zenodo.org/records/15328410/files/factor-4.zip
- Unpack the dataset:
unzip factor-15.zip
unzip factor-12.zip
unzip factor-9.zip
unzip factor-4.zip
- Provide the path to the training data in cm26.py
- Provide the path where trained ANNs will be saved in train_script.py
The training loop runs on CPUs via the following script and takes ~9 hours on 4 CPU cores with 64 GB of memory:
cd src/training-on-CM2.6/scripts/
python train_script.py
This script contains the default hyperparameters used in the paper. The skill on the testing dataset will be available in {path_save}/skill-test/factor-{factor}.nc, and the log of training/validation losses in {path_save}/model/logger.nc.
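Once training finishes, the logger file can be inspected with xarray, for example to find the epoch with the lowest validation loss. A minimal sketch using a synthetic stand-in for logger.nc (in practice you would use xr.open_dataset on the real file; the variable names loss_train/loss_valid are assumptions):

```python
import numpy as np
import xarray as xr

# Synthetic stand-in for {path_save}/model/logger.nc; the variable
# names "loss_train" and "loss_valid" are assumptions.
log = xr.Dataset(
    {
        "loss_train": ("epoch", np.array([1.0, 0.6, 0.4, 0.3, 0.25])),
        "loss_valid": ("epoch", np.array([1.1, 0.7, 0.5, 0.45, 0.47])),
    },
    coords={"epoch": np.arange(1, 6)},
)

# Epoch with the lowest validation loss
best_epoch = int(log["epoch"][int(log["loss_valid"].argmin())])
print(best_epoch)
```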
If you want to recreate the training data yourself instead of downloading it from Zenodo, follow these instructions.
- Set the path where raw data will be saved in download_raw_data.py
- Run the script:
cd src/training-on-CM2.6/scripts/
python download_raw_data.py
- Provide the path to rawdata in training-on-CM2.6/helpers/cm26.py
- Provide the path where the coarsened data will be saved in the script generate_3d_datasets.py
- The dataset for each coarse-graining factor (4, 9, 12, 15) is generated with the script:
cd src/training-on-CM2.6/scripts/
python generate_3d_datasets.py --factor=4
- Provide the path to the newly generated coarse datasets in cm26.py
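The coarse-graining step amounts to block-averaging fine-resolution fields over blocks of the given factor. A minimal 2D sketch (the actual generate_3d_datasets.py pipeline additionally handles 3D fields, land masks, and filtering, none of which is shown here):

```python
import numpy as np

def coarsen(field, factor):
    """Coarse-grain a 2D field by block-averaging factor x factor cells."""
    ny, nx = field.shape
    assert ny % factor == 0 and nx % factor == 0
    return field.reshape(ny // factor, factor,
                         nx // factor, factor).mean(axis=(1, 3))

fine = np.arange(16.0).reshape(4, 4)
coarse = coarsen(fine, 2)  # shape (2, 2)
```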
The filtered dataset with diagnosed subfilter fluxes in the idealized NW2 configuration is constructed using the scripts filter-NW2-data.py and filter-interfaces-GM-filter.py. The second script is optional and is used only to estimate APE more accurately in outcropping regions. Offline prediction of subfilter fluxes and evaluation of offline skill are presented in the notebook offline-analysis-NW2.ipynb.
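Offline skill is commonly quantified by the coefficient of determination between predicted and diagnosed subfilter fluxes. A minimal sketch of such a metric (the notebook may use a different or additionally weighted definition):

```python
import numpy as np

def offline_skill(pred, truth):
    """Coefficient of determination R^2: 1 minus normalized squared error."""
    err = ((pred - truth) ** 2).mean()
    var = ((truth - truth.mean()) ** 2).mean()
    return 1.0 - err / var
```

R^2 equals 1 for a perfect prediction, 0 for predicting the mean of the truth, and is negative when the prediction is worse than the mean.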