This package implements structure-preserving neural networks for learning dynamics of differential systems from data.
Install it using pip:

```
pip install strupnet
```
This package implements the symplectic neural networks ("SympNets") found in [1] ("G"- and "LA"-SympNets) and [2] ("H"-SympNets), as well as some new ones [3] ("P"-, "R"-, and "GR"-SympNets).
```python
import torch
from strupnet import SympNet

dim = 2  # degrees of freedom of the Hamiltonian system; x = (p, q) \in R^{2*dim}
sympnet = SympNet(dim=dim, layers=12, width=8)

timestep = torch.tensor([0.1])
x0 = torch.randn(2 * dim)  # phase-space coordinate x0 = (p0, q0)
x1 = sympnet(x0, timestep)  # defines a random but symplectic transformation from x0 to x1
```
The rest of your code is identical to how you would train any module that inherits from `torch.nn.Module`.
The VolNet module implements neural networks with unit Jacobian determinant. A VolNet is constructed from compositions of SympNets, and therefore requires you to pass through keyword arguments that define one of the above SympNets. The example below shows how it is initialised.
```python
import torch
from strupnet import VolNet

dim = 3  # dimension of the ODE
p_sympnet_kwargs = dict(
    method="P",
    layers=6,
    max_degree=4,  # used for method="P" only; method="R" requires you to specify width instead
)
volnet = VolNet(dim=dim, **p_sympnet_kwargs)

timestep = torch.tensor([0.1])  # time step
x0 = torch.randn(dim)
x1 = volnet(x0, timestep)  # defines a random but volume-preserving mapping from x0 to x1
```
The rest of your code is identical to how you would train any module that inherits from `torch.nn.Module`.
See the `examples/` folder for notebooks demonstrating basic usage of SympNet and VolNet.
[1] Jin, P., Zhang, Z., Zhu, A., Tang, Y. and Karniadakis, G.E., 2020. SympNets: Intrinsic structure-preserving symplectic networks for identifying Hamiltonian systems. Neural Networks, 132, pp.166-179.
[2] Burby, J.W., Tang, Q. and Maulik, R., 2020. Fast neural Poincaré maps for toroidal magnetic fields. Plasma Physics and Controlled Fusion, 63(2), p.024001.
[3] In press.