Tutorials: Open in Colab
Documentation: https://reservoirpy.readthedocs.io/
Tip
Exciting news! We just launched a new beta tool based on a Large Language Model! You can chat with ReservoirChat and ask anything about Reservoir Computing and ReservoirPy! Don't miss out, it's available for a limited time!
Feature overview:
- easy creation of complex architectures with multiple reservoirs (e.g. deep reservoirs) and readouts
- feedback loops
- offline and online training
- parallel implementation
- sparse matrix computation
- advanced learning rules (e.g. Intrinsic Plasticity, Local Plasticity or NVAR (Next-Generation RC))
- interfacing with scikit-learn models
- and many more!
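To give an intuition for the online-training feature listed above: an online rule updates the readout weights at every time step instead of solving a regression offline. Here is a minimal NumPy sketch of a least-mean-squares (LMS) readout update on a toy state stream; it is an illustration of the idea, not ReservoirPy's actual implementation, and all names and sizes are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

n_states, n_out, eta = 20, 1, 0.02  # hypothetical sizes and learning rate
W_out = np.zeros((n_out, n_states))  # readout weights, trained online

# toy "reservoir states" and a linearly readable target
states = rng.normal(size=(500, n_states))
w_true = rng.normal(size=(n_out, n_states))
targets = states @ w_true.T

errors = []
for x, y in zip(states, targets):
    y_hat = W_out @ x              # readout prediction
    e = y - y_hat                  # instantaneous error
    W_out += eta * np.outer(e, x)  # LMS update of the readout weights
    errors.append(float(np.abs(e).mean()))
```

After the loop, `errors` shrinks over time as `W_out` converges toward the target mapping, which is the essence of online readout training.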
Moreover, graphical tools are included to easily explore hyperparameters
with the help of the hyperopt library.
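The underlying idea of such a hyperparameter search is simple: sample candidate values (e.g. spectral radius and leak rate), score each with a validation loss, and keep the best. The sketch below uses plain Python with a toy loss function as a stand-in for "train an ESN and return its validation error"; it does not show ReservoirPy's actual hyperopt integration, and all names here are hypothetical.

```python
import random

random.seed(42)

def toy_loss(sr, lr):
    """Hypothetical stand-in for: train an ESN with these
    hyperparameters and return its validation RMSE."""
    return (sr - 1.1) ** 2 + (lr - 0.3) ** 2

# random search over plausible ranges for spectral radius and leak rate
trials = [(random.uniform(0.1, 2.0), random.uniform(0.0, 1.0))
          for _ in range(100)]
best = min(trials, key=lambda params: toy_loss(*params))

print(f"best sr={best[0]:.2f}, lr={best[1]:.2f}")
```

Random search is a common default here because reservoir performance is often more sensitive to a few hyperparameters than to fine-grained grid resolution.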
```bash
pip install reservoirpy
```
For a general introduction to reservoir computing and ReservoirPy features, take a look at the tutorials.
```python
from reservoirpy.datasets import mackey_glass, to_forecasting
from reservoirpy.nodes import Reservoir, Ridge
from reservoirpy.observables import rmse, rsquare

### Step 1: Load the dataset
X = mackey_glass(n_timesteps=2000)  # (2000, 1)-shaped array

# create y by shifting X, and split into train/test sets
x_train, x_test, y_train, y_test = to_forecasting(X, test_size=0.2)

### Step 2: Create an Echo State Network
# reservoir of 100 neurons, spectral radius = 1.25, leak rate = 0.3
reservoir = Reservoir(units=100, sr=1.25, lr=0.3)
# feed-forward readout layer, trained with L2 regularization
readout = Ridge(ridge=1e-5)
# connect the two nodes
esn = reservoir >> readout

### Step 3: Fit, run and evaluate the ESN
esn.fit(x_train, y_train, warmup=100)
predictions = esn.run(x_test)

print(f"RMSE: {rmse(y_test, predictions)}; R^2 score: {rsquare(y_test, predictions)}")
# RMSE: 0.0020282; R^2 score: 0.99992
```
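For readers curious about what the `Reservoir >> Ridge` pipeline computes, here is a rough NumPy sketch of the two classic building blocks: a leaky-integrator reservoir and a ridge-regression readout. This is a simplified model for illustration, not ReservoirPy's actual implementation; the sine task, weight initialization, and all names below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# hypothetical hyperparameters, mirroring the example above
units, sr, leak, ridge = 100, 1.25, 0.3, 1e-5

# toy task: predict a sine wave one step ahead
t = np.linspace(0, 20 * np.pi, 2000)
u = np.sin(t).reshape(-1, 1)
U, Y = u[:-1], u[1:]

# random input and recurrent weights; recurrent matrix
# rescaled so its spectral radius equals sr
W_in = rng.uniform(-1, 1, size=(units, 1))
W = rng.normal(size=(units, units))
W *= sr / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(inputs):
    """Leaky update: x[t+1] = (1-leak)*x[t] + leak*tanh(W x[t] + W_in u[t])."""
    x = np.zeros(units)
    states = np.empty((len(inputs), units))
    for i, ut in enumerate(inputs):
        x = (1 - leak) * x + leak * np.tanh(W @ x + W_in @ ut)
        states[i] = x
    return states

X_states = run_reservoir(U)

# ridge readout: solve (X^T X + ridge*I) W_out^T = X^T Y
A = X_states.T @ X_states + ridge * np.eye(units)
W_out = np.linalg.solve(A, X_states.T @ Y).T

pred = X_states @ W_out.T
print("train RMSE:", np.sqrt(np.mean((pred - Y) ** 2)))
```

Only `W_out` is trained; the reservoir weights stay fixed after initialization, which is what makes reservoir computing cheap to train.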
- 1 - Getting started with ReservoirPy
- 2 - Advanced features
- 3 - General introduction to Reservoir Computing
- 4 - Understand and optimise hyperparameters
- 5 - Classification with reservoir computing
- 6 - Interfacing ReservoirPy with scikit-learn
For advanced users, we also showcase partial reproductions of reservoir computing papers to demonstrate some features of the library.
- Improving reservoirs using Intrinsic Plasticity (Schrauwen et al., 2008)
- Interactive reservoir computing for chunking information streams (Asabuki et al., 2018)
- Next-Generation reservoir computing (Gauthier et al., 2021)
- Edge of stability Echo State Network (Ceni et al., 2023)
If you want your paper to appear here, please contact us (see contact link below).
- ( HAL | PDF | Code ) Leger et al. (2024) Evolving Reservoirs for Meta Reinforcement Learning. EvoAPPS 2024
- ( arXiv | PDF ) Chaix-Eichel et al. (2022) From implicit learning to explicit representations. arXiv preprint arXiv:2204.02484.
- ( HTML | HAL | PDF ) Trouvain & Hinaut (2021) Canary Song Decoder: Transduction and Implicit Segmentation with ESNs and LSTMs. ICANN 2021
- ( HTML ) Pagliarini et al. (2021) Canary Vocal Sensorimotor Model with RNN Decoder and Low-dimensional GAN Generator. ICDL 2021.
- ( HAL | PDF ) Pagliarini et al. (2021) What does the Canary Say? Low-Dimensional GAN Applied to Birdsong. HAL preprint.
- ( HTML | HAL | PDF ) Hinaut & Trouvain (2021) Which Hype for My New Task? Hints and Random Search for Echo State Networks Hyperparameters. ICANN 2021
We also provide a curated list of tutorials, papers, projects and tools for Reservoir Computing (not necessarily related to ReservoirPy) here:
https://github.com/reservoirpy/awesome-reservoir-computing
If you have a question regarding the library, please open an issue.
If you have a more general question or feedback, you can contact us by email at xavier dot hinaut the-famous-home-symbol inria dot fr.
Trouvain, N., Pedrelli, L., Dinh, T. T., Hinaut, X. (2020) ReservoirPy: an efficient and user-friendly library to design echo state networks. In International Conference on Artificial Neural Networks (pp. 494-505). Springer, Cham. ( HTML | HAL | PDF )
If you're using ReservoirPy in your work, please cite our package using the following BibTeX entry:
@incollection{Trouvain2020,
doi = {10.1007/978-3-030-61616-8_40},
url = {https://doi.org/10.1007/978-3-030-61616-8_40},
year = {2020},
publisher = {Springer International Publishing},
pages = {494--505},
author = {Nathan Trouvain and Luca Pedrelli and Thanh Trung Dinh and Xavier Hinaut},
title = {{ReservoirPy}: An Efficient and User-Friendly Library to Design Echo State Networks},
booktitle = {Artificial Neural Networks and Machine Learning {\textendash} {ICANN} 2020}
}
This package is developed and supported by Inria in Bordeaux, France, within the Mnemosyne team. Inria is a French research institute in digital sciences (computer science, mathematics, robotics, ...).