- LwF (ECCV 2016): Learning without Forgetting.
- EWC (PNAS 2017): Overcoming catastrophic forgetting in neural networks.
- iCaRL (CVPR 2017): Incremental Classifier and Representation Learning.
- BiC (CVPR 2019): Large Scale Incremental Learning.
- LUCIR (CVPR 2019): Learning a Unified Classifier Incrementally via Rebalancing.
- WA (CVPR 2020): Maintaining Discrimination and Fairness in Class Incremental Learning.
- OCM (ICML 2022): Online Continual Learning through Mutual Information Maximization.
- ERACE, ERAML (ICLR 2022): New Insights on Reducing Abrupt Representation Change in Online Continual Learning.
- GPM (ICLR 2021): Gradient Projection Memory for Continual Learning.
- TRGP (ICLR 2022): Trust Region Gradient Projection for Continual Learning.
- API (CVPR 2023): Adaptive Plasticity Improvement for Continual Learning.
- RanPAC (NeurIPS 2023): Random Projections and Pre-trained Models for Continual Learning.
- L2P (CVPR 2022): Learning to Prompt for Continual Learning.
- DualPrompt (ECCV 2022): Complementary Prompting for Rehearsal-free Continual Learning.
- CodaPrompt (CVPR 2023): COntinual Decomposed Attention-based Prompting for Rehearsal-Free Continual Learning.
- InfLoRA (CVPR 2024): Interference-Free Low-Rank Adaptation for Continual Learning.
- MoE_Adapter4CL (CVPR 2024): Boosting Continual Learning of Vision-Language Models via Mixture-of-Experts Adapters.
- RAPF (ECCV 2024): Class-Incremental Learning with CLIP: Adaptive Representation Adjustment and Parameter Fusion.
- SD_LoRA (ICLR 2025): Scalable Decoupled Low-Rank Adaptation for Class Incremental Learning.
Please refer to install.md for installation instructions.
Complete tutorials can be found in ./docs.
- CIFAR-10 is available at Google Drive
- CIFAR-100 is available at Google Drive
- CUB200, ImageNet-R, Tiny-ImageNet are available at Google Drive
After the dataset is downloaded, please extract the compressed file to the specified path, e.g. `unzip cifar100.zip -d /path/to/your/dataset`.
Set the `data_root` in the `.yaml` config file:
`data_root: /path/to/your/dataset`
To add a custom dataset, please refer to dataset.md.
Once you have completed the "Installation" and "Datasets" sections, you can follow the steps below to run the LibContinual framework with the LUCIR method.
- Step 1: Configure the parameters in the `./config/lucir.yaml` file. Please refer to config.md for the meaning of each parameter (a minimal config sketch is shown after these steps).
- Step 2: Run the code: `python run_trainer.py --config lucir.yaml`
- Step 3: After training is completed, the log files will be saved in the path specified by the `save_path` parameter.
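For orientation, here is a minimal, hypothetical sketch of what such a config might contain. Only `data_root` and `save_path` are mentioned in this README; all other parameter names and values for LUCIR are documented in config.md, so treat this as an illustrative placeholder rather than a working configuration.

```yaml
# Hypothetical sketch, not a complete LUCIR config.
# Only data_root and save_path are taken from this README;
# see config.md for the actual parameter names and required fields.
data_root: /path/to/your/dataset   # where the extracted dataset lives
save_path: /path/to/your/logs      # where training logs will be written
```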
We adopt standardized evaluation metrics from the continual learning literature. Given $T$ tasks, let $a_{i,j}$ denote the test accuracy on task $j$ after training on task $i$. Metrics such as the average accuracy after the final task and backward transfer (equivalent to Positive BwT in "New Metrics for Continual Learning") are computed from this accuracy matrix.
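As a generic illustration (a minimal sketch, not LibContinual's own implementation), the snippet below computes two standard quantities from such an accuracy matrix: the average accuracy over all tasks after training on the final task, and backward transfer, whose clipped non-negative form corresponds to Positive BwT.

```python
import numpy as np

def average_accuracy(acc: np.ndarray) -> float:
    """Mean test accuracy over all T tasks after training on the last task.

    `acc` is a T x T matrix where acc[i, j] is the accuracy on task j
    measured after training on task i (rows filled in training order).
    """
    T = acc.shape[0]
    return float(acc[T - 1].mean())

def backward_transfer(acc: np.ndarray) -> float:
    """Average change in accuracy on earlier tasks after learning all tasks.

    Negative values indicate forgetting; the clipped variant max(BWT, 0)
    corresponds to Positive BwT.
    """
    T = acc.shape[0]
    diffs = [acc[T - 1, j] - acc[j, j] for j in range(T - 1)]
    return float(np.mean(diffs))

# Example with 3 tasks and slight forgetting on earlier tasks.
acc = np.array([
    [0.90, 0.00, 0.00],
    [0.85, 0.88, 0.00],
    [0.80, 0.84, 0.89],
])
print(average_accuracy(acc))   # (0.80 + 0.84 + 0.89) / 3 ≈ 0.843
print(backward_transfer(acc))  # ((0.80 - 0.90) + (0.84 - 0.88)) / 2 = -0.07
```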
LibContinual is an open source project designed to help continual learning researchers quickly understand classic methods and code structures. We welcome other contributors to use this framework to implement their own or other impressive methods and add them to LibContinual. This library may only be used for academic research. We welcome any feedback while using LibContinual and will try our best to continually improve the library. Special thanks to the authors of FACIL and PyCIL for their inspiration on framework design.
We have referenced useful modules from these repositories in our work and deeply appreciate their authors.
This project is licensed under the MIT License. See LICENSE for more details.