Wuziyi123/SRII


Class-Incremental Learning



Incremental-Learning-Benchmark

Evaluate class-incremental learning under task shifts with popular continual learning algorithms.

The benchmarks come from the following contributions:

  • A-GEM: paper (Efficient lifelong learning with A-GEM)
  • EMR: paper (On Tiny Episodic Memories in Continual Learning)
  • iCaRL: paper (icarl: Incremental classifier and representation learning)
  • LUCIR: paper (Learning a Unified Classifier Incrementally via Rebalancing)
  • LwF: paper (Learning without forgetting)
  • EWC: paper (Overcoming catastrophic forgetting in neural networks)
  • ABD: paper (Always be dreaming: A new approach for data-free class-incremental learning)
  • SCR: paper (Supervised contrastive replay: Revisiting the nearest class mean classifier in online class-incremental continual learning)
  • S&B: paper (Split-and-Bridge: Adaptable Class Incremental Learning within a Single Neural Network)
  • E2E: paper (End-to-end incremental learning)
  • Bic: paper (Large scale incremental learning)

Installation

Requirements

To install requirements:

python == 3.6
torch == 1.8.1
torchvision >= 0.8
numpy == 1.19.5
matplotlib == 3.3.4
opencv-python == 4.5.1.48
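A quick way to sanity-check an environment against these pins is a small helper that compares pinned versions with installed ones. This is only an illustrative sketch (the `check_pins` helper is not part of the repository); the pin values are taken from the list above:

```python
def check_pins(pins, installed):
    """Compare pinned versions against what is actually installed.

    Both arguments map package name -> version string; a pin matches
    when the installed version starts with the pinned one.
    """
    report = {}
    for name, want in pins.items():
        have = installed.get(name)
        if have is None:
            report[name] = "missing"
        elif have.startswith(want):
            report[name] = "ok"
        else:
            report[name] = "mismatch: " + have
    return report

# Pins from the requirements list above.
pins = {"numpy": "1.19.5", "matplotlib": "3.3.4", "opencv-python": "4.5.1.48"}
# In practice, build `installed` from e.g. importlib.metadata.version().
installed = {"numpy": "1.19.5", "matplotlib": "3.3.4"}
print(check_pins(pins, installed))
# -> {'numpy': 'ok', 'matplotlib': 'ok', 'opencv-python': 'missing'}
```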

Setup

conda env create -f environment.yml -p <anaconda>/envs/<env_name>
conda activate <env_name>


Dataset

We conduct experiments on commonly used incremental learning benchmarks: CIFAR100 and miniImageNet.

  1. CIFAR100 is available at cs.toronto.edu. Download CIFAR100 and put it under the dataset directory.
  2. miniImageNet is available at Our Google Drive. Download miniImageNet and arrange it as follows:
mini-imagenet/
├── images/
│   ├── n0210891500001298.jpg
│   ├── n0287152500001298.jpg
│   └── ...
├── test.csv
├── val.csv
├── train.csv
└── imagenet_class_index.json
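The `train.csv`/`val.csv`/`test.csv` files list which image belongs to which class. A minimal sketch of parsing one such split (the `filename,label` header is an assumption based on the commonly distributed mini-ImageNet split files, not something confirmed by this repository):

```python
import csv
import io

def parse_split(csv_text):
    """Parse a mini-ImageNet split CSV into (filename, label) pairs.

    Assumes a header row followed by `filename,label` lines, as in the
    commonly distributed mini-ImageNet splits.
    """
    reader = csv.reader(io.StringIO(csv_text))
    next(reader)  # skip the header row
    return [(row[0], row[1]) for row in reader]

# Tiny inline sample standing in for train.csv
sample = "filename,label\nn0210891500001298.jpg,n02108915\n"
print(parse_split(sample))
# -> [('n0210891500001298.jpg', 'n02108915')]
```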

Models

You can download the pretrained model here:

You can download the test model here, then put it under the model directory:


Training

All commands should be run under the project root directory.

To train the model(s) in the paper, run this command:

sh ./main.sh --input_data dataset --pre_trained pretrained/cifar100-pretrained.pth.tar --network resnet



Evaluation

To evaluate the model, run:

sh ./test.sh --model_file model/test-20classes.pth.tar



Acknowledgements

Special thanks to https://github.com/DRSAD/iCaRL for the iCaRL networks implementation, parts of which were used in this implementation. More details on iCaRL are available at https://arxiv.org/abs/1611.07725.

