gordonrust/Renofeation

Don't fine-tune, Renofeate

(Icon made by Eucalyp from www.flaticon.com)

In our recent paper "Improving the Adversarial Robustness of Transfer Learning via Noisy Feature Distillation", we show that numerous fine-tuning methods are vulnerable to adversarial examples crafted with the pre-trained model. This poses security concerns for industrial applications built on fine-tuning, such as Google's Cloud AutoML and Microsoft's Custom Vision.

To combat such attacks, we propose Re-training with noisy feature distillation, or Renofeation. Instead of starting from the pre-trained weights, Renofeation re-initializes the weights and trains with noisy feature distillation. To instantiate noisy feature distillation, we combine feature distillation with spatial dropout and stochastic weight averaging, which avoids over-fitting to the pre-trained features without hurting generalization performance, and in turn improves robustness.
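To make the idea concrete, below is a minimal PyTorch sketch of a noisy feature-distillation loss. It is an illustration, not the repository's implementation: the function name, the `beta` weight, and the choice to apply spatial dropout (`F.dropout2d`) to the frozen teacher's feature map are all assumptions; the paper places spatial dropout inside the network, and stochastic weight averaging would be applied separately on top (e.g. via `torch.optim.swa_utils`). See the code under scripts/ for the actual training recipe.

```python
import torch
import torch.nn.functional as F

def renofeation_loss(student_feat, teacher_feat, logits, labels,
                     dropout_p=0.1, beta=1e-2):
    """Cross-entropy plus noisy feature distillation (illustrative sketch).

    student_feat / teacher_feat: (N, C, H, W) feature maps, the teacher
    coming from the frozen pre-trained model. `beta` (a hypothetical
    hyper-parameter, not from the repo) weighs the distillation term.
    """
    ce = F.cross_entropy(logits, labels)
    # Spatial dropout zeroes whole channels of the (detached) teacher
    # target, injecting noise so the student does not over-fit to the
    # pre-trained features.
    noisy_target = F.dropout2d(teacher_feat.detach(), p=dropout_p,
                               training=True)
    distill = F.mse_loss(student_feat, noisy_target)
    return ce + beta * distill
```

The key design point is that the teacher features act as a regularization target rather than an initialization: the student is trained from re-initialized weights, so it inherits the pre-trained model's useful features only through this (noisy) distillation signal, not its adversarial vulnerabilities.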

To this end, we demonstrate empirically that the attack success rate can be reduced from 74.3%, 65.7%, 70.3%, and 50.75% down to 6.9%, 4.4%, 4.9%, and 6.9% for ResNet-18, ResNet-50, ResNet-101, and MobileNetV2, respectively. Moreover, the clean-data performance is comparable to the fine-tuning baseline!

For more details and an ablation study of our proposed method, please check out our paper!

Dependencies

  • PyTorch 1.0.0
  • TorchVision 0.4.0
  • AdverTorch 0.2.0

Preparing datasets

The dataset directories should be laid out as follows for the dataloaders to load correctly:

CUB_200_2011/
|    README
|    bounding_boxes.txt
|    classes.txt
|    image_class_labels.txt
|    images.txt
|    train_test_split.txt
|--- attributes
|--- images/
|--- parts/
|--- train/
|--- test/
Flower_102/
|    imagelabels.mat
|    setid.mat
|--- jpg/
stanford_dog/
|    file_list.mat
|    test_list.mat
|    train_list.mat
|--- train/
|--- test/
|--- Images/
|--- Annotation/
stanford_40/
|    attributes.txt
|--- ImageSplits/
|--- JPEGImages/
|--- MatlabAnnotations/
|--- XMLAnnotations/
MIT_67/
|    TrainImages.txt
|    TestImages.txt
|--- Annotations/
|--- Images/
|--- test/
|--- train/

Model training and evaluation

Be sure to modify the data path for each dataset for successful runs.

The scripts for training and evaluating DELTA, Renofeation, and Re-training are under scripts/.

Citation

@article{chin2020renofeation,
  title={Improving the Adversarial Robustness of Transfer Learning via Noisy Feature Distillation},
  author={Chin, Ting-Wu and Zhang, Cha and Marculescu, Diana},
  journal={arXiv preprint arXiv:2002.02998},
  year={2020}
}
