
Zero-Shot Knowledge Distillation in Deep Networks Pytorch


⭐ Star us on GitHub — it helps!!

PyTorch implementation of Zero-Shot Knowledge Distillation in Deep Networks

Install

You will need a machine with a GPU and CUDA installed.
Then install the runtime dependencies:

pip install -r requirements.txt
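
To verify the environment before training, you can run this quick sanity check (standard PyTorch calls, nothing repository-specific):

import torch

# Confirm that PyTorch was built with CUDA and can see the GPU.
print(torch.__version__)
print(torch.cuda.is_available())        # should print True on a CUDA machine
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))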

Usage

For the MNIST dataset:

python main.py --dataset=mnist --t_train=False --num_sample=12000 --batch_size=200 

For the CIFAR-10 dataset:

python main.py --dataset=cifar10 --t_train=False --num_sample=24000 --batch_size=100

Arguments:

  • dataset - dataset to use: ['mnist', 'cifar10', 'cifar100']
  • t_train - whether to train the teacher network
    • if True, train the teacher network from scratch
    • if False, load a pretrained teacher network
  • num_sample - number of Data Impressions (DIs) crafted per category
  • beta - scaling factor(s) for the Dirichlet concentration used when sampling soft-label targets
  • batch_size - batch size
  • lr - learning rate
  • iters - number of optimization iterations
  • s_save_path - save path for the student network
  • do_genimgs - whether to generate synthesized images with ZSKD (see the sketch after this list)
    • if True, generate the images
    • if False, the synthesized ZSKD images must already exist on disk
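
For intuition, ZSKD crafts each Data Impression by (1) reading a class-similarity vector off the teacher's final fully connected layer, (2) sampling a soft-label target from a Dirichlet distribution whose concentration is that similarity scaled by beta, and (3) optimizing a random input until the teacher's output matches the target. The sketch below is a minimal, hypothetical illustration of that loop; craft_data_impression, the 32x32 input shape, and the normalization details are assumptions, not the repository's actual code.

import torch
import torch.nn.functional as F

def craft_data_impression(teacher, class_id, beta=1.0, iters=1500, lr=0.01,
                          input_shape=(1, 3, 32, 32), device="cuda"):
    """Minimal ZSKD-style Data Impression sketch (names and shapes are assumptions)."""
    teacher.eval()
    # Class-similarity vector from the teacher's last fully connected layer.
    w = [m.weight for m in teacher.modules() if isinstance(m, torch.nn.Linear)][-1]
    w = F.normalize(w.detach(), dim=1)            # unit-norm class templates
    sim = (w @ w.t())[class_id]                   # similarity of class_id to every class
    sim = (sim - sim.min()) / (sim.max() - sim.min() + 1e-8)  # rescale to [0, 1]

    # Dirichlet concentration scaled by beta; sample one soft-label target.
    conc = beta * sim + 1e-8
    target = torch.distributions.Dirichlet(conc).sample().to(device)

    # Optimize a random input so the teacher's softmax matches the target.
    x = torch.randn(input_shape, device=device, requires_grad=True)
    opt = torch.optim.Adam([x], lr=lr)
    for _ in range(iters):
        opt.zero_grad()
        log_probs = F.log_softmax(teacher(x), dim=1).squeeze(0)
        loss = F.kl_div(log_probs, target, reduction="sum")
        loss.backward()
        opt.step()
    return x.detach(), target

Under these assumptions, a call such as craft_data_impression(teacher, class_id=3, beta=0.1) would yield one synthesized image together with its sampled soft label; repeating this num_sample times per class builds the transfer set used to train the student.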

Result examples for the MNIST dataset

(figure: synthesized image samples for MNIST)

Understanding this method (algorithm)

✅ Check my blog post: Here
