GAIA-ssl

An AutoML toolbox specialized in contrastive learning.

Install

requirements:

torch 1.8.0

gaiavision

mmcv-full 1.3.0
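
A quick sanity check after installing the requirements (a minimal sketch; it only assumes the packages above are importable and that torch and mmcv expose __version__):

# check_env.py -- verify the pinned dependencies are importable
import torch
import mmcv
import gaiavision  # the GAIA base vision library listed above

print("torch:", torch.__version__)       # expected: 1.8.0
print("mmcv-full:", mmcv.__version__)    # expected: 1.3.0
print("gaiavision imported successfully")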

Command

Supernet training

CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7 bash tools/dist_train.sh app/dynmoco/configs/local/ar50to101_10pc_bs64_200_epoch.py 8

This command reproduces the checkpoint used in our paper. Remember to change the data path in all config files before running these commands.
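
The configs are mmcv-style Python files, so the data path usually lives in a few dictionary fields. A hypothetical illustration of the kind of edit to make (field names such as data_source, list_file, and root are assumptions; check the actual config, e.g. app/dynmoco/configs/local/ar50to101_10pc_bs64_200_epoch.py, for the exact keys):

# Hypothetical excerpt of an mmcv-style config; edit whichever fields
# point at your dataset. Key names below are illustrative only.
data_root = '/your/path/to/imagenet'   # change this to your local data path
data = dict(
    imgs_per_gpu=64,
    workers_per_gpu=4,
    train=dict(
        data_source=dict(
            list_file=data_root + '/meta/train_10percent.txt',  # image list file
            root=data_root + '/train',                          # image folder
        ),
    ),
)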

Feature similarity computation

For classification downstream tasks:

CUDA_VISIBLE_DEVICES=0,1,2,3 bash tools/dist_search.sh app/dynmoco/configs/local/supernet_search.py /path/to_supernet_ckpt workdir 4

For dense prediction downstream tasks:

CUDA_VISIBLE_DEVICES=0,1,2,3 bash tools/dist_search.sh app/dynmoco/configs/local/supernet_dense_search.py /path/to_supernet_ckpt workdir 4 --dense True

Extract subnet

Change R_specific in app/dynmoco/configs/local/specific_extract.py according to your needs (a hypothetical example follows the command below), then run:

CUDA_VISIBLE_DEVICES=0 bash tools/dist_extract_from_supernet.sh /path/to_supernet_ckpt subnet.pth app/dynmoco/configs/local/specific_extract.py 1
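
For illustration only, R_specific could pin down a ResNet-style subnet by stem width, per-stage widths, and per-stage depths, roughly as below; the actual field names and value format are defined in specific_extract.py and may differ:

# Hypothetical example of R_specific in specific_extract.py; the exact
# keys and format used by this repo may differ -- treat this as a sketch.
R_specific = dict(
    name='r50_like_subnet',            # illustrative label
    arch=dict(
        stem=dict(width=64),
        body=dict(
            width=[64, 128, 256, 512], # per-stage widths
            depth=[3, 4, 6, 3],        # ResNet-50-style per-stage depths
        ),
    ),
)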

Extract the backbone weights from the generated subnet checkpoint:

python tools/extract_backbone_weights.py subnet.pth backbone.pth
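
For reference, such a backbone-extraction step typically keeps only the state-dict entries under the backbone. prefix and strips that prefix; a minimal sketch in plain PyTorch, not necessarily the repo's exact implementation:

# sketch_extract_backbone.py -- illustrative only, not the repo's script
import sys
import torch

ckpt = torch.load(sys.argv[1], map_location='cpu')    # e.g. subnet.pth
state = ckpt.get('state_dict', ckpt)                  # unwrap if nested

# Keep only backbone weights and drop the 'backbone.' prefix so the file
# can be loaded directly as a downstream model's backbone.
backbone = {k[len('backbone.'):]: v for k, v in state.items()
            if k.startswith('backbone.')}

torch.save(dict(state_dict=backbone), sys.argv[2])    # e.g. backbone.pth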

Precautions

FP16 and gradient accumulation can be used in the original openself repo, but they cannot be used in this version.

Citation

If you find this project useful in your research, please consider citing:

@misc{chang2022data,
      title={DATA: Domain-Aware and Task-Aware Self-supervised Learning}, 
      author={Qing Chang and Junran Peng and Lingxi Xie and Jiajun Sun and Haoran Yin and Qi Tian and Zhaoxiang Zhang},
      year={2022},
      eprint={2203.09041},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}
