
LTRL: Boosting Long-tail Recognition via Reflective Learning

Official implementation of the ECCV 2024 (Oral) paper (arXiv:2407.12568). πŸ”₯

1 Beijing University of Chemical Technology

2 Nanyang Technological University

3 Xidian University

4 Singapore University of Technology and Design

5 Lancaster University

(* Equal contribution)

Figure: The framework of LTRL.

1. Requirements

  • To install requirements:
pip install -r requirements.txt
  • Hardware requirements: 8 GPUs with >= 12 GB of GPU memory each are recommended; otherwise, models with more experts may not fit, especially on datasets with more classes (the FC layers become large). CPU training is not supported, but CPU inference can be enabled with a slight modification.

2. Datasets

(1) Four benchmark datasets

  • Please download these datasets and put them in the /data directory.
  • ImageNet-LT and Places-LT can be found here.
  • iNaturalist data should be the 2018 version from here.
  • CIFAR-100 will be downloaded automatically by the dataloader.
data
β”œβ”€β”€ ImageNet_LT
β”‚   β”œβ”€β”€ test
β”‚   β”œβ”€β”€ train
β”‚   └── val
β”œβ”€β”€ CIFAR100
β”‚   └── cifar-100-python
β”œβ”€β”€ CIFAR10
β”‚   └── cifar-10-python
β”œβ”€β”€ Place365
β”‚   β”œβ”€β”€ data_256
β”‚   β”œβ”€β”€ test_256
β”‚   └── val_256
└── iNaturalist
    β”œβ”€β”€ test2018
    └── train_val2018

(2) Txt files

  • We provide txt files for test-agnostic long-tailed recognition for ImageNet-LT, Places-LT and iNaturalist 2018. CIFAR-100 will be generated automatically with the code.
  • For iNaturalist 2018, please unzip the iNaturalist_train.zip.
data_txt
β”œβ”€β”€ ImageNet_LT
β”‚   β”œβ”€β”€ ImageNet_LT_test.txt
β”‚   β”œβ”€β”€ ImageNet_LT_train.txt
β”‚   └── ImageNet_LT_val.txt
β”œβ”€β”€ Places_LT_v2
β”‚   β”œβ”€β”€ Places_LT_test.txt
β”‚   β”œβ”€β”€ Places_LT_train.txt
β”‚   └── Places_LT_val.txt
└── iNaturalist18
    β”œβ”€β”€ iNaturalist18_train.txt
    └── iNaturalist18_val.txt
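A minimal way to unpack the iNaturalist txt archive, assuming it sits alongside the other iNaturalist18 txt files (a hypothetical location; point the command at wherever iNaturalist_train.zip actually lives):

# hypothetical archive location; adjust the path to your copy of iNaturalist_train.zip
unzip data_txt/iNaturalist18/iNaturalist_train.zip -d data_txt/iNaturalist18/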

3. Pretrained models

  • For the training on Places-LT, we follow previous methods and use the pre-trained ResNet-152 model.
  • Please download the checkpoint. Unzip and move the checkpoint files to /model/pretrained_model_places/.
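A minimal sketch of preparing the checkpoint directory (the archive name below is an assumption; use the file name of the checkpoint you actually downloaded):

# create the target directory and unpack the downloaded ResNet-152 checkpoint into it
mkdir -p model/pretrained_model_places
unzip resnet152_places_checkpoint.zip -d model/pretrained_model_places/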

4. Train

Train SADE_RL/BSCE_RL

(1) CIFAR100-LT

nohup python train.py -c configs/{sade or bsce}/config_cifar100_ir10_{sade or ce}_rl.json &>{sade or ce}_rl_10.out&
nohup python train.py -c configs/{sade or bsce}/config_cifar100_ir50_{sade or ce}_rl.json &>{sade or ce}_rl_50.out&
nohup python train.py -c configs/{sade or bsce}/config_cifar100_ir100_{sade or ce}_rl.json &>{sade or ce}_rl_100.out&

Example:
nohup python train.py -c configs/sade/config_cifar100_ir100_sade_rl.json &>sade_rl_100.out&
# test
python test.py -r {$PATH}

(2) ImageNet-LT

python train.py -c configs/{sade or bsce}/config_imagenet_lt_resnext50_{sade or ce}_rl.json
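For example, expanding the placeholders for the SADE variant (the BSCE variant substitutes bsce and ce in the same way):

python train.py -c configs/sade/config_imagenet_lt_resnext50_sade_rl.json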

(3) Places-LT

python train.py -c configs/{sade or bsce}/config_imagenet_lt_resnext50_{sade or ce}_rl.json

(4) iNaturalist 2018

python train.py -c configs/{sade or bsce}/config_iNaturalist_resnet50_{sade or ce}_rl.json

Train baseline: SADE/BSCE

(1) CIFAR100-LT

nohup python train.py -c configs/{sade/bsce}/config_cifar100_ir10_{sade/ce}.json &>{sade/ce}_10.out&
nohup python train.py -c configs/{sade/bsce}/config_cifar100_ir50_{sade/ce}.json &>{sade/ce}_50.out&
nohup python train.py -c configs/{sade/bsce}/config_cifar100_ir100_{sade/ce}.json &>{sade/ce}_100.out&
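For example, the SADE baseline at imbalance ratio 100:

nohup python train.py -c configs/sade/config_cifar100_ir100_sade.json &>sade_100.out&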

(2) ImageNet-LT

python train.py -c configs/{sade or bsce}/config_imagenet_lt_resnext50_{sade or ce}.json

5. Test

python test.py -r {$PATH}
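Here -r takes the path to a saved training checkpoint, for instance (a hypothetical path; substitute the checkpoint produced by your own run):

# hypothetical checkpoint path; replace with the one your training run wrote
python test.py -r saved/models/ImageNet_LT_sade_rl/model_best.pth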

Citation

If you find our work inspiring or use our codebase in your research, please consider giving a star ⭐ and a citation.

@article{zhao2024ltrl,
  title={LTRL: Boosting Long-tail Recognition via Reflective Learning},
  author={Zhao, Qihao and Dai, Yalun and Lin, Shen and Hu, Wei and Zhang, Fan and Liu, Jun},
  journal={arXiv preprint arXiv:2407.12568},
  year={2024}
}

Acknowledgements

The framework is based on SADE and RIDE.
