
Language-Guided Transformer for Federated Multi-Label Classification

I-Jieh Liu, Ci-Siang Lin, Fu-En Yang, Yu-Chiang Frank Wang

Official implementation of Language-Guided Transformer for Federated Multi-Label Classification.

Abstract

Federated Learning (FL) is an emerging paradigm that enables multiple users to collaboratively train a robust model in a privacy-preserving manner without sharing their private data. Most existing FL approaches consider only traditional single-label image classification, ignoring the challenges that arise when the task shifts to multi-label image classification. Moreover, FL still struggles with user heterogeneity in local data distributions in real-world scenarios, and this issue becomes even more severe for multi-label image classification. Inspired by the recent success of Transformers in centralized settings, we propose a novel FL framework for multi-label classification. Since each local client may observe only partial label correlations during training, directly aggregating the locally updated models would not produce satisfactory performance. We therefore propose Language-Guided Transformer (FedLGT), a novel FL framework that exploits and transfers knowledge across different clients to learn a robust global model. Through extensive experiments on various multi-label datasets (e.g., FLAIR and MS-COCO), we show that FedLGT achieves satisfactory performance and outperforms standard FL techniques under multi-label FL scenarios.

Update

  • (2023/12/18) Code for FedLGT is released.
  • (2023/12/10) Code for FedLGT is coming soon. Stay tuned!

Framework Overview

Setup

  1. Please install the PyTorch build that matches your CUDA version. For more details, please refer to PyTorch.

    • Sample command:
      pip install torch==1.13.0+cu116 torchvision==0.14.0+cu116 -f https://download.pytorch.org/whl/torch_stable.html
      
  2. Run the following command to install the required packages.

    pip install -r requirements.txt
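
After installation, you can run a quick sanity check (a minimal sketch, not part of this repo) to confirm that your PyTorch build can see CUDA before launching training:

    # Sanity check (not part of the repo): verify that the installed
    # PyTorch build can see your GPU before launching FL training.
    import torch
    print(torch.__version__)          # e.g., 1.13.0+cu116
    print(torch.cuda.is_available())  # should print True on a CUDA machine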
    

Data Preparation

FLAIR

  1. Please refer to the FLAIR official repository for data preparation. Note that you must run the provided script to generate the HDF5 file for the FLAIR dataset; for more details, see the prepare-hdf5 section of FLAIR.
  2. Run our pre-processing commands:
    python3 ./data/build_text_feat.py
    python3 ./data/build_label_mapping.py
    
  3. You should now have all the data files under the folder data (see the sketch below for a rough idea of what the text-feature step does).
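
For intuition, build_text_feat.py precomputes text embeddings for the label names. The snippet below is a minimal sketch of how such features can be generated with OpenAI's CLIP; the label list, prompt format, and output path here are hypothetical, and the actual script may differ:

    # Hypothetical sketch of the text-feature step; the real
    # build_text_feat.py may use different labels, prompts, and paths.
    import clip
    import torch

    device = "cuda" if torch.cuda.is_available() else "cpu"
    model, _ = clip.load("ViT-B/32", device=device)

    labels = ["animal", "food", "plant"]  # placeholder label names
    tokens = clip.tokenize(labels).to(device)
    with torch.no_grad():
        text_feat = model.encode_text(tokens)  # (num_labels, 512)
    torch.save(text_feat.cpu(), "data/text_feat.pt")  # hypothetical output file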

COCO/PASCAL VOC

Please refer to C-Tran to obtain these datasets.

Training and Evaluation

  • Run the following commands to perform FL training on FLAIR (a sketch of the FedAvg aggregation used by --agg_type fedavg appears after this list).
    • Coarse-grained
    # Coarse-grained
    python fed_main.py --batch_size 16 --lr 0.0001 --optim 'adam' --layers 3 --dataset 'flair_fed' \
                        --use_lmt --grad_ac_step 1 --dataroot data/ --epochs 5 --n_parties 50 --comm_round 50 \
                        --learn_emb_type clip --agg_type fedavg --coarse_prompt_type concat --use_global_guide
    • Fine-grained
    # Fine-grained
    python fed_main.py --batch_size 16 --lr 0.0001 --optim 'adam' --layers 3 --dataset 'flair_fed' \
                        --use_lmt --grad_ac_step 1 --dataroot data/ --epochs 5 --n_parties 50 --comm_round 50 \
                        --learn_emb_type clip --agg_type fedavg --coarse_prompt_type concat --flair_fine --use_global_guide
  • Evaluation
    • Follow the same command as training, but add the --inference flag. For example:
    # Coarse-grained
    python fed_main.py --layers 3 --dataset 'flair_fed' \
                        --use_lmt --dataroot data/ --n_parties 1 \
                        --learn_emb_type clip --coarse_prompt_type concat --use_global_guide --inference
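
For reference, --agg_type fedavg corresponds to standard FedAvg-style aggregation. The sketch below illustrates the idea (parameter-wise averaging of client weights, weighted by local dataset size); it is an illustration under these assumptions, not the actual aggregation code in fed_main.py:

    # Illustrative FedAvg sketch, not the repo's implementation:
    # average client model weights, weighted by local dataset size.
    from collections import OrderedDict
    import torch

    def fedavg(client_states, client_sizes):
        """client_states: list of model.state_dict(); client_sizes: list of int."""
        total = float(sum(client_sizes))
        global_state = OrderedDict()
        for key in client_states[0]:
            global_state[key] = sum(
                (n / total) * state[key].float()
                for state, n in zip(client_states, client_sizes)
            )
        return global_state

    # Usage: global_model.load_state_dict(fedavg(states, sizes))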

Acknowledgement

We build our FedLGT codebase on the codebases of C-Tran and NIID-Bench. We sincerely thank the authors for their wonderful work.

Citation

If you find this useful for your research, please consider citing:

@inproceedings{liu2024fedlgt,
  author    = {I-Jieh Liu and Ci-Siang Lin and Fu-En Yang and Yu-Chiang Frank Wang},
  title     = {Language-Guided Transformer for Federated Multi-Label Classification},
  booktitle = {AAAI},
  year      = {2024},
}

Contact

If you have any questions about this project, please feel free to contact liujack0914@gmail.com.
