FedSOL (Federated Stabilized Orthogonal Learning)

This repository is the official PyTorch implementation of:

"FedSOL: Stabilized Orthogonal Learning with Proximal Restrictions in Federated Learning (CVPR 2024)".

Requirements

  • This codebase is written for Python 3 (Python 3.8.8 was used during development).
  • We use PyTorch 1.9.0 with CUDA 10.2 or 11.0.
  • To install necessary python packages,
    pip install -r requirements.txt
    

How to Run the Code?

The configuration skeleton for each algorithm is in ./configs/*.json.

  • python ./main.py --config_path ./configs/algorithm_name.json runs the experiment with the default setup.

There are two ways to change the configurations:

  1. Edit (or write a new) configuration file in the ./configs directory and pass it with the above command.
  2. Use parser arguments to override values in the configuration file (an example combining both is shown after the list below).
  • --dataset_name: name of the dataset (e.g., mnist, cifar10).
    • for the cinic-10 dataset, the data should first be downloaded using ./data/cinic10/download.sh.
  • --n_clients: the number of total clients (default: 100).
  • --batch_size: the batch size used for local training (default: 50).
  • --partition_method: non-IID partition strategy (e.g., sharding, lda).
  • --partition_s: shards per user (only for sharding).
  • --partition_alpha: concentration parameter alpha for latent Dirichlet allocation (only for lda); a partition sketch is given after this list.
  • --model_name: model architecture to be used (e.g., fedavg_mnist, fedavg_cifar).
  • --n_rounds: the number of total communication rounds (default: 300).
  • --sample_ratio: fraction of clients to be randomly sampled at each round (default: 0.1).
  • --local_epochs: the number of local epochs (default: 5).
  • --lr: the initial learning rate for local training (default: 0.01).
  • --momentum: the momentum for SGD (default: 0.9).
  • --wd: weight decay for optimization (default: 1e-5).
  • --algo_name: algorithm name of the experiment (e.g., fedavg, fedsol_adaptive).
  • --seed: random seed
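
For example, the following command starts from an algorithm's default configuration and overrides the dataset, partition, and seed from the command line. It assumes a fedsol_adaptive.json config exists in ./configs; the file name and the flag values shown here are illustrative, not prescribed settings from the paper.

    python ./main.py --config_path ./configs/fedsol_adaptive.json --dataset_name cifar10 --partition_method lda --partition_alpha 0.1 --n_rounds 300 --seed 0

Flags that are not passed on the command line keep the values from the JSON configuration file.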

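As a rough illustration of the lda partition controlled by --partition_alpha, the sketch below shows the Dirichlet-based label split commonly used in non-IID federated learning benchmarks. It is a minimal sketch of the general technique, not the implementation in this repository, and the function and variable names are illustrative.

    import numpy as np

    def dirichlet_partition(labels, n_clients, alpha, seed=0):
        """Illustrative LDA-style non-IID split.

        labels: 1-D numpy array of integer class indices.
        For every class, sample per-client proportions from Dirichlet(alpha)
        and assign that class's samples to clients accordingly.
        """
        rng = np.random.default_rng(seed)
        n_classes = int(labels.max()) + 1
        client_indices = [[] for _ in range(n_clients)]
        for c in range(n_classes):
            idx_c = np.where(labels == c)[0]
            rng.shuffle(idx_c)
            # Fraction of class c assigned to each client.
            proportions = rng.dirichlet(alpha * np.ones(n_clients))
            cuts = (np.cumsum(proportions) * len(idx_c)).astype(int)[:-1]
            for client_id, split in enumerate(np.split(idx_c, cuts)):
                client_indices[client_id].extend(split.tolist())
        return client_indices

Smaller alpha values concentrate each class on a few clients (more heterogeneous data), while larger values approach an IID split across clients.
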
Notes

Because this code has been refactored for efficiency and readability, results may differ slightly from those reported in the paper in some cases. If you find discrepancies, please open an issue.

Reference GitHub Repositories

We refer to the following repositories:

Citing this work

@inproceedings{lee2024fedsol,
  title={FedSOL: Stabilized Orthogonal Learning with Proximal Restrictions in Federated Learning},
  author={Lee, Gihun and Jeong, Minchan and Kim, Sangmook and Oh, Jaehoon and Yun, Se-Young},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={12512--12522},
  year={2024}
}