PyTorch implementations of some general federated optimization methods.
FL-Simulator runs on a single CPU/GPU to simulate the federated learning (FL) training process with the PyTorch framework. For example, to train centralized FL with the FedAvg method on ResNet-18 and the CIFAR-10 dataset (10% of 100 total clients active per round, with a heterogeneous Dirichlet-0.6 data split), you can use:
python train.py --non-iid --dataset CIFAR-10 --model ResNet18 --split-rule Dirichlet --split-coef 0.6 --active-ratio 0.1 --total-client 100
Other hyperparameters are described in the train.py file.
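Here, "Dirichlet-0.6" means that, for every class, the fraction of its samples assigned to each client is drawn from a Dirichlet distribution with concentration 0.6; smaller coefficients produce more heterogeneous client datasets. Below is a minimal NumPy sketch of such a split, not FL-Simulator's own implementation; the function name and arguments are illustrative only.

```python
# Illustrative sketch of a Dirichlet label split (not FL-Simulator's code).
import numpy as np

def dirichlet_split(labels, n_clients=100, alpha=0.6, seed=0):
    rng = np.random.default_rng(seed)
    client_indices = [[] for _ in range(n_clients)]
    for c in np.unique(labels):
        idx = np.flatnonzero(labels == c)
        rng.shuffle(idx)
        # Fraction of class-c samples given to each client.
        proportions = rng.dirichlet(alpha * np.ones(n_clients))
        cut_points = (np.cumsum(proportions) * len(idx)).astype(int)[:-1]
        for client_id, part in enumerate(np.split(idx, cut_points)):
            client_indices[client_id].extend(part.tolist())
    return client_indices

# Example: labels = np.array(cifar10_train.targets)
# parts = dirichlet_split(labels, n_clients=100, alpha=0.6)
```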
FL-Simulator pre-defines the basic Server class and Client class, which are executed according to the vanilla FL training process. To implement a new centralized-FL method, you can inherit the base Server class and rewrite the following functions (a minimal sketch follows below):

- `process_for_communication()`: how your method preprocesses the variables to be communicated to the local clients
- `global_update()`: how your method performs the update on the global model
- `postprocess()`: how your method processes the variables received from the local clients
Then, you can define a new client file for this new method.
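As an illustration, here is a minimal sketch of such a server subclass. The three hook names come from the list above; the import path, argument lists, and attribute names (`global_params`, `received_updates`, `global_lr`) are assumptions for illustration and may not match FL-Simulator's exact signatures.

```python
# Sketch only: the hook names are from this README, everything else is assumed.
import torch

from server import Server  # assumed module path; adjust to this repo's layout


class MyMethodServer(Server):
    def process_for_communication(self, client, comm_vecs):
        # Preprocess the variables sent to the sampled client, e.g. a copy of
        # the current global parameters plus any method-specific correction.
        comm_vecs['global_params'] = self.global_params.detach().clone()

    def global_update(self):
        # Update the global model; here a plain FedAvg-style average of the
        # buffered local updates serves as an example.
        avg_update = torch.stack(self.received_updates).mean(dim=0)
        self.global_params = self.global_params + self.global_lr * avg_update
        self.received_updates.clear()

    def postprocess(self, client, received_vecs):
        # Process the variables received from a local client, e.g. buffer its
        # local update for the next global_update() call.
        self.received_updates.append(received_vecs['local_update'])
```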
**CIFAR-10 (ResNet-18-GN), T = 1000**

*10% participation of 100 clients (batch size 50, 5 local epochs)*

| Method | IID | Dir-0.6 | Dir-0.3 | Dir-0.1 |
| --- | --- | --- | --- | --- |
| FedAvg | 82.52 | 80.65 | 79.75 | 77.31 |
| FedProx | 82.54 | 81.05 | 79.52 | 76.86 |
| FedAdam | 84.32 | 82.56 | 82.12 | 77.58 |
| SCAFFOLD | 84.88 | 83.53 | 82.75 | 79.92 |
| FedDyn | 85.46 | 84.22 | 83.22 | 78.96 |
| FedCM | 85.74 | 83.81 | 83.44 | 78.92 |
| MoFedSAM | 87.24 | 85.74 | 85.14 | 81.58 |
| FedSpeed | 87.72 | 86.05 | 85.25 | 82.05 |

*5% participation of 200 clients (batch size 25, 5 local epochs)*

| Method | IID | Dir-0.6 | Dir-0.3 | Dir-0.1 |
| --- | --- | --- | --- | --- |
| FedAvg | 81.09 | 79.93 | 78.66 | 75.21 |
| FedProx | 81.56 | 79.49 | 78.76 | 75.84 |
| FedAdam | 83.29 | 81.22 | 80.22 | 75.83 |
| SCAFFOLD | 84.24 | 83.01 | 82.04 | 78.23 |
| FedDyn | 81.11 | 80.25 | 79.43 | 75.43 |
| FedCM | 83.77 | 82.01 | 80.77 | 75.91 |
| MoFedSAM | 86.27 | 84.71 | 83.44 | 79.02 |
| FedSpeed | 86.87 | 85.07 | 83.94 | 79.66 |
- Decentralized FL Implementation.
If this codebase helps your research, please cite our paper FedSpeed:
@article{sun2023fedspeed,
  title={FedSpeed: Larger local interval, less communication round, and higher generalization accuracy},
  author={Sun, Yan and Shen, Li and Huang, Tiansheng and Ding, Liang and Tao, Dacheng},
  journal={arXiv preprint arXiv:2302.10429},
  year={2023}
}