An Asynchronous Federated Learning Algorithm Based on a Backup Update of Model Version Parameters

Yu Sun, Hui Li, Ying Shen, Jianfei Xie, Yanan Zhao, Xin Gao, Nong Si

2023 3rd International Conference on Electrical Engineering and Control Science (IC2ECS), 2023

Abstract
Federated Learning (FL) is a practical approach to alleviating the "data island" problem caused by privacy-leakage concerns in artificial intelligence (AI) security research. It enables multiple users to cooperatively train a shared machine learning model without uploading their private data sets. However, during FL model training, effectively coordinating communication between the server and the participating clients is key to improving training performance, and existing FL algorithms cannot balance model performance with the stability of training accuracy. This paper proposes a novel FL algorithm based on a backup update of model version parameters (FedAvu). Adding a model version that lets the server and clients execute different training processes improves training efficiency and keeps the training accuracy stable. Experimental results show that the proposed method achieves better training efficiency on the MNIST and CIFAR-10 data sets than existing algorithms.
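To make the version-and-backup idea concrete, the following Python sketch shows one plausible reading of such a scheme: the server tags the global model with a version number, backs up the previous parameters before applying an asynchronously arriving client update, and discounts or drops updates computed against an outdated version. The class name FedAvuServer, the mixing weight alpha, the staleness threshold, and the backup-then-overwrite rule are illustrative assumptions, not the paper's exact FedAvu update rules.

import numpy as np

class FedAvuServer:
    """Sketch of a server that tags the global model with a version number,
    keeps a backup of the previous parameters, and discounts stale updates."""

    def __init__(self, init_weights, alpha=0.6, max_staleness=5):
        self.weights = [w.copy() for w in init_weights]   # current global parameters
        self.backup = [w.copy() for w in init_weights]    # parameters of the previous version
        self.version = 0                                  # global model version counter
        self.alpha = alpha                                # base mixing weight for a fresh update
        self.max_staleness = max_staleness                # updates older than this are dropped

    def pull(self):
        # A client downloads the current parameters together with the version tag.
        return [w.copy() for w in self.weights], self.version

    def push(self, client_weights, client_version):
        # Merge an asynchronously arriving client update, discounted by staleness.
        staleness = self.version - client_version
        if staleness > self.max_staleness:
            return False                                  # too stale: keep current global model
        mix = self.alpha / (1.0 + staleness)              # older updates contribute less
        self.backup = [w.copy() for w in self.weights]    # back up before overwriting
        self.weights = [(1.0 - mix) * g + mix * c
                        for g, c in zip(self.weights, client_weights)]
        self.version += 1
        return True

# Usage: one client pulls version v, simulates local training, then pushes back.
server = FedAvuServer([np.zeros((4, 4)), np.zeros(4)])
w, v = server.pull()
local = [wi + 0.01 * np.random.randn(*wi.shape) for wi in w]  # stand-in for local SGD
server.push(local, v)

The staleness-discounted merge here is a common heuristic for asynchronous FL and only stands in for whatever weighting FedAvu actually applies.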
Keywords
Machine learning, Artificial Intelligence, Federated learning, Privacy protection, Asynchronous update, Data leakage