
Article Information

  • Title: Federated Learning with Random Communication and Dynamic Aggregation
  • Authors: Ruolin Huang ; Ting Lu ; Yiyang Luo
  • Journal: Computer Science & Information Technology
  • E-ISSN: 2231-5403
  • Year: 2021
  • Volume: 11
  • Issue: 18
  • Language: English
  • Publisher: Academy & Industry Research Collaboration Center (AIRCC)
  • Abstract: Federated Learning (FL) is a setting in which clients collaboratively train a joint global model while keeping their data local. Because FL offers data confidentiality and distributed computing, interest in this area has grown. In this paper, we design a new FL algorithm named FedRAD, for which we propose a random communication method and a dynamic aggregation method. The random communication method lets the FL system combine a fixed communication interval with constrained variable intervals within a single task. The dynamic aggregation method reforms the aggregation weights and updates them automatically. Both methods aim to improve model performance. We evaluate the two proposed methods separately and compare FedRAD with three algorithms across three hyperparameters. Results on CIFAR-10 show that each method performs well and that FedRAD achieves higher classification accuracy than state-of-the-art FL algorithms.
  • Keywords: Federated Learning; Random Communication; Dynamic Aggregation; Self-learning; Distributed Computing
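The two mechanisms the abstract describes can be illustrated with a minimal sketch. The paper's exact FedRAD update rules are not given here, so the function names (`aggregate`, `update_weights`, `next_interval`), the exponential loss-based re-weighting, and the base-plus-bounded-jitter interval schedule are all illustrative assumptions, not the authors' implementation:

```python
import math
import random

def aggregate(client_params, weights):
    """FedAvg-style aggregation: a weighted average of client parameter
    vectors (illustrative; FedRAD's actual rule may differ)."""
    total = sum(weights)
    norm = [w / total for w in weights]  # normalize to a convex combination
    dim = len(client_params[0])
    return [sum(norm[k] * client_params[k][i] for k in range(len(norm)))
            for i in range(dim)]

def update_weights(weights, client_losses, temperature=0.5):
    """Hypothetical dynamic re-weighting: clients reporting lower local loss
    receive larger aggregation weights on the next round."""
    scores = [w * math.exp(-temperature * loss)
              for w, loss in zip(weights, client_losses)]
    total = sum(scores)
    return [s / total for s in scores]  # renormalize so weights sum to 1

def next_interval(base=5, jitter=2, rng=None):
    """Hypothetical random communication schedule: a fixed base interval
    perturbed by a bounded random offset, mixing fixed and constrained
    variable intervals within one task."""
    rng = rng or random.Random(0)
    return base + rng.randint(-jitter, jitter)
```

In this sketch the server would call `next_interval` to decide how many local epochs to run before the next synchronization, then `aggregate` the returned models and `update_weights` from the clients' reported losses.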