Communication-efficient federated learning method via redundant data elimination


Saved in:
Bibliographic Details
Main Authors: Kaiju LI, Qiang XU, Hao WANG
Format: Article
Language: Chinese
Published: Editorial Department of Journal on Communications 2023-05-01
Series: Tongxin xuebao
Subjects:
Online Access:http://www.joconline.com.cn/zh/article/doi/10.11959/j.issn.1000-436x.2023072/
Description
Summary: To address the impact of the limited network bandwidth of edge devices on the communication efficiency of federated learning, and to transmit local model updates efficiently for model aggregation, a communication-efficient federated learning method via redundant data elimination was proposed. The essential causes of redundant update parameters were analyzed according to the non-IID data properties and distributed model training characteristics of federated learning; novel definitions of coreset sensitivity and loss-function tolerance were given, and a federated coreset construction algorithm was proposed. Furthermore, to fit the extracted coreset, a distributed adaptive sparse network model evolution mechanism was designed to dynamically adjust the model structure and size before each global training iteration, which reduced the number of communication bits between edge devices and the server while guaranteeing training model accuracy. Experimental results show that the proposed method reduces transmitted communication bits by 17% with only a 0.5% degradation in model accuracy compared with the state-of-the-art method.
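The abstract does not give the paper's exact construction, but sensitivity-based coreset selection of the general kind it describes can be illustrated with a minimal sketch: each sample's sensitivity is approximated by its share of the total loss, samples are drawn with probability proportional to sensitivity, and inverse-probability weights keep the coreset's weighted loss an unbiased estimate of the full dataset's loss. The function name, the loss-based sensitivity proxy, and all parameters here are illustrative assumptions, not the authors' algorithm.

```python
import random

def build_coreset(losses, coreset_size, seed=0):
    """Sketch of sensitivity-proportional coreset sampling.

    losses       -- per-sample loss values on the local dataset
    coreset_size -- number of (possibly repeated) samples to draw
    Returns (indices, weights): sampled indices and importance weights.
    """
    rng = random.Random(seed)
    total = sum(losses)
    # Sensitivity proxy: each sample's relative contribution to the total loss.
    sens = [loss / total for loss in losses]
    indices, weights = [], []
    for _ in range(coreset_size):
        # Draw one index with probability proportional to its sensitivity.
        r, acc = rng.random(), 0.0
        for i, s in enumerate(sens):
            acc += s
            if r <= acc:
                break
        indices.append(i)
        # Inverse-probability weight makes the weighted coreset loss
        # an unbiased estimator of the full-dataset loss.
        weights.append(1.0 / (coreset_size * sens[i]))
    return indices, weights

# Example: high-loss samples are picked more often and weighted less.
idx, w = build_coreset([1.0, 3.0, 2.0, 4.0], coreset_size=2)
```

Only the selected indices and their weights would then participate in local training, which is one way redundant update parameters can be avoided before transmission.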
ISSN:1000-436X