Federated Collaborative Learning with Sparse Gradients for Heterogeneous Data on Resource-Constrained Devices

Federated learning enables devices to train models collaboratively while protecting data privacy. However, the computing power, memory, and communication capabilities of IoT devices are limited, making it difficult to train large-scale models on these devices. To train large models on resource-constrained devices, federated split learning enables parallel training across multiple devices by partitioning the model among them. Under this framework, however, clients depend heavily on the server's computing resources, and a large number of model parameters must be transmitted during communication, which leads to low training efficiency. In addition, because the data distribution is heterogeneous across clients, the trained global model is difficult to apply to all clients. To address these challenges, this paper designs a sparse gradient collaborative federated learning model for heterogeneous data on resource-constrained devices. First, a sparse gradient strategy is designed that introduces a position mask to reduce communication traffic; to minimize accuracy loss, a dequantization strategy restores the original dense gradient tensor. Second, each client's influence on the global model is measured by Euclidean distance, and on this basis an adaptive weighting strategy assigns an aggregation weight to each client. Finally, the sparse gradient quantization method is combined with the adaptive weighting strategy to form a collaborative federated learning algorithm for heterogeneous data distributions. Extensive experiments demonstrate that the proposed algorithm achieves high classification efficiency and effectively addresses the challenges posed by data heterogeneity.
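
The sparse-gradient step described in the abstract can be pictured as a top-k compressor: each client ships only the largest-magnitude gradient entries together with a position mask, and the receiver dequantizes and scatters them back into a dense tensor. The Python sketch below is a minimal illustration of that idea under stated assumptions (top-k selection, a symmetric int8 quantization scheme, and illustrative function names and keep ratio), not the authors' implementation.

```python
# Minimal sketch: top-k gradient sparsification with a position mask,
# int8 quantization of the surviving values, and dequantization back to a
# dense tensor. All names and the quantization scheme are illustrative.
import numpy as np

def sparsify_with_mask(grad: np.ndarray, keep_ratio: float = 0.05):
    """Keep the largest-magnitude entries; return quantized values, a scale,
    a boolean position mask, and the original tensor shape."""
    flat = grad.ravel()
    k = max(1, int(keep_ratio * flat.size))
    top_idx = np.argpartition(np.abs(flat), -k)[-k:]     # positions of the top-k entries
    mask = np.zeros(flat.size, dtype=bool)
    mask[top_idx] = True
    values = flat[mask]
    scale = np.abs(values).max() / 127.0 + 1e-12          # symmetric int8 scale
    q_values = np.clip(np.round(values / scale), -127, 127).astype(np.int8)
    return q_values, scale, mask, grad.shape

def dequantize_to_dense(q_values, scale, mask, shape):
    """Dequantize the int8 values and scatter them into the masked positions;
    every other entry of the restored dense gradient is zero."""
    dense = np.zeros(mask.size, dtype=np.float32)
    dense[mask] = q_values.astype(np.float32) * scale
    return dense.reshape(shape)

# Round trip on a toy gradient: only ~5% of entries survive, so the payload is
# roughly the int8 values, one float scale, and a 1-bit-per-position mask.
g = np.random.randn(4, 256).astype(np.float32)
payload = sparsify_with_mask(g, keep_ratio=0.05)
g_hat = dequantize_to_dense(*payload)
```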

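The Euclidean-distance-based adaptive weighting can be read, under one plausible interpretation, as down-weighting clients whose updates lie far from the current global model and normalizing the weights before aggregation; the exact weighting rule in the paper may differ. A minimal sketch of that interpretation:

```python
# Minimal sketch (an assumed weighting rule, not the paper's exact formula):
# clients closer to the global model in Euclidean distance get larger
# aggregation weights, and a weighted average forms the new global model.
import numpy as np

def adaptive_weights(global_params: np.ndarray, client_params: list) -> np.ndarray:
    """Weight each client inversely to its Euclidean distance from the global model."""
    dists = np.array([np.linalg.norm(p - global_params) for p in client_params])
    inv = 1.0 / (dists + 1e-8)          # closer client -> larger weight
    return inv / inv.sum()              # normalize so the weights sum to 1

def aggregate(global_params: np.ndarray, client_params: list) -> np.ndarray:
    """Weighted average of the client parameters using the adaptive weights."""
    w = adaptive_weights(global_params, client_params)
    return sum(wi * pi for wi, pi in zip(w, client_params))

# Toy round: the client that drifted far from the global model (heterogeneous
# data) contributes least to the aggregate.
g = np.zeros(10, dtype=np.float32)
clients = [g + 0.1 * np.random.randn(10), g + 0.1 * np.random.randn(10), g + 5.0]
new_global = aggregate(g, clients)
```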

Bibliographic Details
Main Authors: Mengmeng Li, Xin He, Jinhua Chen
Author Affiliations: Mengmeng Li (College of Computer and Information Engineering, Henan University, Kaifeng 475001, China); Xin He (Henan International Joint Laboratory of Intelligent Network Theory and Key Technology, Henan University, Kaifeng 475001, China); Jinhua Chen (Henan International Joint Laboratory of Intelligent Network Theory and Key Technology, Henan University, Kaifeng 475001, China)
Format: Article
Language: English
Published: MDPI AG, 2024-12-01
Series: Entropy, Vol. 26, Iss. 12, Article 1099
ISSN: 1099-4300
DOI: 10.3390/e26121099
Collection: DOAJ
Subjects: federated split learning; resource-constrained devices; heterogeneous data; sparse gradient; adaptive weight
Online Access: https://www.mdpi.com/1099-4300/26/12/1099