D2D computation task offloading for efficient federated learning

Bibliographic Details
Main Authors: Xiaoran CAI, Xiaopeng MO, Jie XU
Format: Article
Language: Chinese (zho)
Published: China InfoCom Media Group 2019-12-01
Series: 物联网学报 (Chinese Journal on Internet of Things)
Online Access:http://www.wlwxb.com.cn/zh/article/doi/10.11959/j.issn.2096-3750.2019.00135/
Description
Summary: Federated learning is a distributed machine learning technique. Communication and computation resource constraints at the edge nodes are becoming the performance bottleneck. In particular, when different edge nodes have distinct computation and communication capabilities, the model training performance may degrade severely, thus necessitating joint communication and computation optimization. To tackle this challenge, a computation task offloading scheme enabled by device-to-device (D2D) communications was proposed, in which different edge nodes exchanged data samples via D2D communication links to balance processing capability and task load, so as to minimize the total time delay of machine learning model training. Simulation results show that, compared to a benchmark scheme without such D2D task offloading, the training speed and efficiency of federated learning are improved significantly.
ISSN: 2096-3750
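
The offloading idea described in the abstract can be illustrated with a minimal sketch. This is not the paper's algorithm: it assumes per-node computation time scales as (local sample count) / (compute rate), ignores D2D transfer delay, and the function names `balance_samples` and `round_time` are hypothetical, introduced here only for illustration.

```python
# Illustrative sketch of D2D-style load balancing across heterogeneous edge nodes.
# Assumption: a federated learning round is bounded by the slowest (straggler) node,
# so moving data samples toward faster nodes reduces the per-round delay.

def balance_samples(samples, compute_rates):
    """Redistribute samples in proportion to compute rate so that the
    per-node processing time n_i / c_i is (approximately) equalized."""
    total = sum(samples)
    total_rate = sum(compute_rates)
    target = [total * c / total_rate for c in compute_rates]
    alloc = [int(t) for t in target]
    # Hand any leftover samples (from rounding) to the fastest nodes first.
    remainder = total - sum(alloc)
    order = sorted(range(len(compute_rates)), key=lambda i: -compute_rates[i])
    for i in order[:remainder]:
        alloc[i] += 1
    return alloc

def round_time(samples, compute_rates):
    """Per-round computation delay, dominated by the slowest node."""
    return max(n / c for n, c in zip(samples, compute_rates))

if __name__ == "__main__":
    samples = [1000, 1000, 1000]   # equal local datasets at three edge nodes
    rates = [50.0, 200.0, 400.0]   # heterogeneous compute rates (samples/s)
    print("without offloading:", round_time(samples, rates))                   # 20.0 s, set by the slow node
    print("with offloading:   ", round_time(balance_samples(samples, rates), rates))
```

In this toy setting the round time drops from 20 s to roughly 4.6 s once samples are rebalanced; the actual scheme in the article additionally accounts for the D2D communication cost of moving those samples, which this sketch omits.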