Status: Published
Heterogeneity-Aware Coordination for Federated Learning via Stitching Pre-trained Blocks
Zhan, Shichen1; Wu, Yebo1; Tian, Chunlin1; Zhao, Yan2; Li, Li1
2024-09
Conference Name: 32nd IEEE/ACM International Symposium on Quality of Service, IWQoS 2024
Source Publication: 2024 IEEE/ACM 32nd International Symposium on Quality of Service (IWQoS)
Conference Date: 19-21 June 2024
Conference Place: Guangzhou, China
Country: China
Publisher: Institute of Electrical and Electronics Engineers Inc.
Abstract

Federated learning (FL) coordinates multiple devices to collaboratively train a shared model while preserving data privacy. However, the large memory footprint and high energy consumption of the training process exclude low-end devices from contributing their own data to the global model, which severely degrades model performance in real-world scenarios. In this paper, we propose FedStitch, a hierarchical coordination framework for heterogeneous federated learning with pre-trained blocks. Unlike traditional approaches that train the global model from scratch, FedStitch composes the global model for a new task by stitching pre-trained blocks. Specifically, each participating client selects the most suitable block, based on its local data, from a candidate pool composed of blocks drawn from pre-trained models. The server then aggregates these selections and stitches in the optimal block. This process iterates until a new stitched network is generated. Beyond this new training paradigm, FedStitch comprises three core components: 1) an RL-weighted aggregator and 2) a search space optimizer, both deployed on the server side, and 3) a local energy optimizer deployed on each participating client. The RL-weighted aggregator helps select the right block under non-IID data, while the search space optimizer continuously shrinks the candidate block pool during stitching. Meanwhile, the local energy optimizer minimizes the energy consumption of each client while guaranteeing overall training progress. The results demonstrate that, compared to existing approaches, FedStitch improves model accuracy by up to 20.93%, achieves up to 8.12× speedup, reduces the memory footprint by up to 79.5%, and saves up to 89.41% of energy during the learning procedure.
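To make the iterative coordination loop described above concrete, the following is a minimal Python sketch of the pattern the abstract outlines: clients score candidate blocks on local data, the server picks the best block via a weighted vote and appends it, and the candidate pool shrinks each round. All names (score_block, fedstitch_round, etc.), the scoring placeholder, and the pruning rule are illustrative assumptions; the paper's actual RL-weighted aggregation, block-scoring metric, and search-space optimization are not specified in this record.

```python
# Hypothetical sketch of a FedStitch-style coordination loop, based only
# on the abstract. Not the paper's actual algorithm or interfaces.
import random
from typing import Dict, List


def score_block(block: str, local_data: object) -> float:
    """Client side: rate how well `block` fits the partially stitched
    network on this client's data (placeholder: random score)."""
    return random.random()


def fedstitch_round(candidate_pool: List[str],
                    clients: Dict[str, object],
                    client_weights: Dict[str, float]) -> str:
    """Server side: collect per-client block scores and pick the block
    with the highest weighted score. In the paper, the weights come from
    an RL-weighted aggregator to handle non-IID client data."""
    totals = {block: 0.0 for block in candidate_pool}
    for cid, data in clients.items():
        for block in candidate_pool:
            totals[block] += client_weights[cid] * score_block(block, data)
    return max(totals, key=totals.get)


def fedstitch(candidate_pool: List[str],
              clients: Dict[str, object],
              num_layers: int) -> List[str]:
    """Iterate block selection until the stitched network is complete."""
    client_weights = {cid: 1.0 / len(clients) for cid in clients}
    stitched: List[str] = []
    for _ in range(num_layers):
        best = fedstitch_round(candidate_pool, clients, client_weights)
        stitched.append(best)
        # Search-space optimizer: shrink the pool as stitching proceeds
        # (here we simply drop the chosen block; the paper's pruning
        # rule is more involved).
        candidate_pool = [b for b in candidate_pool if b != best]
    return stitched
```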

Keywords: Federated Learning; Pre-training; Resource-efficient Training; Performance Evaluation; Energy Consumption; Accuracy; Memory Management; Quality of Service
DOI: 10.1109/IWQoS61813.2024.10682959
Language: English
Scopus ID: 2-s2.0-85206349116
Document Type: Conference paper
Collection: The State Key Laboratory of Internet of Things for Smart City (University of Macau)
Corresponding Author: Li, Li
Affiliation:
1. University of Macau, State Key Laboratory of Internet of Things for Smart City, Macao
2. Bytedance Inc., China
First Author Affiliation: University of Macau
Corresponding Author Affiliation: University of Macau
Recommended Citation
GB/T 7714
Zhan, Shichen, Wu, Yebo, Tian, Chunlin, et al. Heterogeneity-Aware Coordination for Federated Learning via Stitching Pre-trained Blocks[C]. Institute of Electrical and Electronics Engineers Inc., 2024.
APA: Zhan, S., Wu, Y., Tian, C., Zhao, Y., & Li, L. (2024). Heterogeneity-aware coordination for federated learning via stitching pre-trained blocks. 2024 IEEE/ACM 32nd International Symposium on Quality of Service (IWQoS).
Files in This Item:
There are no files associated with this item.
Google Scholar
Similar articles in Google Scholar
[Zhan, Shichen]'s Articles
[Wu, Yebo]'s Articles
[Tian, Chunlin]'s Articles
Baidu academic
Similar articles in Baidu academic
[Zhan, Shichen]'s Articles
[Wu, Yebo]'s Articles
[Tian, Chunlin]'s Articles
Bing Scholar
Similar articles in Bing Scholar
[Zhan, Shichen]'s Articles
[Wu, Yebo]'s Articles
[Tian, Chunlin]'s Articles
 

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.