A Decentralized Federated Learning Based on Node Selection and Knowledge Distillation
Zhongchang Zhou,
Fenggang Sun,
Xiangyu Chen,
Dongxu Zhang,
Tianzhen Han and
Peng Lan
Additional contact information
Zhongchang Zhou: College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
Fenggang Sun: College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
Xiangyu Chen: College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
Dongxu Zhang: Taishan Intelligent Manufacturing Industry Research Institute, Tai’an 271000, China
Tianzhen Han: Network Department Optimization Center, Taian Chinamobile, Tai’an 271000, China
Peng Lan: College of Information Science and Engineering, Shandong Agricultural University, Tai’an 271018, China
Mathematics, 2023, vol. 11, issue 14, 1-15
Abstract:
Federated learning has become increasingly important in modern machine learning, especially in privacy-sensitive scenarios. Existing federated learning mainly adopts a network topology with a central server, but this makes the training process susceptible to failures of the central node. To address this problem, this article proposes a decentralized federated learning method based on node selection and knowledge distillation. Specifically, the central node in this method is variable and is selected through indicator interaction among nodes. Meanwhile, a knowledge distillation mechanism is added to keep the student model as close as possible to the teacher network and to ensure the model’s accuracy. Experiments were conducted on the public MNIST, CIFAR-10, and FEMNIST datasets under both the independent and identically distributed (IID) setting and the non-IID setting. Numerical results show that the proposed method achieves improved accuracy compared with centralized federated learning, and its computing time is greatly reduced, with little accuracy loss, compared with blockchain-based decentralized federated learning. Therefore, the proposed method guarantees the model’s effectiveness while meeting the individual model requirements of each node and reducing the running time.
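The sketch below illustrates, in PyTorch, the two mechanisms the abstract describes: electing a temporary central node from indicators exchanged between nodes, and a standard knowledge-distillation loss that pulls each student model toward the teacher network. This is a minimal sketch, not the authors' code: the indicator fields ("compute", "bandwidth"), their weighting, and the function names are illustrative assumptions; the paper defines the actual indicators and distillation details.

# Minimal sketch of node selection plus knowledge distillation.
# Indicator names and weights below are hypothetical, not from the paper.
import torch
import torch.nn.functional as F

def select_central_node(indicators):
    """Pick the node with the highest capability score.

    indicators: dict mapping node_id -> {"compute": float, "bandwidth": float}
    """
    def score(ind):
        # Assumed equal weighting; the actual indicators/weights are paper-specific.
        return 0.5 * ind["compute"] + 0.5 * ind["bandwidth"]
    return max(indicators, key=lambda nid: score(indicators[nid]))

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Standard KD objective: softened-logit KL term plus hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradient magnitude is comparable across temperatures
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Usage: node 2 has the best indicators and is elected as the round's central node.
indicators = {0: {"compute": 0.7, "bandwidth": 0.4},
              1: {"compute": 0.5, "bandwidth": 0.9},
              2: {"compute": 0.9, "bandwidth": 0.8}}
print(select_central_node(indicators))  # -> 2

student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
print(distillation_loss(student_logits, teacher_logits, labels))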
Keywords: federated learning; node selection; decentralized learning; knowledge distillation
JEL-codes: C
Date: 2023
Downloads:
https://www.mdpi.com/2227-7390/11/14/3162/pdf (application/pdf)
https://www.mdpi.com/2227-7390/11/14/3162/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:11:y:2023:i:14:p:3162-:d:1197005