EconPapers    

A locally sequentially reweighted gradient descent estimator to enhance statistical efficiency for decentralized federated learning

Yue Chen, Peng Lin and Baoxue Zhang
Additional contact information
Yue Chen: Capital University of Economics and Business
Peng Lin: Capital University of Economics and Business
Baoxue Zhang: Capital University of Economics and Business

Computational Statistics, 2025, vol. 40, issue 9, No 29, 5677-5728

Abstract: Federated learning has become increasingly important for training collaborative models due to its privacy-preserving capabilities. While many studies have considered the numerical convergence of federated learning algorithms, far less attention has been given to their statistical convergence. In this paper, to enhance statistical efficiency, we propose a novel Locally Sequentially Re-weighted Gradient Descent (LSRGD) estimator for decentralized federated learning. We show that under a linear regression model with homogeneous data, the LSRGD estimator is asymptotically normal and, with an appropriate learning rate, achieves statistical efficiency comparable to that of the global estimator. We then extend this optimal statistical efficiency to general models and loss functions with heterogeneous data. We also propose a parallel version of the LSRGD algorithm, referred to as LSRGD-P, which exhibits numerical and statistical convergence properties similar to those of LSRGD. Finally, extensive experiments demonstrate that the proposed LSRGD and LSRGD-P estimators achieve superior statistical efficiency compared to existing competitors while maintaining comparable convergence speeds and computational efficiency. This advantage is particularly pronounced when sample sizes vary significantly across clients, highlighting the robustness and adaptability of the proposed methods in heterogeneous federated learning environments with imbalanced data.
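To make the imbalanced-data setting concrete: the abstract does not specify the LSRGD update rule, but the general idea of re-weighting client contributions by sample size (rather than averaging clients uniformly) can be sketched for the linear regression case the paper analyzes. Everything below — the weights, the loss, the learning rate, and the single-server aggregation — is an illustrative assumption, not the authors' algorithm.

```python
# Illustrative sketch only: sample-size-proportional re-weighting of local
# gradients for linear regression with imbalanced clients. This is NOT the
# LSRGD estimator from the paper, whose update rule is not given here.
import numpy as np

rng = np.random.default_rng(0)
p = 3
theta_true = np.array([1.0, -2.0, 0.5])

# Clients with deliberately imbalanced sample sizes.
sizes = [20, 200, 50]
clients = []
for n in sizes:
    X = rng.normal(size=(n, p))
    y = X @ theta_true + 0.1 * rng.normal(size=n)
    clients.append((X, y))

total_n = sum(sizes)
weights = [n / total_n for n in sizes]  # weight each client by its share of data

theta = np.zeros(p)
lr = 0.1
for _ in range(500):
    # Each client computes its local least-squares gradient X'(X theta - y)/n;
    # the aggregate weights clients by sample size instead of uniformly.
    grad = sum(w * (X.T @ (X @ theta - y)) / len(y)
               for w, (X, y) in zip(weights, clients))
    theta -= lr * grad

print(np.round(theta, 2))
```

With uniform averaging, the 20-observation client would get the same influence as the 200-observation one, inflating the variance of the estimate; the proportional weights recover (approximately) the gradient of the pooled loss over all 270 observations.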

Keywords: Imbalanced data; Gradient descent; Statistical efficiency; Re-weighting strategy; Deep learning (search for similar items in EconPapers)
Date: 2025

Downloads: (external link)
http://link.springer.com/10.1007/s00180-025-01667-6 Abstract (text/html)
Access to the full text of the articles in this series is restricted.



Persistent link: https://EconPapers.repec.org/RePEc:spr:compst:v:40:y:2025:i:9:d:10.1007_s00180-025-01667-6

Ordering information: This journal article can be ordered from
http://www.springer.com/statistics/journal/180/PS2

DOI: 10.1007/s00180-025-01667-6


Computational Statistics is currently edited by Wataru Sakamoto, Ricardo Cao and Jürgen Symanzik

More articles in Computational Statistics from Springer
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Page updated 2025-11-18
Handle: RePEc:spr:compst:v:40:y:2025:i:9:d:10.1007_s00180-025-01667-6