FastPart: Over-Parameterized Stochastic Gradient Descent for Sparse Optimisation on Measures

Sébastien Gadat, Yohann De Castro and Clément Marteau

No 23-1494, TSE Working Papers from Toulouse School of Economics (TSE)

Abstract: This paper presents a novel algorithm that leverages Stochastic Gradient Descent strategies in conjunction with Random Features to augment the scalability of Conic Particle Gradient Descent (CPGD), specifically tailored for solving sparse optimisation problems on measures. By formulating the CPGD steps within a variational framework, we provide rigorous mathematical proofs demonstrating the following key findings: (i) the total variation norms of the solution measures along the descent trajectory remain bounded, ensuring stability and preventing undesirable divergence; (ii) we establish a global convergence guarantee with a convergence rate of O(log(K)/√K) over K iterations, showcasing the efficiency and effectiveness of our algorithm; (iii) additionally, we analyze and establish local control over the first-order condition discrepancy, contributing to a deeper understanding of the algorithm's behavior and reliability in practical applications.
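To make the abstract concrete, the following is a minimal, hedged sketch of a conic-particle-style stochastic gradient descent with random Fourier features on a toy 1-D sparse-measure recovery problem. It is an illustration of the general technique (multiplicative "conic" updates on particle weights, gradient updates on particle positions, stochasticity via mini-batches of random features), not the paper's exact FastPart algorithm; all names, step sizes, and the toy problem itself are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem (illustrative, not from the paper): recover a sparse measure
# mu* = sum_j a_j delta_{t_j} from random Fourier-feature measurements
# y_m = sum_j a_j cos(w_m t_j).
true_pos = np.array([0.2, 0.7])
true_wts = np.array([1.0, 0.5])
M = 200                                   # pool of random frequencies
omegas = rng.normal(scale=8.0, size=M)

def features(x, w):
    # Random Fourier features phi_w(x) = cos(w x); shape (len(w), len(x)).
    return np.cos(np.outer(w, x))

y = features(true_pos, omegas) @ true_wts

# Over-parameterized initialization: many particles with small positive weights.
N = 50
pos = rng.uniform(0.0, 1.0, size=N)
wts = np.full(N, 1.0 / N)

init_loss = 0.5 * np.mean((features(pos, omegas) @ wts - y) ** 2)

eta_r, eta_x, B = 0.2, 5e-4, 32           # step sizes and feature batch size
for k in range(2000):
    idx = rng.choice(M, size=B, replace=False)    # stochastic random features
    Phi = features(pos, omegas[idx])              # (B, N)
    resid = Phi @ wts - y[idx]                    # residual on the mini-batch
    grad_r = Phi.T @ resid / B                    # d(loss)/d(weights)
    # d/dx of cos(w x) is -w sin(w x); chain rule through r_i * phi(x_i):
    dPhi = -omegas[idx][:, None] * np.sin(np.outer(omegas[idx], pos))
    grad_x = (dPhi * wts).T @ resid / B           # d(loss)/d(positions)
    wts = wts * np.exp(-eta_r * grad_r)           # conic (mirror) weight update
    pos = pos - eta_x * grad_x                    # gradient step on positions

final_loss = 0.5 * np.mean((features(pos, omegas) @ wts - y) ** 2)
```

The multiplicative update keeps every particle weight strictly positive, which is the "conic" aspect, while subsampling the feature pool at each step is what makes the descent stochastic and scalable in the number of features.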

Date: 2023-12-11

Downloads: (external link)
https://www.tse-fr.eu/sites/default/files/TSE/docu ... 2023/wp_tse_1494.pdf Full Text (application/pdf)


Persistent link: https://EconPapers.repec.org/RePEc:tse:wpaper:128771


 
Page updated 2025-04-19
Handle: RePEc:tse:wpaper:128771