EconPapers
Subgradient sampling for nonsmooth nonconvex minimization

Tam Le, Jérôme Bolte and Edouard Pauwels

No 22-1310, TSE Working Papers from Toulouse School of Economics (TSE)

Abstract: Risk minimization for nonsmooth nonconvex problems naturally leads to first-order sampling or, by an abuse of terminology, to stochastic subgradient descent. We establish the convergence of this method in the path-differentiable case and describe more precise results under additional geometric assumptions. We recover and improve results of Ermoliev-Norkin [27] by using a different approach: conservative calculus and the ODE method. In the definable case, we show that first-order subgradient sampling avoids artificial critical points with probability one and, moreover, applies to a large range of risk minimization problems in deep learning based on the backpropagation oracle. As byproducts of our approach, we obtain several results on integration of independent interest, such as an interchange result for conservative derivatives and integrals, and the definability of set-valued parameterized integrals.
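The method studied in the paper can be illustrated on a toy problem. The sketch below is not the authors' code; it is a minimal, hypothetical example of stochastic subgradient sampling on the scalar nonsmooth risk E|x - ξ|, whose minimizer is the median of ξ. At each step a sample ξ_k is drawn, one element of the subdifferential of |· - ξ_k| is evaluated, and a diminishing-step subgradient move is taken.

```python
import random

def subgrad_abs(x, xi):
    # One element of the subdifferential of f(x) = |x - xi|:
    # sign(x - xi), choosing 0 at the kink x == xi.
    if x > xi:
        return 1.0
    if x < xi:
        return -1.0
    return 0.0

def subgradient_sampling(x0, sample_xi, steps=10000, a=0.5):
    # Stochastic subgradient method with diminishing steps a / sqrt(k + 1).
    x = x0
    for k in range(steps):
        xi = sample_xi()          # draw a sample of the random variable
        g = subgrad_abs(x, xi)    # sample a subgradient at the current point
        x -= a / (k + 1) ** 0.5 * g
    return x

random.seed(0)
# Minimize E|x - xi| with xi uniform on {0, 1, 2}; the minimizer is the median, 1.
x_star = subgradient_sampling(5.0, lambda: random.choice([0.0, 1.0, 2.0]))
```

Under the diminishing-step condition the iterates track the subgradient flow, which is the ODE viewpoint the abstract refers to; the function names and the toy distribution above are illustrative assumptions, not part of the paper.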

Date: 2022-02

Downloads: (external link)
https://www.tse-fr.eu/sites/default/files/TSE/docu ... 2022/wp_tse_1310.pdf Working paper (application/pdf)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:tse:wpaper:126674

Access Statistics for this paper

More papers in TSE Working Papers from Toulouse School of Economics (TSE). Contact information at EDIRC.

 
Page updated 2025-04-01
Handle: RePEc:tse:wpaper:126674