Graphical Convergence of Subgradients in Nonconvex Optimization and Learning
Damek Davis and Dmitriy Drusvyatskiy
Additional contact information
Damek Davis: School of Operations Research and Information Engineering, Cornell University, Ithaca, New York 14850
Dmitriy Drusvyatskiy: Department of Mathematics, University of Washington, Seattle, Washington 98195
Mathematics of Operations Research, 2022, vol. 47, issue 1, 209-231
Abstract:
We investigate the stochastic optimization problem of minimizing population risk, where the loss defining the risk is assumed to be weakly convex. Compositions of Lipschitz convex functions with smooth maps are the primary examples of such losses. We analyze the estimation quality of such nonsmooth and nonconvex problems by their sample average approximations. Our main results establish dimension-dependent rates on subgradient estimation in full generality and dimension-independent rates when the loss is a generalized linear model. As an application of the developed techniques, we analyze the nonsmooth landscape of a robust nonlinear regression problem.
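The loss class described in the abstract — a Lipschitz convex function composed with a smooth map — can be illustrated with a minimal numerical sketch. The one-dimensional model below (the measurement vector `a`, the loss `|(a*x)**2 - b|`, and the brute-force Moreau envelope) is purely illustrative and is not taken from the paper: it shows a sample average approximation of a weakly convex risk, a chain-rule subgradient, and the Moreau envelope as the smoothing object the analysis is built on.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D robust nonlinear regression setup (illustrative only):
# loss(x; a, b) = |(a*x)^2 - b| composes the Lipschitz convex function |.|
# with the smooth map x -> (a*x)^2, and is therefore weakly convex.
a = rng.normal(size=200)
x_star = 1.5
b = (a * x_star) ** 2  # noiseless measurements at the true signal

def sample_loss(x):
    # sample average approximation (SAA) of the population risk
    return np.mean(np.abs((a * x) ** 2 - b))

def sample_subgrad(x):
    # chain-rule subgradient: sign of the residual times the
    # derivative of the smooth inner map x -> (a*x)^2
    r = (a * x) ** 2 - b
    return np.mean(np.sign(r) * 2.0 * a ** 2 * x)

def moreau_envelope(x, lam=0.1, grid=np.linspace(-3.0, 3.0, 3001)):
    # brute-force Moreau envelope: min_y f(y) + (1/(2*lam)) * (y - x)^2,
    # approximated by a grid search (fine for this 1-D illustration)
    return min(sample_loss(y) + (y - x) ** 2 / (2.0 * lam) for y in grid)

print(sample_loss(x_star))   # zero risk at the true signal
print(sample_subgrad(0.5))   # a subgradient of the SAA at x = 0.5
print(moreau_envelope(0.5))  # smoothed value, never exceeds sample_loss(0.5)
```

The envelope minorizes the loss by construction, so `moreau_envelope(x) <= sample_loss(x)` at every point where the grid contains `x`; stationarity of the envelope is the surrogate notion of criticality commonly used for weakly convex problems.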
Keywords: Primary: 90C15; secondary: 68Q32; 65K10; subdifferential; stability; population risk; sample average approximation; weak convexity; Moreau envelope; graphical convergence
Date: 2022
Downloads: http://dx.doi.org/10.1287/moor.2021.1126 (application/pdf)
Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.
Persistent link: https://EconPapers.repec.org/RePEc:inm:ormoor:v:47:y:2022:i:1:p:209-231
More articles in Mathematics of Operations Research from INFORMS Contact information at EDIRC.
Bibliographic data for series maintained by Chris Asher.