Dimension-Independent Convergence Rate for Adagrad with Heavy-Ball Momentum
Kyunghun Nam and Sejun Park
Additional contact information
Kyunghun Nam: Department of Artificial Intelligence, Korea University, Seoul 02841, Republic of Korea
Sejun Park: Department of Artificial Intelligence, Korea University, Seoul 02841, Republic of Korea
Mathematics, 2025, vol. 13, issue 4, 1-18
Abstract:
In this study, we analyze the convergence rate of Adagrad with heavy-ball momentum for non-convex optimization problems. We establish the first dimension-independent convergence rate under the (L_0, L_1)-smoothness assumption, a generalization of standard L-smoothness. We show an O(1/√T) convergence rate under bounded noise in the stochastic gradients, where the noise bound can scale with the current optimality gap and gradient norm.
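For readers who want a concrete picture of the algorithm family analyzed above, below is a minimal sketch of Adagrad with heavy-ball momentum, using the scalar (norm) version of the Adagrad step size, which is the natural candidate for dimension-independent rates. The exact update rule, step-size constants, and hyperparameter names (eta, beta, eps) are illustrative assumptions, not the authors' method. For reference, (L_0, L_1)-smoothness is commonly defined by the condition ||∇²f(x)|| ≤ L_0 + L_1 ||∇f(x)||, which recovers standard L-smoothness when L_1 = 0.

    import numpy as np

    def adagrad_heavy_ball(grad, x0, T, eta=0.1, beta=0.9, eps=1e-8):
        # Runs T steps of norm-version Adagrad with heavy-ball momentum on a
        # stochastic gradient oracle `grad`; hyperparameters are illustrative,
        # not the paper's exact constants.
        x = np.asarray(x0, dtype=float)
        m = np.zeros_like(x)   # heavy-ball momentum buffer
        v = 0.0                # scalar accumulator of squared gradient norms
        for _ in range(T):
            g = grad(x)
            v += float(np.dot(g, g))              # accumulate ||g_t||^2
            m = beta * m + g                      # heavy-ball momentum update
            x = x - (eta / np.sqrt(v + eps)) * m  # Adagrad-Norm step
        return x

    # Toy usage: f(x) = 0.5 * ||x||^2 with small Gaussian gradient noise.
    rng = np.random.default_rng(0)
    noisy_grad = lambda x: x + 0.01 * rng.standard_normal(x.shape)
    x_final = adagrad_heavy_ball(noisy_grad, x0=np.ones(10), T=2000)
    print(np.linalg.norm(x_final))  # gradient norm shrinks toward 0

The two ingredients whose interaction a high-probability analysis must control are visible here: the momentum buffer m, which mixes past stochastic gradients, and the cumulative denominator √v, which couples the step size to the entire gradient history.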
Keywords: non-convex optimization; high-probability convergence rate; Adagrad
JEL-codes: C
Date: 2025
Downloads:
https://www.mdpi.com/2227-7390/13/4/681/pdf (application/pdf)
https://www.mdpi.com/2227-7390/13/4/681/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:13:y:2025:i:4:p:681-:d:1594925