Cyclic Stochastic Gradient Descent Method
Zhijie Xie and
Cong Sun
Additional contact information
Zhijie Xie: Beijing University of Posts and Telecommunications
Cong Sun: Beijing University of Posts and Telecommunications
Journal of Optimization Theory and Applications, 2026, vol. 208, issue 1, No 42, 32 pages
Abstract:
The stochastic gradient descent (SGD) method is a widely used optimization method in machine learning, and its stepsize is crucial to its convergence properties. A cyclic stepsize update strategy for SGD is proposed, combining an approximated Cauchy step with a constant stepsize; the current Cauchy step is approximated by the BB step in the next iteration. Combined with both monotone and nonmonotone linesearches, convergence results for the cyclic SGD method are established, and convergence analyses for different types of problems are provided. Compared with existing theoretical results in the literature, the convergence assumptions for convex and strongly convex problems are weaker: the impractical interpolation condition is removed. Numerical experiments show that the proposed stepsize easily satisfies the linesearch requirement, and that the proposed method outperforms the benchmark methods and is insensitive to initialization.
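The abstract describes alternating a Barzilai-Borwein (BB) step, which approximates the Cauchy step, with a constant stepsize in SGD. The following is a minimal illustrative sketch of that idea, not the paper's exact algorithm: the cycle length, fallback rule, and toy problem are assumptions, and the paper's linesearch safeguards are omitted.

```python
import numpy as np

def cyclic_sgd(grad, x0, const_step=0.05, cycle=2, n_iter=100, seed=0):
    """Illustrative sketch (not the paper's algorithm): SGD whose stepsize
    cyclically alternates between a BB step -- a standard approximation of
    the Cauchy step -- and a constant stepsize."""
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    x_prev, g_prev = None, None
    for k in range(n_iter):
        g = grad(x, rng)  # stochastic gradient sample
        if k % cycle == 0 and g_prev is not None:
            s, y = x - x_prev, g - g_prev
            sy = float(s @ y)
            # BB1 step s^T s / s^T y; fall back to the constant
            # stepsize when the curvature estimate is unreliable
            step = float(s @ s) / sy if sy > 1e-12 else const_step
        else:
            step = const_step
        x_prev, g_prev = x.copy(), g
        x = x - step * g
    return x

# Toy strongly convex problem: f(x) = 0.5 * ||x||^2 with gradient noise
noisy_grad = lambda x, rng: x + 0.01 * rng.standard_normal(x.shape)
x_star = cyclic_sgd(noisy_grad, np.ones(5))
```

On this quadratic the BB step recovers the inverse curvature (here close to 1), so the iterates contract toward the minimizer up to the gradient-noise level; the paper's actual method additionally enforces monotone or nonmonotone linesearch conditions on the stepsize.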
Keywords: stochastic gradient descent; cyclic gradient method; nonmonotone linesearch; machine learning; 49J53; 49K99
Date: 2026
Downloads: (external link)
http://link.springer.com/10.1007/s10957-025-02867-2 Abstract (text/html)
Access to the full text of the articles in this series is restricted.
Persistent link: https://EconPapers.repec.org/RePEc:spr:joptap:v:208:y:2026:i:1:d:10.1007_s10957-025-02867-2
Ordering information: This journal article can be ordered from
http://www.springer. ... cs/journal/10957/PS2
DOI: 10.1007/s10957-025-02867-2
Journal of Optimization Theory and Applications is currently edited by Franco Giannessi and David G. Hull
More articles in Journal of Optimization Theory and Applications from Springer
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.