A mini-batch stochastic conjugate gradient algorithm with variance reduction
Caixia Kou (Beijing University of Posts and Telecommunications) and
Han Yang (Beijing University of Posts and Telecommunications)
Journal of Global Optimization, 2023, vol. 87, issue 2, No 29, 1009-1025
Abstract:
The stochastic gradient descent method is popular for large-scale optimization, but its asymptotic convergence is slow because of the inherent variance of the stochastic gradients. To remedy this problem, many explicit variance reduction methods for stochastic gradient descent have been proposed, such as SVRG [Johnson and Zhang, Advances in Neural Information Processing Systems, 2013, pp. 315-323], SAG [Roux et al., Advances in Neural Information Processing Systems, 2012, pp. 2663-2671] and SAGA [Defazio et al., Advances in Neural Information Processing Systems, 2014, pp. 1646-1654]. We consider the conjugate gradient method, which has the same per-iteration computation cost as gradient descent. In this paper, in the spirit of SAGA, we propose a stochastic conjugate gradient algorithm, which we call SCGA. With Fletcher-Reeves type choices of the conjugate parameter, we prove a linear convergence rate for smooth and strongly convex functions. We experimentally demonstrate that SCGA converges faster than popular SGD-type algorithms on four machine learning models, which may be convex, nonconvex or nonsmooth. On regression problems, SCGA is competitive with CGVR, which, to our knowledge, is so far the only other stochastic conjugate gradient algorithm with variance reduction.
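To make the idea concrete, below is a minimal Python sketch of how a SAGA-style variance-reduced gradient estimate can be combined with a Fletcher-Reeves conjugate direction, in the spirit described by the abstract. This is not the authors' SCGA: the function and parameter names (scga_sketch, grad_i, lr, epochs), the fixed step size, and the exact form of the Fletcher-Reeves coefficient are assumptions made purely for illustration.

```python
import numpy as np


def scga_sketch(grad_i, x0, n, lr=0.1, epochs=5, seed=0):
    """Hypothetical sketch: SAGA-style variance reduction + Fletcher-Reeves direction.

    grad_i(x, i) must return the gradient of the i-th component function at x.
    This is NOT the paper's SCGA; it only illustrates how a SAGA gradient
    estimate can be plugged into a conjugate gradient style update.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()

    # SAGA memory: one stored gradient per sample, plus their running average.
    table = np.stack([grad_i(x, i) for i in range(n)])
    avg = table.mean(axis=0)

    g_prev = avg.copy()
    d = -g_prev  # first search direction: steepest descent

    for _ in range(epochs * n):
        i = rng.integers(n)
        g_i = grad_i(x, i)

        # SAGA variance-reduced gradient estimate.
        g = g_i - table[i] + avg

        # Update the stored gradient for sample i and the running average.
        avg += (g_i - table[i]) / n
        table[i] = g_i

        # Fletcher-Reeves type coefficient (assumed form for illustration).
        beta = (g @ g) / (g_prev @ g_prev + 1e-12)
        d = -g + beta * d

        x += lr * d  # fixed step size, a simplifying assumption
        g_prev = g

    return x
```

As a usage example under the same assumptions, for a least-squares problem with data matrix A and labels b one could pass grad_i = lambda x, i: (A[i] @ x - b[i]) * A[i] and x0 = np.zeros(A.shape[1]).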
Keywords: Deep learning; Empirical risk minimization; Stochastic conjugate gradient; Linear convergence
Date: 2023
Downloads: http://link.springer.com/10.1007/s10898-022-01205-4 (text/html)
Access to the full text of the articles in this series is restricted.
Persistent link: https://EconPapers.repec.org/RePEc:spr:jglopt:v:87:y:2023:i:2:d:10.1007_s10898-022-01205-4
Ordering information: This journal article can be ordered from
http://www.springer. ... search/journal/10898
DOI: 10.1007/s10898-022-01205-4
Journal of Global Optimization is currently edited by Sergiy Butenko