Stochastic Gradient Hamiltonian Monte Carlo for non-convex learning
Huy N. Chau and Miklós Rásonyi
Stochastic Processes and their Applications, 2022, vol. 149, issue C, 341-368
Abstract:
Stochastic Gradient Hamiltonian Monte Carlo (SGHMC) is a momentum version of stochastic gradient descent with properly injected Gaussian noise to find a global minimum. In this paper, a non-asymptotic convergence analysis of SGHMC is given in the context of non-convex optimization, where subsampling techniques are used over an i.i.d. dataset for gradient updates. In contrast to Raginsky et al. (2017) and Gao et al. (2021), our bounds are sharper in terms of step size and variance, and are independent of the number of iterations.
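For readers unfamiliar with the algorithm, below is a minimal sketch of the generic SGHMC recursion the abstract describes: momentum SGD with injected Gaussian noise, using subsampled gradients over an i.i.d. dataset. The function and parameter names (eta for step size, gamma for friction, beta for inverse temperature) and the default values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sghmc(grad_U, data, theta0, n_iter=10_000, eta=1e-3, gamma=1.0,
          beta=1e3, batch_size=32, seed=0):
    """Sketch of SGHMC: momentum SGD plus injected Gaussian noise.

    grad_U(theta, batch) should return an unbiased estimate of the
    gradient of the potential U (e.g. the empirical risk), computed
    on a subsampled minibatch of the i.i.d. dataset `data`.
    """
    rng = np.random.default_rng(seed)
    theta = np.array(theta0, dtype=float)
    v = np.zeros_like(theta)  # momentum variable
    for _ in range(n_iter):
        # subsampling: draw a minibatch without replacement
        idx = rng.choice(len(data), size=batch_size, replace=False)
        g = grad_U(theta, data[idx])           # stochastic gradient
        xi = rng.standard_normal(theta.shape)  # injected Gaussian noise
        # Euler discretization of underdamped (kinetic) Langevin dynamics
        v = v - eta * (gamma * v + g) + np.sqrt(2.0 * gamma * eta / beta) * xi
        theta = theta + eta * v
    return theta
```

With a large inverse temperature beta, the invariant law of the underlying diffusion concentrates near the global minimizers of U, which is why the iterates can be used for non-convex optimization.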
Date: 2022
Downloads: http://www.sciencedirect.com/science/article/pii/S0304414922000825 (full text for ScienceDirect subscribers only)
Persistent link: https://EconPapers.repec.org/RePEc:eee:spapps:v:149:y:2022:i:c:p:341-368
Ordering information: This journal article can be ordered from
http://www.elsevier.com/wps/find/supportfaq.cws_home/regional
https://shop.elsevie ... _01_ooc_1&version=01
DOI: 10.1016/j.spa.2022.04.001
Stochastic Processes and their Applications is currently edited by T. Mikosch