Convergence analysis for a nonlocal gradient descent method via directional Gaussian smoothing

Hoang Tran, Qiang Du and Guannan Zhang
Additional contact information
Hoang Tran: Oak Ridge National Laboratory
Qiang Du: Columbia University
Guannan Zhang: Oak Ridge National Laboratory

Computational Optimization and Applications, 2025, vol. 90, issue 2, No 6, 513 pages

Abstract: We analyze the convergence of a nonlocal gradient descent method for minimizing a class of high-dimensional non-convex functions, where directional Gaussian smoothing (DGS) is used to define the nonlocal gradient (also referred to as the DGS gradient). The method was first proposed in [Zhang et al., Enabling long-range exploration in minimization of multimodal functions, UAI 2021], where multiple numerical experiments showed that replacing the traditional local gradient with the DGS gradient helps optimizers escape local minima more easily and significantly improves their performance. However, a rigorous theory for the efficiency of the method on nonconvex landscapes has been lacking. In this work, we investigate the scenario where the objective function is a convex function perturbed by deterministic oscillating noise. We provide a convergence theory under which the iterates converge exponentially to a tightened neighborhood of the solution, whose size is characterized by the noise wavelength. We also establish a correlation between the optimal values of the Gaussian smoothing radius and the noise wavelength, thus justifying the advantage of using moderate or large smoothing radii with the method. Furthermore, if the noise level decays to zero when approaching the global minimum, we prove that DGS-based optimization converges to the exact global minimum at a linear rate, similar to standard gradient-based methods for convex functions. Several numerical experiments are provided to confirm our theory and illustrate the superiority of the approach over those based on the local gradient.
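For readers who want a concrete picture of the method described in the abstract, the following is a minimal sketch in Python of a DGS-style nonlocal gradient and the resulting descent iteration. It assumes coordinate directions, Gauss-Hermite quadrature for the one-dimensional smoothed derivatives, and a fixed smoothing radius; the function names (dgs_gradient, dgs_descent) and all parameter choices are illustrative and are not taken from the paper or the authors' code.

import numpy as np

def dgs_gradient(f, x, sigma=1.0, n_quad=7):
    # For each coordinate direction e_i, the cross-section t -> f(x + t*e_i)
    # is smoothed with a Gaussian of radius sigma; the derivative of the
    # smoothed cross-section at 0 equals E_{v~N(0,1)}[f(x + sigma*v*e_i)*v]/sigma,
    # approximated here by Gauss-Hermite quadrature (weight exp(-t^2), v = sqrt(2)*t).
    d = x.size
    nodes, weights = np.polynomial.hermite.hermgauss(n_quad)
    grad = np.zeros(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = 1.0
        vals = np.array([f(x + np.sqrt(2.0) * sigma * t * e) for t in nodes])
        grad[i] = np.sum(weights * vals * np.sqrt(2.0) * nodes) / (np.sqrt(np.pi) * sigma)
    return grad

def dgs_descent(f, x0, sigma=1.0, lr=0.1, n_iter=200):
    # Plain gradient descent with the DGS gradient in place of the local gradient.
    x = np.array(x0, dtype=float)
    for _ in range(n_iter):
        x = x - lr * dgs_gradient(f, x, sigma=sigma)
    return x

if __name__ == "__main__":
    # Illustrative test problem in the spirit of the paper's setting: a convex
    # quadratic perturbed by deterministic oscillating noise of small amplitude.
    f = lambda x: np.sum(x**2) + 0.1 * np.sum(np.cos(10.0 * x))
    x_final = dgs_descent(f, x0=np.full(5, 2.0), sigma=0.5, lr=0.2)
    print(x_final)  # expected to land in a small neighborhood of the origin

The moderate smoothing radius (sigma comparable to or larger than the noise wavelength) is what lets the iteration average out the oscillations instead of being trapped by them, which is the regime the paper's convergence theory addresses.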

Date: 2025

Downloads: (external link)
http://link.springer.com/10.1007/s10589-024-00641-0 Abstract (text/html)
Access to the full text of the articles in this series is restricted.


Persistent link: https://EconPapers.repec.org/RePEc:spr:coopap:v:90:y:2025:i:2:d:10.1007_s10589-024-00641-0

Ordering information: This journal article can be ordered from
http://www.springer.com/math/journal/10589

DOI: 10.1007/s10589-024-00641-0

Computational Optimization and Applications is currently edited by William W. Hager


Handle: RePEc:spr:coopap:v:90:y:2025:i:2:d:10.1007_s10589-024-00641-0