EconPapers    

Generalized Nesterov’s accelerated proximal gradient algorithms with convergence rate of order o(1/k2)

Huynh Ngai and Ta Anh Son
Additional contact information
Huynh Ngai: University of Quy Nhon
Ta Anh Son: Hanoi University of Science and Technology

Computational Optimization and Applications, 2022, vol. 83, issue 2, No 8, 615-649

Abstract: The accelerated gradient method initiated by Nesterov is now recognized as one of the most powerful tools for solving smooth convex optimization problems. This method significantly improves the convergence rate of the function values, from $$O(1/k)$$ for the standard gradient method down to $$O(1/k^2)$$. In this paper, we present two generalized variants of Nesterov’s accelerated proximal gradient method for solving composite convex optimization problems in which the objective function is the sum of a smooth convex function and a nonsmooth convex part. We show that, with suitable choices of the parameter sequences, the convergence rate of the function values of the proposed method is in fact of order $$o(1/k^2)$$. In particular, when the objective function is $$p$$-uniformly convex for $$p>2$$, the convergence rate is of order $$O\left( \ln k/k^{2p/(p-2)}\right)$$, and the convergence is linear when the objective function is strongly convex. As a by-product, we derive a forward–backward algorithm generalizing the one by Attouch and Peypouquet (SIAM J Optim 26(3):1824–1834, 2016), which produces a convergent sequence whose function values converge at rate $$o(1/k^2)$$. Initial computational experiments on solving linear inverse problems with $$l_1$$-regularization demonstrate the capabilities of the proposed algorithms.
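
To make the setting concrete, the following is a minimal NumPy sketch of the classical FISTA-type accelerated proximal gradient iteration that the paper generalizes, applied to the $$l_1$$-regularized linear inverse problem used in the experiments, $$\min_x \tfrac{1}{2}\Vert Ax-b\Vert^2 + \lambda \Vert x\Vert_1$$. The function names, step size, and momentum rule below are illustrative assumptions based on the standard method, not the authors' generalized variants.

```python
# Minimal sketch of an accelerated proximal gradient (FISTA-type) iteration for
# min_x 0.5*||Ax - b||^2 + lam*||x||_1.  This is the classical scheme the paper
# generalizes; step size, momentum rule, and names are illustrative only.
import numpy as np

def soft_threshold(v, tau):
    """Proximal mapping of tau*||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def accelerated_proximal_gradient(A, b, lam, num_iters=500):
    """Forward (gradient) step on the smooth part, backward (proximal) step on
    the nonsmooth l1 part, plus Nesterov extrapolation between iterates."""
    n = A.shape[1]
    x_prev = np.zeros(n)            # x_{k-1}
    y = np.zeros(n)                 # extrapolated point y_k
    t_prev = 1.0                    # momentum parameter t_{k-1}
    L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the smooth gradient
    for _ in range(num_iters):
        grad = A.T @ (A @ y - b)                     # forward (gradient) step
        x = soft_threshold(y - grad / L, lam / L)    # backward (proximal) step
        t = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t_prev ** 2))
        y = x + ((t_prev - 1.0) / t) * (x - x_prev)  # Nesterov extrapolation
        x_prev, t_prev = x, t
    return x_prev

# Illustrative usage on a small random instance:
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 100))
    x_true = np.zeros(100); x_true[:5] = 1.0
    b = A @ x_true
    x_hat = accelerated_proximal_gradient(A, b, lam=0.1)
    obj = 0.5 * np.linalg.norm(A @ x_hat - b) ** 2 + 0.1 * np.linalg.norm(x_hat, 1)
    print("objective:", obj)
```

For comparison, the forward–backward scheme of Attouch and Peypouquet replaces the extrapolation coefficient $$(t_{k-1}-1)/t_k$$ above with $$(k-1)/(k+\alpha)$$ for $$\alpha>3$$, which already yields the $$o(1/k^2)$$ rate; the variants studied in this paper allow more general parameter sequences.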

Keywords: Convex optimization; Forward–backward method; Nesterov accelerated gradient method; Proximal mapping; Subdifferential. MSC codes: 49J52; 90C26; 90C30; 49M37; 65K05; 90C25
Date: 2022

Downloads: (external link)
http://link.springer.com/10.1007/s10589-022-00401-y Abstract (text/html)
Access to the full text of the articles in this series is restricted.


Persistent link: https://EconPapers.repec.org/RePEc:spr:coopap:v:83:y:2022:i:2:d:10.1007_s10589-022-00401-y

Ordering information: This journal article can be ordered from
http://www.springer.com/math/journal/10589

DOI: 10.1007/s10589-022-00401-y

Computational Optimization and Applications is currently edited by William W. Hager

More articles in Computational Optimization and Applications from Springer
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Handle: RePEc:spr:coopap:v:83:y:2022:i:2:d:10.1007_s10589-022-00401-y