Restarting Frank–Wolfe: Faster Rates under Hölderian Error Bounds
Thomas Kerdreux, Alexandre d’Aspremont and Sebastian Pokutta
Additional contact information
Thomas Kerdreux: Zuse Institute
Alexandre d’Aspremont: CNRS UMR 8548
Sebastian Pokutta: Zuse Institute
Journal of Optimization Theory and Applications, 2022, vol. 192, issue 3, No 2, 799-829
Abstract:
Conditional gradient algorithms (aka Frank–Wolfe algorithms) form a classical set of methods for constrained smooth convex minimization due to their simplicity, the absence of projection steps, and competitive numerical performance. While the vanilla Frank–Wolfe algorithm only ensures a worst-case rate of $\mathcal{O}(1/\epsilon)$, various recent results have shown that for strongly convex functions on polytopes, the method can be slightly modified to achieve linear convergence. However, this still leaves a huge gap between sublinear $\mathcal{O}(1/\epsilon)$ convergence and linear $\mathcal{O}(\log 1/\epsilon)$ convergence to reach an $\epsilon$-approximate solution. Here, we present a new variant of conditional gradient algorithms that can dynamically adapt to the function’s geometric properties using restarts and smoothly interpolates between the sublinear and linear regimes. These interpolated convergence rates are obtained when the optimization problem satisfies a new type of error bounds, which we call strong Wolfe primal bounds. They combine geometric information on the constraint set with Hölderian error bounds on the objective function.
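For context, a minimal sketch of the vanilla Frank–Wolfe method the abstract refers to, here on the probability simplex with the textbook open-loop step size $\gamma_t = 2/(t+2)$ (this illustrates only the baseline $\mathcal{O}(1/\epsilon)$ method, not the paper's restart scheme; the quadratic objective and simplex constraint are illustrative choices, not from the paper):

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, n_iters=1000):
    """Vanilla Frank-Wolfe over the probability simplex.

    grad : callable returning the gradient of a smooth convex objective.
    The linear minimization oracle (LMO) over the simplex returns the
    vertex (coordinate) with the smallest gradient entry, so the method
    needs no projection step.
    """
    x = x0.copy()
    for t in range(n_iters):
        g = grad(x)
        # LMO: argmin_{s in simplex} <g, s> is the vertex e_i, i = argmin_i g_i
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0
        gamma = 2.0 / (t + 2)  # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * s  # convex combination stays feasible
    return x

# Illustrative use: minimize ||x - b||^2 over the simplex, with b feasible,
# so the minimizer is b itself.
b = np.array([0.2, 0.5, 0.3])
x_star = frank_wolfe_simplex(lambda x: 2.0 * (x - b), np.array([1.0, 0.0, 0.0]))
```

Every iterate is a convex combination of simplex vertices, hence feasible by construction; this projection-free structure is the simplicity the abstract highlights.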
Date: 2022
Downloads: http://link.springer.com/10.1007/s10957-021-01989-7 Abstract (text/html)
Access to the full text of the articles in this series is restricted.
Persistent link: https://EconPapers.repec.org/RePEc:spr:joptap:v:192:y:2022:i:3:d:10.1007_s10957-021-01989-7
Ordering information: This journal article can be ordered from
http://www.springer. ... cs/journal/10957/PS2
DOI: 10.1007/s10957-021-01989-7
Journal of Optimization Theory and Applications is currently edited by Franco Giannessi and David G. Hull
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.