Oracle Complexity Separation in Convex Optimization
Anastasiya Ivanova,
Pavel Dvurechensky,
Evgeniya Vorontsova,
Dmitry Pasechnyuk,
Alexander Gasnikov,
Darina Dvinskikh and
Alexander Tyurin
Additional contact information
Anastasiya Ivanova: National Research University Higher School of Economics
Pavel Dvurechensky: Weierstrass Institute for Applied Analysis and Stochastics
Evgeniya Vorontsova: Catholic University of Louvain
Dmitry Pasechnyuk: Moscow Institute of Physics and Technology
Alexander Gasnikov: National Research University Higher School of Economics
Darina Dvinskikh: National Research University Higher School of Economics
Alexander Tyurin: National Research University Higher School of Economics
Journal of Optimization Theory and Applications, 2022, vol. 193, issue 1, No 21, 462-490
Abstract:
Many convex optimization problems have structured objective functions written as a sum of functions with different oracle types (e.g., full gradient, coordinate derivative, stochastic gradient) and different arithmetic complexities of these oracle calls. In the strongly convex case, these functions also have different condition numbers, which eventually define the iteration complexity of first-order methods and the number of oracle calls required to achieve a given accuracy. Motivated by the desire to call more expensive oracles fewer times, we consider the problem of minimizing the sum of two functions and propose a generic algorithmic framework that separates the oracle complexities for each function. Separation means that the oracle for each function is called a number of times that coincides with the oracle complexity for the case when the other function is absent. Our general accelerated framework covers the setting of (strongly) convex objectives, the setting when both parts are given through a full gradient oracle, as well as the settings when one of them is given by a coordinate derivative oracle or has a finite-sum structure and is available through a stochastic gradient oracle. In the latter two cases, we obtain accelerated random coordinate descent and accelerated variance-reduced methods with oracle complexity separation.
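To illustrate what oracle complexity separation means here, a minimal sketch in assumed notation (the constants $\mu$, $L_f$, $L_g$ and the accuracy $\varepsilon$ are standard strong-convexity/smoothness parameters and are not spelled out in this listing; the bounds below are only an illustration consistent with the abstract, not the article's exact statements):
\[
\min_{x \in \mathbb{R}^n} \; h(x) := f(x) + g(x), \qquad f \text{ is } L_f\text{-smooth}, \quad g \text{ is } L_g\text{-smooth}, \quad h \text{ is } \mu\text{-strongly convex}.
\]
A method with separated oracle complexities calls each gradient oracle roughly as often as if the other summand were absent,
\[
N_f = O\!\left(\sqrt{L_f/\mu}\,\log(1/\varepsilon)\right), \qquad N_g = O\!\left(\sqrt{L_g/\mu}\,\log(1/\varepsilon)\right),
\]
whereas a direct application of an accelerated gradient method to $h$ would call both oracles $O\!\left(\sqrt{(L_f+L_g)/\mu}\,\log(1/\varepsilon)\right)$ times.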
Keywords: First-order methods; Convex optimization; Complexity; Random coordinate descent; Stochastic gradient
MSC codes: 49M37; 90C25; 65K05
Date: 2022
Downloads: http://link.springer.com/10.1007/s10957-022-02038-7 (abstract, text/html)
Access to the full text of the articles in this series is restricted.
Persistent link: https://EconPapers.repec.org/RePEc:spr:joptap:v:193:y:2022:i:1:d:10.1007_s10957-022-02038-7
Ordering information: This journal article can be ordered from
http://www.springer. ... cs/journal/10957/PS2
DOI: 10.1007/s10957-022-02038-7
Journal of Optimization Theory and Applications is currently edited by Franco Giannessi and David G. Hull
More articles in Journal of Optimization Theory and Applications from Springer
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.