Advances in Low-Memory Subgradient Optimization

Pavel E. Dvurechensky, Alexander V. Gasnikov, Evgeni A. Nurminski and Fedor S. Stonyakin
Additional contact information
Pavel E. Dvurechensky: Weierstrass Institute for Applied Analysis and Stochastics
Alexander V. Gasnikov: Institute for Information Transmission Problems RAS
Evgeni A. Nurminski: Far Eastern Federal University
Fedor S. Stonyakin: Moscow Institute of Physics and Technology

Chapter 2 in Numerical Nonsmooth Optimization, 2020, pp. 19-59, from Springer

Abstract: This chapter is devoted to blackbox subgradient algorithms with minimal requirements for the storage of the auxiliary results needed to execute them. To provide historical perspective, the survey starts with the original result of Shor, which opened this field with an application to the classical transportation problem. The theoretical complexity bounds for smooth and nonsmooth convex and quasiconvex optimization problems are then briefly reviewed to introduce the relevant fundamentals of nonsmooth optimization. Special attention is given to the adaptive step-size policy, which aims to attain the lowest complexity bounds. Nondifferentiability of the objective function in convex optimization significantly slows down the rate of convergence of subgradient optimization compared to the smooth case, but there are modern techniques that allow nonsmooth convex optimization problems to be solved faster than the theoretical lower complexity bounds would dictate. Particular attention is given to the Nesterov smoothing technique, the Nesterov universal approach, and the Legendre (saddle-point) representation approach. New results on universal mirror prox algorithms constitute the original part of the survey. To demonstrate the application of nonsmooth convex optimization algorithms to the solution of huge-scale extremal problems, convex optimization problems with nonsmooth functional constraints are considered and two adaptive mirror descent methods are proposed. The first method is of the primal-dual variety and is proved to be optimal in terms of lower oracle bounds for the class of Lipschitz-continuous convex objectives and constraints. The advantages of applying this method to the sparse truss topology design problem are discussed in detail. The second method can be used for the solution of convex and quasiconvex optimization problems and is optimal in terms of complexity bounds.
The concluding part of the survey contains important references characterizing recent developments in nonsmooth convex optimization.
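The blackbox subgradient scheme with the classic O(1/sqrt(k)) step-size policy mentioned in the abstract can be sketched as below. This is a generic textbook illustration, not the chapter's specific adaptive or mirror descent methods; the l1 objective, unit-ball feasible set, and parameter names are our own illustrative assumptions. Note the low-memory character: beyond the current point, only a running average of the iterates is stored.

```python
import numpy as np

# Demo problem (our assumption): minimize the nonsmooth convex function
# f(x) = ||x - a||_1 over the Euclidean unit ball.

def project_ball(x, r=1.0):
    """Euclidean projection onto the ball {x : ||x||_2 <= r}."""
    n = np.linalg.norm(x)
    return x if n <= r else (r / n) * x

def subgradient_method(subgrad, x0, n_iters=2000, R=1.0, M=1.0):
    """Projected subgradient steps x_{k+1} = Proj(x_k - h_k g_k) with
    h_k = R / (M * sqrt(k + 1)), where R bounds the distance to a solution
    and M bounds the subgradient norms. Returns the averaged iterate,
    which enjoys the optimal O(1/sqrt(k)) rate for this problem class;
    memory use stays at O(dim)."""
    x = x0.copy()
    avg = np.zeros_like(x0)
    for k in range(n_iters):
        g = subgrad(x)
        h = R / (M * np.sqrt(k + 1))
        x = project_ball(x - h * g, R)
        avg += x
    return avg / n_iters

a = np.array([0.3, -0.2, 0.5])        # minimizer, lies inside the unit ball
f = lambda x: np.abs(x - a).sum()     # nonsmooth objective
subgrad = lambda x: np.sign(x - a)    # one valid subgradient of f
x_hat = subgradient_method(subgrad, np.zeros(3), M=np.sqrt(3.0))
print(f(x_hat))  # small: the averaged iterate approaches the optimum f* = 0
```

The fixed 1/sqrt(k) schedule requires a priori knowledge of R and M; the adaptive step-size policies surveyed in the chapter aim to remove exactly this requirement.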

Date: 2020




Persistent link: https://EconPapers.repec.org/RePEc:spr:sprchp:978-3-030-34910-3_2

Ordering information: This item can be ordered from
http://www.springer.com/9783030349103

DOI: 10.1007/978-3-030-34910-3_2


More chapters in Springer Books from Springer

Handle: RePEc:spr:sprchp:978-3-030-34910-3_2