Subgradient Methods
Adil Bagirov,
Napsu Karmitsa and
Marko M. Mäkelä
Additional contact information
Adil Bagirov: School of Information Technology and Mathematical Sciences, University of Ballarat
Napsu Karmitsa: University of Turku
Marko M. Mäkelä: University of Turku
Chapter 10 in Introduction to Nonsmooth Optimization, 2014, pp 295-297, from Springer
Abstract:
The history of subgradient methods (Kiev methods) begins in the 1960s; they were developed mainly in the Soviet Union. The basic idea behind subgradient methods is to generalize smooth methods by replacing the gradient with an arbitrary subgradient. Owing to this simple structure, they are widely used NSO methods, although they may suffer from serious drawbacks, especially in their simplest versions. The first method considered in this chapter is the cornerstone of NSO, the standard subgradient method. Then the ideas of a more sophisticated subgradient method, the well-known Shor's r-algorithm, are introduced.
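The replace-the-gradient idea summarized in the abstract can be sketched in a few lines. The following is a minimal illustration with a predetermined diminishing step size, not the chapter's exact algorithm; the test function f(x) = |x1| + 2|x2| and the step rule t_k = 1/(k+1) are assumptions chosen for demonstration.

```python
import numpy as np

def subgradient_method(f, subgrad, x0, n_iter=500):
    """Standard subgradient method with predetermined step sizes
    t_k = 1/(k+1) (a sketch, assuming this particular step rule).

    Because an arbitrary subgradient step need not decrease f,
    the best point found so far is tracked and returned."""
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(n_iter):
        g = subgrad(x)
        # Move along a normalized subgradient with a predetermined,
        # diminishing step; there is no implementable line search.
        x = x - (1.0 / (k + 1)) * g / np.linalg.norm(g)
        if f(x) < best_f:
            best_x, best_f = x.copy(), f(x)
    return best_x, best_f

# Hypothetical example: f(x) = |x1| + 2|x2|, minimized at the origin.
# At a kink (x_i = 0) any value in the subdifferential is admissible;
# here we simply pick the one corresponding to sign +1.
f = lambda x: abs(x[0]) + 2 * abs(x[1])
subgrad = lambda x: np.array([np.sign(x[0]) or 1.0,
                              2 * (np.sign(x[1]) or 1.0)])
```

Note the contrast with smooth gradient descent: since a single subgradient gives no descent guarantee, convergence relies on the predetermined step sizes being square-summable-style diminishing, and there is no natural implementable stopping criterion — two of the drawbacks the abstract alludes to.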
Keywords: Standard Subgradient Method; Arbitrary Subgradient; Smooth Method; Implementable Stopping Criterion; Predetermined Step Size
Date: 2014
There are no downloads for this item, see the EconPapers FAQ for hints about obtaining it.
Persistent link: https://EconPapers.repec.org/RePEc:spr:sprchp:978-3-319-08114-4_10
Ordering information: This item can be ordered from
http://www.springer.com/9783319081144
DOI: 10.1007/978-3-319-08114-4_10
More chapters in Springer Books from Springer
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.