Subgradient Projection Algorithm

Alexander J. Zaslavski (The Technion – Israel Institute of Technology)

Chapter 2 in Numerical Optimization with Computational Errors, 2016, pp. 11–40, from Springer

Abstract: In this chapter we study the subgradient projection algorithm for minimizing convex nonsmooth functions and for computing saddle points of convex–concave functions in the presence of computational errors. We show that the algorithms generate a good approximate solution provided the computational errors are bounded from above by a small positive constant. Moreover, for a known computational error, we determine what approximate solution can be obtained and how many iterations are required to obtain it.
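The minimization setting described in the abstract can be illustrated with a small sketch of the projected subgradient iteration carrying an explicit error term. Everything below is illustrative and not taken from the chapter: the function names, the step-size rule, and the toy problem f(x) = |x − 2| over C = [−1, 1] are assumptions; the sketch only assumes a Euclidean projection onto C and a bounded additive error e_k with ‖e_k‖ ≤ ε.

```python
import numpy as np

def projected_subgradient(subgrad, project, x0, n_iters=500,
                          step=lambda k: 0.1 / np.sqrt(k + 1),
                          err_bound=0.0, rng=None):
    """Projected subgradient method with a bounded computational error.

    Illustrative sketch (not the chapter's exact scheme):
        x_{k+1} = P_C(x_k - alpha_k * g_k + e_k),  with ||e_k|| <= err_bound,
    where g_k is any subgradient of f at x_k and P_C projects onto the
    feasible set C.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for k in range(n_iters):
        g = np.asarray(subgrad(x), dtype=float)
        e = rng.uniform(-1.0, 1.0, size=x.shape)  # simulated computational error
        norm = np.linalg.norm(e)
        if norm > 0:
            e *= err_bound / norm                 # enforce ||e_k|| <= err_bound
        x = project(x - step(k) * g + e)          # subgradient step, then project
    return x

# Toy problem: minimize f(x) = |x - 2| over C = [-1, 1]; the minimizer is x = 1.
subgrad = lambda x: np.sign(x - 2.0)              # a subgradient of |x - 2|
project = lambda x: np.clip(x, -1.0, 1.0)         # Euclidean projection onto [-1, 1]
x_final = projected_subgradient(subgrad, project, np.array([-0.5]), err_bound=1e-3)
```

With a diminishing step size and an error bound ε, the iterate can only be expected to reach an ε-dependent neighborhood of the minimizer, which matches the chapter's theme: the achievable accuracy of the approximate solution is governed by the magnitude of the computational errors.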

Keywords: Subgradient Projection Algorithm; Convex-concave Function; Computational Errors; Small Positive Constant; Good Approximate Solution
Date: 2016




Persistent link: https://EconPapers.repec.org/RePEc:spr:spochp:978-3-319-30921-7_2

Ordering information: This item can be ordered from
http://www.springer.com/9783319309217

DOI: 10.1007/978-3-319-30921-7_2


More chapters in Springer Optimization and Its Applications from Springer

Handle: RePEc:spr:spochp:978-3-319-30921-7_2