EconPapers    
 

A Projected Subgradient Method for Nonsmooth Problems

Alexander Zaslavski
Additional contact information
Alexander Zaslavski: Israel Institute of Technology

Chapter 12 in Convex Optimization with Computational Errors, 2020, pp 321-354, from Springer

Abstract: In this chapter we study the convergence of the projected subgradient method for a class of constrained optimization problems in a Hilbert space. For this class of problems, an objective function is assumed to be convex but a set of admissible points is not necessarily convex. Our goal is to obtain an ε-approximate solution in the presence of computational errors, where ε is a given positive number.
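The abstract describes the projected subgradient iteration x_{k+1} = P(x_k - t_k g_k), where g_k is an (inexactly computed) subgradient and P projects onto the admissible set. The following is a minimal sketch of that scheme, not the chapter's actual algorithm: it uses a toy convex admissible set (the unit Euclidean ball) rather than the nonconvex sets treated in the chapter, and models computational errors as additive noise on the subgradient; all function and parameter names are illustrative.

```python
import numpy as np

def projected_subgradient(f, subgrad, project, x0, step, n_iter, err=0.0, seed=0):
    """Projected subgradient method with (optional) computational errors.

    Iterates x_{k+1} = P(x_k - t_k g_k), where g_k is a subgradient of f
    at x_k perturbed by noise of magnitude `err` (a stand-in for the
    computational errors studied in the chapter), and P projects onto the
    admissible set.  Returns the best iterate seen, since subgradient
    steps need not decrease f monotonically.
    """
    rng = np.random.default_rng(seed)
    x = project(np.asarray(x0, dtype=float))
    x_best, f_best = x.copy(), f(x)
    for k in range(n_iter):
        g = subgrad(x) + err * rng.standard_normal(x.shape)  # inexact subgradient
        x = project(x - step(k) * g)
        if f(x) < f_best:
            x_best, f_best = x.copy(), f(x)
    return x_best, f_best

# Toy problem: minimize the nonsmooth convex f(x) = ||x - c||_1 over the
# unit Euclidean ball (a convex admissible set, chosen for simplicity).
c = np.array([2.0, 0.0])
f = lambda x: np.abs(x - c).sum()
subgrad = lambda x: np.sign(x - c)             # a subgradient of the l1 objective
project = lambda x: x / max(1.0, np.linalg.norm(x))
step = lambda k: 1.0 / np.sqrt(k + 1)          # diminishing step sizes

x_best, f_best = projected_subgradient(f, subgrad, project, np.zeros(2), step, 200)
```

With diminishing steps and no errors, the best iterate approaches the constrained minimizer (here x = (1, 0) with value 1); setting `err > 0` illustrates how the achievable accuracy degrades to an ε-approximate solution in the presence of computational errors.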

Date: 2020

There are no downloads for this item, see the EconPapers FAQ for hints about obtaining it.



Persistent link: https://EconPapers.repec.org/RePEc:spr:spochp:978-3-030-37822-6_12

Ordering information: This item can be ordered from
http://www.springer.com/9783030378226

DOI: 10.1007/978-3-030-37822-6_12


More chapters in Springer Optimization and Its Applications from Springer
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Page updated 2025-04-01
Handle: RePEc:spr:spochp:978-3-030-37822-6_12