An Extension of the Gradient Algorithm
Alexander Zaslavski (Israel Institute of Technology)
Chapter 5 in Convex Optimization with Computational Errors, 2020, pp 151–171 (Springer)
Abstract: In this chapter we analyze the convergence of a gradient-type algorithm in the presence of computational errors. The algorithm was introduced by Beck and Teboulle (SIAM J Imaging Sci 2:183–202, 2009) for solving linear inverse problems arising in signal/image processing.
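To make the setting concrete, the following is a minimal sketch of a proximal-gradient (ISTA-style) iteration for a linear inverse problem, with an additive perturbation of the gradient standing in for a computational error. This is an illustration of the general scheme only; the chapter's exact algorithm, step-size rule, and error model are not reproduced here, and the function names and the `err_level` parameter are assumptions for this sketch.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista_with_errors(A, b, lam, n_iters=200, err_level=0.0, seed=0):
    """Proximal-gradient iteration for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    err_level scales a random perturbation e_k added to each gradient,
    modeling an inexactly computed gradient (a computational error).
    """
    rng = np.random.default_rng(seed)
    L = np.linalg.norm(A.T @ A, 2)   # Lipschitz constant of the smooth gradient
    t = 1.0 / L                      # constant step size
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)                       # exact gradient
        e = err_level * rng.standard_normal(x.shape)   # computational error
        x = soft_threshold(x - t * (grad + e), t * lam)
    return x
```

With `err_level=0.0` this reduces to the standard exact iteration; a positive `err_level` lets one observe how the objective value stalls near (rather than at) the minimum, which is the kind of behavior the chapter's convergence analysis quantifies.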
Date: 2020
Persistent link: https://EconPapers.repec.org/RePEc:spr:spochp:978-3-030-37822-6_5
Ordering information: This item can be ordered from
http://www.springer.com/9783030378226
DOI: 10.1007/978-3-030-37822-6_5
Series: Springer Optimization and Its Applications (Springer)