Gradient Algorithm with a Smooth Objective Function
Alexander Zaslavski (Israel Institute of Technology)
Chapter 4 in Convex Optimization with Computational Errors, Springer, 2020, pp. 127–150
Abstract: In this chapter we analyze the convergence of a projected gradient algorithm with a smooth objective function in the presence of computational errors. The problem is defined by an objective function and a set of feasible points. Each iteration of the algorithm consists of two steps: the first computes a gradient of the objective function, and the second computes a projection onto the feasible set. Each of these two steps carries its own computational error, and in general the two errors differ.
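The iteration described in the abstract can be illustrated with a minimal sketch. The function and parameter names below (projected_gradient, grad_err, proj_err, the noise model, and the unit-ball example) are illustrative assumptions, not the chapter's notation; the chapter works with abstract error bounds, while here the two errors are simulated as bounded random perturbations.

```python
import numpy as np

def _unit_noise(rng, shape):
    """Random direction of unit norm, used to simulate a bounded error."""
    v = rng.standard_normal(shape)
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

def projected_gradient(grad, project, x0, step=0.1, iters=100,
                       grad_err=0.0, proj_err=0.0, rng=None):
    """Projected gradient method with two simulated computational errors.

    grad:     exact gradient of the smooth objective (assumed available).
    project:  exact projection onto the feasible set (assumed available).
    grad_err: magnitude of the error in the gradient step.
    proj_err: magnitude of the error in the projection step.
    The two error magnitudes are independent, mirroring the chapter's
    setting in which the two computational errors are in general different.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        # Step 1: gradient evaluation, perturbed by an error of size grad_err.
        g = grad(x) + grad_err * _unit_noise(rng, x.shape)
        # Step 2: projection onto the feasible set, perturbed by proj_err.
        x = project(x - step * g) + proj_err * _unit_noise(rng, x.shape)
    return x

# Example: minimize ||x - c||^2 over the unit ball, with c outside the ball.
c = np.array([2.0, 1.0])
grad = lambda x: 2.0 * (x - c)
project = lambda y: y / max(1.0, np.linalg.norm(y))  # projection onto unit ball
x_approx = projected_gradient(grad, project, x0=np.zeros(2),
                              grad_err=1e-3, proj_err=1e-3)
print(x_approx)  # close to c / ||c||, up to an error-dependent tolerance
```

Consistent with the chapter's theme, the iterate in this toy run approaches the true minimizer only up to a tolerance determined by the two error magnitudes rather than converging exactly.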
Date: 2020
Persistent link: https://EconPapers.repec.org/RePEc:spr:spochp:978-3-030-37822-6_4
Ordering information: this item can be ordered from http://www.springer.com/9783030378226
DOI: 10.1007/978-3-030-37822-6_4
Series: Springer Optimization and Its Applications, Springer