Nonlinear Optimization
Yurii Nesterov (Catholic University of Louvain)
Chapter 1 in Lectures on Convex Optimization, 2018, pp 3-58, Springer
Abstract:
In this chapter, we introduce the main notation and concepts used in Continuous Optimization. The first theoretical results are related to the Complexity Analysis of Global Optimization problems. For these problems, we start with a very pessimistic lower performance guarantee: for any method there exists an optimization problem in $\mathbb{R}^n$ which needs at least $O\left(\frac{1}{\epsilon^n}\right)$ computations of function values in order to approximate its global solution up to accuracy $\epsilon$. Therefore, in the next section we pass to local optimization and consider two main methods, the Gradient Method and the Newton Method. For both of them, we establish local rates of convergence. In the last section, we present some standard methods of General Nonlinear Optimization: conjugate gradient methods, quasi-Newton methods, the theory of Lagrangian relaxation, barrier methods, and penalty function methods. For some of them, we prove global convergence results.
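As a rough illustration of the Gradient Method mentioned in the abstract, here is a minimal sketch of fixed-step gradient descent on a strongly convex quadratic. The test problem, the step size $1/L$, and the iteration count are illustrative choices of ours, not taken from the chapter:

```python
# Minimal sketch of the gradient method on f(x) = (1/2) x^T A x - b^T x.
# The test matrix, the step size 1/L, and the iteration budget are
# illustrative assumptions, not details from the chapter itself.
import numpy as np

def gradient_method(grad, x0, step, iters=500):
    """Fixed-step gradient descent: x_{k+1} = x_k - step * grad(x_k)."""
    x = x0
    for _ in range(iters):
        x = x - step * grad(x)
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, 0.0])
grad = lambda x: A @ x - b               # gradient of the quadratic
L = np.linalg.eigvalsh(A).max()          # Lipschitz constant of the gradient

x_star = np.linalg.solve(A, b)           # exact minimizer, for comparison
x = gradient_method(grad, np.zeros(2), step=1.0 / L)
print(np.allclose(x, x_star, atol=1e-8))  # converges to the minimizer
```

With step size $1/L$ on this problem the iterates contract linearly toward the minimizer, which matches the kind of local rate-of-convergence guarantee the chapter establishes for the Gradient Method.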
Date: 2018
Persistent link: https://EconPapers.repec.org/RePEc:spr:spochp:978-3-319-91578-4_1
Ordering information: http://www.springer.com/9783319915784
DOI: 10.1007/978-3-319-91578-4_1
Series: Springer Optimization and Its Applications (Springer)