Second-Order Methods
Yurii Nesterov (Catholic University of Louvain)
Chapter 4 in Lectures on Convex Optimization, 2018, pp. 241–322, Springer
Abstract: In this chapter, we study black-box second-order methods. In the first two sections, these methods are based on cubic regularization of the second-order model of the objective function. With an appropriate proximal coefficient, this model becomes a global upper approximation of the objective function, and its global minimum is computable in polynomial time even when the Hessian of the objective is not positive semidefinite. We study the global and local convergence of the Cubic Newton Method in both the convex and the non-convex case. In the next section, we derive lower complexity bounds and show that this method can be accelerated using the estimating sequences technique. In the last section, we consider a modification of the standard Gauss–Newton method for solving systems of nonlinear equations. This modification is likewise based on an overestimating principle, applied to the norm of the residual of the system. Both global and local convergence results are established.
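The cubic subproblem mentioned in the abstract — minimizing a second-order model plus a cubic penalty — can be solved via a one-dimensional search, even for an indefinite Hessian. The sketch below (an illustration, not the book's algorithm; function name and the bisection scheme are my own choices, and the degenerate "hard case" with a zero gradient component along the smallest eigenvector is ignored) uses the standard characterization that the minimizer h of g·h + ½ hᵀHh + (M/6)‖h‖³ satisfies (H + (M r/2) I) h = −g with r = ‖h‖:

```python
import numpy as np

def cubic_newton_step(g, H, M, tol=1e-10):
    """Minimize the cubic model  g.h + 0.5 h'Hh + (M/6)||h||^3  over h.

    Sketch only: finds r = ||h|| by bisection on the secular equation
    ||(H + 0.5*M*r*I)^{-1} g|| = r, which has a unique root once the
    shifted matrix is positive definite (hard case not handled).
    """
    n = len(g)
    lam_min = np.linalg.eigvalsh(H)[0]           # smallest eigenvalue of H
    # smallest r keeping H + 0.5*M*r*I positive definite (plus a tiny margin)
    r_lo = max(0.0, -2.0 * lam_min / M) + 1e-12
    r_hi = max(r_lo, 1.0)

    def h_of(r):
        return np.linalg.solve(H + 0.5 * M * r * np.eye(n), -g)

    # grow the upper bracket until ||h(r_hi)|| <= r_hi
    while np.linalg.norm(h_of(r_hi)) > r_hi:
        r_hi *= 2.0
    for _ in range(200):                         # bisection on the norm equation
        r = 0.5 * (r_lo + r_hi)
        if np.linalg.norm(h_of(r)) > r:
            r_lo = r
        else:
            r_hi = r
        if r_hi - r_lo < tol:
            break
    return h_of(0.5 * (r_lo + r_hi))
```

When M bounds the Lipschitz constant of the Hessian, the model is a global upper bound on the objective, so the full method simply iterates x ← x + cubic_newton_step(∇f(x), ∇²f(x), M).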
Date: 2018
Persistent link: https://EconPapers.repec.org/RePEc:spr:spochp:978-3-319-91578-4_4
Ordering information: This item can be ordered from http://www.springer.com/9783319915784
DOI: 10.1007/978-3-319-91578-4_4
Series: Springer Optimization and Its Applications, Springer