
Newton-Type Methods: A Broader View

A. F. Izmailov and M. V. Solodov
Additional contact information
A. F. Izmailov: Moscow State University, MSU, OR Department, VMK Faculty
M. V. Solodov: IMPA – Instituto de Matemática Pura e Aplicada

Journal of Optimization Theory and Applications, 2015, vol. 164, issue 2, No 11, 577-620

Abstract: We discuss the question of which features and/or properties make a method for solving a given problem belong to the “Newtonian class.” Is it the strategy of linearization (or perhaps, second-order approximation) of the problem data (maybe only part of the problem data)? Or is it fast local convergence of the method under natural assumptions and at a reasonable computational cost of its iteration? We consider both points of view, and also how they relate to each other. In particular, we discuss abstract Newtonian frameworks for generalized equations, and how a number of important algorithms for constrained optimization can be related to them by introducing structured perturbations to the basic Newton iteration. This gives useful tools for local convergence and rate-of-convergence analysis of various algorithms from unified perspectives, often yielding sharper results than provided by other approaches. Specific constrained optimization algorithms, which can be conveniently analyzed within perturbed Newtonian frameworks, include the sequential quadratic programming method and its various modifications (truncated, augmented Lagrangian, composite step, stabilized, and equipped with second-order corrections), the linearly constrained Lagrangian methods, inexact restoration, sequential quadratically constrained quadratic programming, and certain interior feasible directions methods. We recall most of those algorithms as examples to illustrate the underlying viewpoint. We also discuss how the main ideas of this approach go beyond clearly Newton-related methods and are applicable, for example, to the augmented Lagrangian algorithm (also known as the method of multipliers), which is in principle not of Newton type since its iterations do not approximate any part of the problem data.
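
For readers unfamiliar with the framework named in the abstract, the following is a minimal sketch of the basic Josephy–Newton iteration for a generalized equation, written in standard textbook notation (the symbols Φ, N, and u^k are illustrative and not quoted from the article itself):

% Generalized equation: Phi is a smooth single-valued map, N is a set-valued
% map (for example, the normal cone to a closed convex set).
\[
  0 \in \Phi(u) + N(u).
\]
% Josephy--Newton step: linearize only the smooth part Phi at the current
% iterate u^k, keep N as is, and take u^{k+1} as a solution of the
% partially linearized inclusion:
\[
  0 \in \Phi(u^k) + \Phi'(u^k)\,(u^{k+1} - u^k) + N(u^{k+1}).
\]

When the generalized equation encodes the Karush–Kuhn–Tucker system of a constrained optimization problem, this subproblem corresponds to the optimality system of the sequential quadratic programming subproblem; allowing structured perturbations of this iteration is what lets the frameworks described in the abstract cover the listed SQP variants and related methods.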

Keywords: Newton method; Superlinear convergence; Generalized equation; (Perturbed) Josephy–Newton framework; Constrained optimization; (Perturbed) sequential quadratic programming framework; Augmented Lagrangian
MSC codes: 65K05; 65K10; 65K15; 90C30; 90C33; 90C53
Date: 2015
References: listed in EconPapers; complete reference list available from CitEc
Citations: 7 (as tracked in EconPapers)

Downloads (external link): http://link.springer.com/10.1007/s10957-014-0580-0 (abstract, text/html)
Access to the full text of the articles in this series is restricted.


Persistent link: https://EconPapers.repec.org/RePEc:spr:joptap:v:164:y:2015:i:2:d:10.1007_s10957-014-0580-0

Ordering information: This journal article can be ordered from
http://www.springer. ... cs/journal/10957/PS2

DOI: 10.1007/s10957-014-0580-0


Journal of Optimization Theory and Applications is currently edited by Franco Giannessi and David G. Hull

More articles in Journal of Optimization Theory and Applications from Springer
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Page updated 2025-03-20
Handle: RePEc:spr:joptap:v:164:y:2015:i:2:d:10.1007_s10957-014-0580-0