EconPapers    
Economics at your fingertips  
 

Steepest Descent Methods

Neculai Andrei
Additional contact information
Neculai Andrei: Center for Advanced Modeling and Optimization

Chapter 3 in Modern Numerical Nonlinear Optimization, 2022, pp 81-107 from Springer

Abstract: The steepest descent method was designed by Cauchy (1847) and is the simplest of the gradient methods for the optimization of general continuously differentiable functions in n variables. Its importance is due to the fact that it gives the fundamental ideas and concepts of all unconstrained optimization methods. It introduces a pattern common to many optimization methods. In this pattern, an iteration consists of two parts: the choice of a descent search direction d_k followed at once by a line search to find a suitable stepsize α_k. The search direction in the steepest descent method is exactly the negative gradient.
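The two-part iteration described in the abstract (negative-gradient direction d_k, then a line search for a stepsize α_k) can be sketched as follows. This is a minimal illustration, not the chapter's own implementation; the Armijo backtracking rule and its constants (c = 1e-4, ρ = 0.5) are one common choice of line search, assumed here for concreteness.

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-8, max_iter=1000):
    """Minimize f by steepest descent with an Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:      # stop when the gradient is (nearly) zero
            break
        d = -g                           # search direction: the negative gradient
        alpha, c, rho = 1.0, 1e-4, 0.5
        # shrink alpha until the Armijo sufficient-decrease condition holds
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x = x + alpha * d                # take the step x_{k+1} = x_k + alpha_k * d_k
    return x

# Example: minimize f(x, y) = (x - 1)^2 + 10*(y + 2)^2, minimizer at (1, -2)
f = lambda x: (x[0] - 1) ** 2 + 10 * (x[1] + 2) ** 2
grad = lambda x: np.array([2 * (x[0] - 1), 20 * (x[1] + 2)])
x_star = steepest_descent(f, grad, [0.0, 0.0])
```

On this mildly ill-conditioned quadratic the iterates converge linearly to (1, -2), with the characteristic zig-zag behavior of steepest descent.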

Date: 2022

There are no downloads for this item, see the EconPapers FAQ for hints about obtaining it.

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:spr:spochp:978-3-031-08720-2_3

Ordering information: This item can be ordered from
http://www.springer.com/9783031087202

DOI: 10.1007/978-3-031-08720-2_3


More chapters in Springer Optimization and Its Applications from Springer
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Page updated 2025-04-01
Handle: RePEc:spr:spochp:978-3-031-08720-2_3