Introduction: Overview of Unconstrained Optimization
Neculai Andrei (Academy of Romanian Scientists)
Chapter 1 in Nonlinear Conjugate Gradient Methods for Unconstrained Optimization, Springer, 2020, pp. 1-66
Abstract:
Unconstrained optimization consists of minimizing a function of a number of real variables without any restrictions on the values of those variables. When the number of variables is large, the problem becomes quite challenging. This chapter describes the most important gradient methods for solving unconstrained optimization problems; all of these methods are iterative.
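As a concrete illustration of the iterative gradient methods the chapter surveys, here is a minimal steepest-descent sketch with a fixed step size, applied to a simple quadratic. The test function, step size, and stopping rule are illustrative assumptions, not taken from the chapter; practical methods (including the conjugate gradient methods this book treats) replace the fixed step with a line search and modify the search direction.

```python
import numpy as np

def steepest_descent(grad, x0, alpha=0.1, tol=1e-8, max_iter=1000):
    """Minimize a smooth function by repeatedly stepping against its gradient.

    grad     : callable returning the gradient at a point
    x0       : starting point
    alpha    : fixed step size (illustrative; real methods use a line search)
    tol      : stop when the gradient norm falls below this threshold
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # (near-)stationary point reached
            break
        x = x - alpha * g             # move in the steepest-descent direction
    return x

# Example (assumed for illustration): minimize
# f(x, y) = (x - 1)^2 + 2*(y + 2)^2, whose unique minimizer is (1, -2).
grad_f = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] + 2)])
x_star = steepest_descent(grad_f, [0.0, 0.0])
```

For this quadratic each coordinate contracts linearly toward the minimizer, so the iteration converges to (1, -2); on ill-conditioned problems the fixed-step variant slows down dramatically, which is the motivation for the more sophisticated gradient methods the chapter develops.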
Date: 2020
Persistent link: https://EconPapers.repec.org/RePEc:spr:spochp:978-3-030-42950-8_1
Ordering information: This item can be ordered from
http://www.springer.com/9783030429508
DOI: 10.1007/978-3-030-42950-8_1
More chapters in Springer Optimization and Its Applications from Springer