A Gradient Method For Approximating Saddle Points and Constrained Maxima
Kenneth J. Arrow and Leonid Hurwicz
A chapter in Traces and Emergence of Nonlinear Programming, Springer, 2014, pp. 45-60
Abstract:
In the following, X and Y will be vectors with components X_i, Y_j. By X ≥ 0 will be meant X_i ≥ 0 for all i. Let g(X), f_j(X) (j = 1, ..., m) be functions with suitable differentiability properties, where f_j(X) ≥ 0 for all X, and define $$ F(X, Y) = g(X) + \sum_{j=1}^{m} Y_j \Big\{ 1 - [f_j(X)]^{1+\eta} \Big\}. $$
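The chapter's text is not reproduced in this record, but the saddle-point form of F above admits a standard gradient ascent-descent sketch: rise along the gradient in X (the maximizing variables) and fall along the gradient in Y (the multipliers), keeping both nonnegative. The following display is an illustrative sketch derived from the definition of F, not necessarily the exact iteration or projection rule used in the chapter:
$$ \frac{dX_i}{dt} = \frac{\partial F}{\partial X_i} = \frac{\partial g}{\partial X_i} - \sum_{j=1}^{m} Y_j (1+\eta)\,[f_j(X)]^{\eta}\,\frac{\partial f_j}{\partial X_i}, \qquad \frac{dY_j}{dt} = -\frac{\partial F}{\partial Y_j} = [f_j(X)]^{1+\eta} - 1, $$
with the trajectories kept in X ≥ 0, Y ≥ 0 (for instance by projection). Under these dynamics Y_j increases only while [f_j(X)]^{1+\eta} exceeds 1, so the pair (X, Y) is driven toward a saddle point of F, which under suitable concavity and convexity conditions corresponds to a constrained maximum of g.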
Date: 2014
Persistent link: https://EconPapers.repec.org/RePEc:spr:sprchp:978-3-0348-0439-4_2
Ordering information: This item can be ordered from
http://www.springer.com/9783034804394
DOI: 10.1007/978-3-0348-0439-4_2