Alternance Form of Optimality Conditions in the Finite-Dimensional Space
V. F. Demyanov and
V. N. Malozemov
Additional contact information
V. F. Demyanov: Saint Petersburg State University
V. N. Malozemov: Saint Petersburg State University
A chapter in Constructive Nonsmooth Analysis and Related Topics, 2014, pp. 185–203, from Springer
Abstract:
In solving optimization problems, necessary and sufficient optimality conditions play an outstanding role. They make it possible, first, to check whether a point under study satisfies the conditions and, second, if it does not, to find a “better” point. This is why such conditions should be “constructive,” i.e., should allow one to solve the above-mentioned problems. For the class of directionally differentiable functions on $\mathbb{R}^{n}$, a necessary condition for an unconstrained minimum requires the directional derivative to be non-negative in all directions. This condition becomes efficient for special classes of directionally differentiable functions. For example, in the case of convex and max-type functions, the necessary condition for a minimum takes the form $0_{n} \in C$, where $C \subset \mathbb{R}^{n}$ is a convex compact set. The problem of verifying this condition reduces to finding the point of $C$ nearest to the origin. If the origin does not belong to $C$, one easily finds the steepest descent direction and can construct a numerical method. For the classical Chebyshev approximation problem (approximating a function $f(t)\colon G \to \mathbb{R}$ by a polynomial $P(t)$), the condition for a minimum takes the so-called alternance form: for a polynomial $P^{*}(t)$ to be a solution to the Chebyshev approximation problem, there should exist a collection of points $\{t_{i}\ \vert\ t_{i} \in G\}$ at which the difference $P^{*}(t) - f(t)$ attains its maximal absolute value with alternating signs. This condition is easily verified, and if it does not hold, one can find a “better” polynomial. In the present paper, it is demonstrated that the alternance form of the necessary conditions is valid not only for Chebyshev approximation problems but also in the general case of directionally differentiable functions. Only unconstrained optimization problems are discussed here.
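As an illustration of the verification step described above (not part of the chapter itself), the sketch below computes the point of a convex compact set $C$ nearest to the origin when $C$ is the convex hull of finitely many points, as in the max-function case. It uses the Frank–Wolfe method with exact line search; the chapter does not prescribe this particular algorithm, and the vertex sets and iteration count are illustrative assumptions.

```python
import numpy as np

def min_norm_point(vertices, iters=500):
    """Frank-Wolfe sketch: nearest point of conv(vertices) to the origin.

    Minimizes ||x||^2 over the convex hull by repeatedly moving toward
    the vertex that minimizes the linearized objective <x, v>.
    """
    V = np.asarray(vertices, dtype=float)
    x = V[0].copy()
    for _ in range(iters):
        v = V[np.argmin(V @ x)]          # linear minimization over the vertices
        d = v - x
        denom = d @ d
        if denom < 1e-18:                # already at the chosen vertex
            break
        # exact line search for min_gamma ||x + gamma * d||^2 on [0, 1]
        gamma = np.clip(-(x @ d) / denom, 0.0, 1.0)
        x = x + gamma * d
    return x

# Hull not containing the origin: a vertical segment at x1 = 1.
# The nearest point is (1, 0); -x / ||x|| is then the steepest descent direction.
nearest = min_norm_point([[1.0, 1.0], [1.0, -1.0]])

# Hull containing the origin: the min-norm point is (approximately) 0_n,
# so the necessary condition 0_n in C holds.
interior = min_norm_point([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])
```

When the returned point is (numerically) nonzero, its negative, normalized, serves as a steepest descent direction, which is exactly how the abstract's numerical method would proceed.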
In many cases a constrained optimization problem can be reduced (via exact penalization techniques) to an unconstrained one. In the paper, optimality conditions are first formulated in terms of directional derivatives. Next, the notions of upper and lower exhausters are introduced, and optimality conditions are stated by means of upper and lower exhausters. In all these cases the optimality conditions are presented in the form $0_{n} \in C$, where $C \subset \mathbb{R}^{n}$ is a convex closed bounded set (or a family of such sets). It is proved that the condition $0_{n} \in C$ can be formulated in the alternance form. The result obtained is applied to deduce the well-known Chebyshev alternation rule in the problem of Chebyshev approximation of a function by a polynomial. The problem of Chebyshev approximation of several functions by a polynomial is also discussed, and optimality conditions are stated in the alternance form.
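The alternance condition can be checked numerically on a grid, as a small illustration of why it is "constructive" (this sketch is not from the chapter; the grid, tolerance, and function names are assumptions). For a best degree-$n$ polynomial approximation, the error must attain its maximal absolute value with alternating signs at no fewer than $n+2$ points; the example below uses the known best degree-1 approximation $t - 1/8$ of $t^{2}$ on $[0,1]$.

```python
import numpy as np

def alternance_count(f, p, grid, tol=1e-4):
    """Count sign alternations among grid points where |p - f| nearly
    attains its maximum; alternance for degree n requires count >= n + 2."""
    e = p(grid) - f(grid)
    m = np.max(np.abs(e))
    s = np.sign(e[np.abs(e) >= m - tol])   # signs at near-extremal points, in order
    return int(1 + np.sum(s[1:] != s[:-1]))  # 1 + number of sign changes

grid = np.linspace(0.0, 1.0, 1001)
f = lambda t: t ** 2
p_best = lambda t: t - 0.125   # best degree-1 approximation of t^2 on [0, 1]
p_bad = lambda t: t            # a suboptimal degree-1 polynomial

count_best = alternance_count(f, p_best, grid)  # error is -1/8, +1/8, -1/8 at 0, 1/2, 1
count_bad = alternance_count(f, p_bad, grid)    # error peaks only at t = 1/2
```

Here `count_best` reaches the required $n + 2 = 3$ alternating extremal points, while `count_bad` does not, so the suboptimal polynomial fails the condition and could be improved.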
Keywords: Necessary optimality conditions; Alternance form; Directionally differentiable functions
Date: 2014
Citations: 1 (in EconPapers)
Persistent link: https://EconPapers.repec.org/RePEc:spr:spochp:978-1-4614-8615-2_12
Ordering information: This item can be ordered from
http://www.springer.com/9781461486152
DOI: 10.1007/978-1-4614-8615-2_12
More chapters in Springer Optimization and Its Applications from Springer