Superlinear Convergence of a Modified Newton's Method for Convex Optimization Problems With Constraints

Bouchta Rhanizar

Journal of Mathematics Research, 2021, vol. 13, issue 2, 90

Abstract: We consider the constrained optimization problem $$f(x^*) = \min_{x \in X} f(x) \eqno(1)$$ where the function $f\colon \mathbb{R}^{n} \to \mathbb{R}$ is convex on a closed, bounded convex set $X$. To solve problem (1), most methods transform it into an unconstrained problem, either by introducing Lagrange multipliers or by applying a projection method. The purpose of this paper is to give a new method for solving some constrained optimization problems, based on the definition of a descent direction and a step size that keep the iterates in the convex domain $X$. A convergence theorem is proven. The paper ends with some numerical examples.
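The abstract does not spell out the paper's algorithm, but the general idea it describes (take a Newton-type descent step, then enforce feasibility so the iterate stays in $X$) can be illustrated with a generic projected-Newton sketch on a convex quadratic over a box. Everything below (the function `projected_newton`, the quadratic objective, the box constraints) is a hypothetical illustration of that family of methods, not the author's method.

```python
import numpy as np

def projected_newton(A, b, lo, hi, tol=1e-10, max_iter=100):
    """Minimize the convex quadratic f(x) = 0.5 x'Ax - b'x over the
    box X = [lo, hi]^n.  Each iteration computes a Newton descent
    direction and then clips the trial point back onto X, so every
    iterate remains feasible.  (Illustrative sketch only.)"""
    x = np.clip(np.zeros_like(b), lo, hi)      # feasible starting point
    for _ in range(max_iter):
        grad = A @ x - b                       # gradient of f at x
        d = np.linalg.solve(A, -grad)          # Newton direction A^{-1}(-grad)
        x_new = np.clip(x + d, lo, hi)         # project the step onto X
        if np.linalg.norm(x_new - x) < tol:    # stop when iterates stall
            return x_new
        x = x_new
    return x

# Unconstrained minimizer of f is [2, 2]; the box [0, 1]^2 forces
# the constrained solution to the corner [1, 1].
A = np.array([[2.0, 0.0], [0.0, 2.0]])
b = np.array([4.0, 4.0])
x_star = projected_newton(A, b, 0.0, 1.0)
```

For this strongly convex quadratic the projection onto a box is a cheap componentwise clip; for a general closed bounded convex $X$ the projection is itself an optimization subproblem, which is one reason feasible-direction methods like the one the abstract describes are attractive.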

JEL-codes: R00 Z0
Date: 2021





More articles in Journal of Mathematics Research from the Canadian Center of Science and Education.

Handle: RePEc:ibn:jmrjnl:v:13:y:2021:i:2:p:90