# Superlinear Convergence of a Modified Newton's Method for Convex Optimization Problems With Constraints

*Bouchta Rhanizar*

*Journal of Mathematics Research*, 2021, vol. 13, issue 2, 90

**Abstract:**
We consider the constrained optimization problem defined by $$f(x^*) = \min_{x \in X} f(x) \qquad (1)$$ where the function $f: \mathbb{R}^{n} \to \mathbb{R}$ is convex on a closed, bounded, convex set $X$. To solve problem (1), most methods transform it into an unconstrained problem, either by introducing Lagrange multipliers or by using a projection method. The purpose of this paper is to give a new method for solving some constrained optimization problems, based on the definition of a descent direction and a step size while remaining in the convex domain $X$. A convergence theorem is proven. The paper ends with some numerical examples.
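The paper's specific descent direction and step rule are not given in the abstract, so the following is only a minimal sketch of the general idea it alludes to: take a Newton-type step and keep the iterate feasible, here via Euclidean projection onto a box-shaped $X$. The function names and the box example are illustrative assumptions, not the author's method.

```python
import numpy as np

def project_box(x, lo, hi):
    # Euclidean projection onto the box X = [lo, hi]^n
    # (an illustrative choice of X; the paper allows a general
    # closed bounded convex set).
    return np.clip(x, lo, hi)

def projected_newton(grad, hess, x0, lo, hi, tol=1e-10, max_iter=50):
    """Sketch of a feasible Newton-type iteration: compute the Newton
    direction, step, then project back into X."""
    x = x0.astype(float)
    for _ in range(max_iter):
        d = np.linalg.solve(hess(x), grad(x))   # Newton direction
        x_new = project_box(x - d, lo, hi)      # step while staying in X
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example: f(x) = ||x - (2, 2)||^2 over X = [0, 1]^2.
# The unconstrained minimizer (2, 2) lies outside X; the constrained
# minimizer x* is (1, 1).
grad = lambda x: 2.0 * (x - np.array([2.0, 2.0]))
hess = lambda x: 2.0 * np.eye(2)
x_star = projected_newton(grad, hess, np.zeros(2), 0.0, 1.0)
print(x_star)  # close to [1., 1.]
```

For this quadratic objective a single Newton step reaches the unconstrained minimizer exactly, so the projection alone determines the constrained solution; on a general convex $f$ the iteration would need the kind of step-size safeguard the paper's convergence theorem addresses.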

**JEL-codes:** R00 Z0

**Date:** 2021


**Downloads:** (external link)

http://www.ccsenet.org/journal/index.php/jmr/article/download/0/0/44973/47687 (application/pdf)

http://www.ccsenet.org/journal/index.php/jmr/article/view/0/44973 (text/html)


**Persistent link:** https://EconPapers.repec.org/RePEc:ibn:jmrjnl:v:13:y:2021:i:2:p:90


More articles in *Journal of Mathematics Research* from the Canadian Center of Science and Education.