Superlinear Convergence of a Modified Newton's Method for Convex Optimization Problems With Constraints


  •  Bouchta RHANIZAR    

Abstract

We consider the constrained optimization problem defined by:
$$f(x^*) = \min_{x \in X} f(x) \eqno(1)$$

where the function $f : \pmb{\mathbb{R}}^{n} \to \pmb{\mathbb{R}}$ is convex on a closed, bounded convex set $X$.
To solve problem (1), most methods transform it into an unconstrained problem, either by introducing Lagrange multipliers or by using a projection method.
The purpose of this paper is to present a new method for solving certain constrained optimization problems, based on the construction of a descent direction and a step size that keep the iterates inside the convex domain $X$. A convergence theorem is proved. The paper ends with some numerical examples.
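The abstract does not detail the iteration itself, so the sketch below illustrates only the general idea of a feasible descent scheme: each iterate is kept inside $X$ by projecting the descent step back onto the set. This is a generic projected-descent illustration (here on a box constraint with a fixed step size), not the paper's method; all function names and parameters are illustrative assumptions.

```python
import numpy as np

def project_box(x, lo, hi):
    # Euclidean projection onto the box X = [lo, hi]^n (illustrative choice of X)
    return np.clip(x, lo, hi)

def projected_descent(grad, x0, lo, hi, step=0.4, iters=200):
    # Generic feasible-descent iteration (NOT the paper's method):
    # x_{k+1} = P_X(x_k - step * grad f(x_k)), so every iterate stays in X.
    x = project_box(np.asarray(x0, dtype=float), lo, hi)
    for _ in range(iters):
        x = project_box(x - step * grad(x), lo, hi)
    return x

# Example: f(x) = ||x - c||^2 is convex; its unconstrained minimizer c
# lies outside X = [0, 1]^2, so the constrained minimizer sits on the boundary.
c = np.array([2.0, 0.5])
grad_f = lambda x: 2.0 * (x - c)
x_star = projected_descent(grad_f, x0=[0.0, 0.0], lo=0.0, hi=1.0)
# x_star approaches the constrained minimizer (1, 0.5)
```

The projection step is what distinguishes this family of methods from unconstrained descent: feasibility is enforced at every iteration rather than recovered afterwards.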



This work is licensed under a Creative Commons Attribution 4.0 License.