Superlinear Convergence of a Modified Newton's Method for Convex Optimization Problems With Constraints
- Bouchta RHANIZAR
Abstract
We consider the constrained optimization problem defined by:
$$f(x^*) = \min_{x \in X} f(x) \eqno(1)$$
where the function $f : \mathbb{R}^{n} \to \mathbb{R}$ is convex on a closed bounded convex set $X \subset \mathbb{R}^{n}$.
To solve problem (1), most methods transform it into an unconstrained problem, either by introducing Lagrange multipliers or by using a projection method.
The purpose of this paper is to present a new method for solving some constrained optimization problems, based on the definition of a descent direction and a step size that keep the iterates in the convex domain X. A convergence theorem is proven. The paper ends with some numerical examples.
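The paper's own algorithm is not reproduced here, but the general idea it describes — choosing a descent direction and a step size so that every iterate stays inside the convex set X — can be illustrated with a simple projected-Newton-type sketch. The example below (the quadratic objective, the box constraint, and the `projected_newton` helper are all illustrative assumptions, not the author's method) minimizes a convex quadratic over a box while keeping each iterate feasible:

```python
import numpy as np

# Illustrative sketch only: not the paper's algorithm.
# Minimize the convex quadratic f(x) = 0.5 x'Ax - b'x over the box
# X = [lo, hi]^n, keeping every iterate inside X by projecting the
# Newton step back onto X.

def project_box(x, lo, hi):
    """Euclidean projection onto the box [lo, hi]^n."""
    return np.clip(x, lo, hi)

def projected_newton(A, b, lo, hi, x0, tol=1e-10, max_iter=100):
    """Projected-Newton-type iteration for f(x) = 0.5 x'Ax - b'x on a box."""
    x = project_box(x0, lo, hi)
    for _ in range(max_iter):
        grad = A @ x - b                 # gradient of f at x
        step = np.linalg.solve(A, grad)  # Newton direction (A is the Hessian)
        x_new = project_box(x - step, lo, hi)  # stay inside X
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

A = np.array([[3.0, 0.5], [0.5, 2.0]])   # symmetric positive definite
b = np.array([1.0, 1.0])
x_star = projected_newton(A, b, lo=0.0, hi=0.5, x0=np.zeros(2))
```

In this example the unconstrained minimizer already lies inside the box, so the projection is inactive at the solution and the Newton step converges in one iteration; with active constraints a line search would typically be needed to guarantee descent.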
- Full Text: PDF
- DOI:10.5539/jmr.v13n2p90
This work is licensed under a Creative Commons Attribution 4.0 License.
Journal Metrics
- h-index (December 2021): 22
- i10-index (December 2021): 78
- h5-index (December 2021): N/A
- h5-median (December 2021): N/A
(The data was calculated based on Google Scholar Citations.)
Index
- Academic Journals Database
- ACNP
- Aerospace Database
- BASE (Bielefeld Academic Search Engine)
- Civil Engineering Abstracts
- CNKI Scholar
- COPAC
- DTU Library
- EconPapers
- Elektronische Zeitschriftenbibliothek (EZB)
- EuroPub Database
- Google Scholar
- Harvard Library
- IDEAS
- Infotrieve
- JournalTOCs
- LOCKSS
- MathGuide
- MathSciNet
- MIAR
- PKP Open Archives Harvester
- Publons
- RePEc
- ResearchGate
- Scilit
- SHERPA/RoMEO
- SocioRePEc
- Standard Periodical Directory
- Technische Informationsbibliothek (TIB)
- The Keepers Registry
- UCR Library
- Universe Digital Library
- WorldCat
Contact
- Sophia Wang, Editorial Assistant
- jmr@ccsenet.org