A Regularized Newton Method with Correction for Unconstrained Nonconvex Optimization
- Heng Wang
- Mei Qin
Abstract
In this paper, we present a modified regularized Newton method for minimizing a nonconvex function whose Hessian matrix may be singular. We show that if the gradient and Hessian of the objective function are Lipschitz continuous, then the method is globally convergent. Under a local error bound condition, which is weaker than nonsingularity of the Hessian at a solution, the method converges cubically.
- Full Text: PDF
- DOI:10.5539/jmr.v7n2p7
This work is licensed under a Creative Commons Attribution 4.0 License.
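The abstract describes a regularized Newton iteration in which the possibly singular Hessian is shifted by a gradient-dependent regularization term, followed by a correction step. The sketch below is only a generic illustration of that idea, not the authors' algorithm: the choice of regularization mu_k = c*||g_k||, the form of the correction step, and the absence of any globalization safeguard (line search or trust region) for the nonconvex case are all assumptions made here for brevity.

```python
import numpy as np

def regularized_newton(grad, hess, x0, c=1.0, tol=1e-8, max_iter=100):
    """Illustrative regularized Newton iteration with a correction step.

    This is a sketch under assumed choices, not the method of the paper:
    mu_k = c * ||g_k|| regularizes a possibly singular Hessian, and the
    correction re-solves with the same regularized matrix at the trial point.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        H = hess(x)
        mu = c * np.linalg.norm(g)              # regularization keeps the system solvable
        A = H + mu * np.eye(len(x))
        d = np.linalg.solve(A, -g)              # regularized Newton step
        # correction step (assumed form): reuse A with the gradient at the trial point
        d_corr = np.linalg.solve(A, -grad(x + d))
        x = x + d + d_corr
    return x

if __name__ == "__main__":
    # Example: f(x) = x1**4 + x2**2 has a singular Hessian at the minimizer (0, 0)
    grad = lambda x: np.array([4 * x[0]**3, 2 * x[1]])
    hess = lambda x: np.array([[12 * x[0]**2, 0.0], [0.0, 2.0]])
    print(regularized_newton(grad, hess, np.array([1.0, 1.0])))
```

For genuinely nonconvex problems, a practical implementation would also need a safeguard (e.g., increasing the regularization or adding a line search) to guarantee descent; those details are part of the paper's analysis and are omitted from this sketch.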
Index
- Academic Journals Database
- ACNP
- Aerospace Database
- BASE (Bielefeld Academic Search Engine)
- Civil Engineering Abstracts
- CNKI Scholar
- COPAC
- DTU Library
- EconPapers
- Elektronische Zeitschriftenbibliothek (EZB)
- EuroPub Database
- Google Scholar
- Harvard Library
- IDEAS
- Infotrieve
- JournalTOCs
- LOCKSS
- MathGuide
- MathSciNet
- MIAR
- PKP Open Archives Harvester
- Publons
- RePEc
- ResearchGate
- Scilit
- SHERPA/RoMEO
- SocioRePEc
- Standard Periodical Directory
- Technische Informationsbibliothek (TIB)
- The Keepers Registry
- UCR Library
- Universe Digital Library
- WorldCat
Contact
- Sophia Wang, Editorial Assistant
- jmr@ccsenet.org